June 8, 2017 Conferences, Uncategorized No comments
A few weeks ago I attended the WeAreDevelopers conference in Vienna. The venue is just a 10-minute walk from where I live.
The official web page for the conference is here, and all of the talks can be watched on YouTube here.
I will go through all of the talks I watched, each with a very brief comment, before giving my thoughts on the organization of the conference.
Opening. For such a small country as Austria, a conference hosting 3500+ developers is a big deal. The Chancellor of Austria (the most powerful position) took the keynote and, to my disappointment, for the most part spoke in German. I think this would have been fine if only the conference weren’t advertised as an international one. In any case, it is admirable that the importance of the IT industry is acknowledged by the Austrian government.
Build a World We All Want to Live. This talk was a lot about the future and exponential growth. I’m afraid there was not much of a takeaway beyond a sense of motivational inspiration.
Challenges of Autonomous Driving. I first heard about the Rimac concept car on the TopGear TV show. It was very interesting to hear about Croatia’s concept car from an engineering perspective. What I learned is that there are many more challenges to be addressed than car makers such as Tesla, Honda, etc. advertise.
IoT & Advanced Analytics – Real World Challenges for Developers. An Austrian railway company explained how they use all kinds of detectors on their trains and how that data is analysed. This does not get anywhere close to using AI, though it was still interesting.
One ID to Rule Them All. This was supposed to be a presentation on identification methods (think passports, ID cards), but somehow the presenter talked about solar panels, exponential growth of technology, and near-free energy. I’m not sure how these two play together, but the title was definitely misleading.
How Different Open Hardware is to Open Software. An interesting talk, but again a misleading title. The speaker presented his robotic arm project and how it can be used by people without an arm. Takeaway: these days you can download some code, buy an Arduino, and 3D-print yourself an arm within hours.
The Early Days of Id Software: Programming Principles. I found a zillion YouTube videos with the same title where Romero gives the same talk at different conferences. The best one to watch is probably this other video here. In the talk he goes through the history of id Software. Takeaway: John Romero’s Principles for Programmers.
Getting Computers to Understand Us. This was a presentation on NLP and AI that started with punch cards.
How to Design Human Centered Chatbots? Takeaway: I didn’t learn how to design chatbots, but I understood that the chatbot euphoria is on a downtrend.
The Future of Online Money: Creating Secure Payments Globally. This was about PayPal money transactions and security. Takeaway: eliminating the middleman helps in improving processes.
Less Process, More Guidance. Takeaway: The Atlassian Team Playbook.
Extreme Continuous Integration. The Automic company presented how they continuously run hundreds of builds of different components for multiple platforms.
Continuous Delivery Journey @ Wirecard. This was more of a continuation of the CI and CD topic.
Monorepos in the wild. A story of moving to one repository. Some pros and cons were presented, with the message that monorepo != monoapp.
Rebuilding an Aircraft on the Fly. Yet another story, this time about fixing CSS at Trivago. In my opinion this was just a common-sense story.
Javascript @ Netflix. JS standards and their life-cycle were explained.
JS @ Uber. About programming languages & architecture approaches at Uber. Things like Node.js, Go, Python, and micro-services were mentioned in addition to JS.
The Artist and the Machine. A lightning talk about nice spiral graphs. Takeaway: I bought my daughter a spirograph toy.
Working Backwards from the Customer. Amazon explained how they start developing with a press release. If they don’t like what is written, it might not make sense to start doing it. If they like what is written and it sounds cool, they add more details, create a more technical translation, and finally this is converted into development. I believe this was my biggest takeaway from the conference in general. In software projects it happens very often that the customer does not get what they want. Starting with the customer and being obsessed with the customer is probably something that makes Amazon stand out.
Model-Minded Development. A presenter from Google talked about the importance of having a good model between the computer and the real world.
Customizing Railways to Individuality. The national Austrian railroad company talked about their challenges. They have been in business for 178 years and can be considered dinosaurs when it comes to software.
Scaling Open Source Communities. Tips and tricks for handling an open source project were given, with a lot of insight into the open-source software lifecycle.
Angular, Google’s Popular Application Framework. A relatively simple demo of the Angular framework was given. Takeaway: Angular will last longer than any other JS framework. Let’s see.
PHP in 2017. This was a somewhat hilarious talk by the inventor of PHP. He talked about the many performance improvements that come with PHP 7 and how that can “save the planet”. Takeaway: this blog is self-hosted WordPress; I upgraded the PHP version to 7, making my contribution to a more sustainable future.
Developers Are Writing the Script for the Future. Definitely the highlight of the conference was the presentation by Stack Overflow’s co-founder Joel Spolsky. It was an inspirational talk and a great way to close the conference.
Many people have complained about significant overbooking of the conference. It was very unpleasant to see the organizers bragging about 3.8K attendees when there was obviously not enough room for all of them. There are some angry tweets and blog posts online, like this one: WeAreDevelopers – A Mess of a Conference. I agree. It was very disappointing to find myself in a crowd of people, unable to switch tracks. Basically, if you decided to switch tracks you would end up standing somewhere on the side, not able to listen or to see. The second day proved to be much, much better, but who knows, maybe that’s because some people simply decided the conference was not worth their time?
In my opinion it was the second day that saved the conference. Better talks were delivered on the second day, and access to the stages, food, and company stands was more comfortable.
In any case, I’m really happy that such a big conference was organized in the city where I now live and that many well-known speakers from well-known companies came to talk. There was a lot to learn. I’m looking forward to #WAD2018.
February 3, 2017 YearPlanReport 2 comments
Again, not to break the tradition, here is my resolution for the year 2017.
This list is rather cryptic and unconventional. As they say, it is not that “SMART”, and I agree. Fortunately for me, I’m the boss of my own life, and this is how I would like to put it for this year.
The road in the picture below appears to be smooth and straight. There are some shadows and lots of light at the end. Then again, I don’t know if the road is still there where the light is – do you?
February 2, 2017 YearPlanReport 2 comments
It is quite late into the year 2017, but I decided not to break the tradition and write my yearly report.
When I planned my 2013 I had “the best thing that could happen” listed as I knew my daughter was coming. When I was planning for 2016 I didn’t expect to have a second child.
But here I am with my son just minutes after he was born, on the 16th of December 2016.
I’m really glad and am looking forward to seeing him grow and build his own life.
Apart from this main and life-changing event, a few other things happened: we moved to a slightly bigger apartment, and I made some progress in my career (more on that some time later).
Here is the list of planned things and their completion rates:
This gives me 47% overall. Apparently, I fall into the 92% of people who fail at their New Year’s resolutions.
Nevertheless, I find this exercise of planning for a year useful. At the very least it gives a sense of the things you want to do if you are otherwise too chaotic.
I hope you all have had a good year and will have even better one this year. Happy New Year! (Yes, yes… I know – it’s February outside, but someone had to be the last to wish you “Happy New Year!”)
December 11, 2016 Personal 6 comments
Time has come to buy a new beefy laptop for my blogging :). This time I bought a Dell Precision 7510.
I have a history of buying Dell laptops. You can call me a Dell fan if you want, but really I just keep buying them because they work and I have never had any issues with them, except when I spilled tea on my XPS 13 and had to replace the keyboard and screen. I’ve made a few upgrades to the XPS 16 (RAM, SSD, battery) and now it is actively used by my wife for photo editing and general home use. The XPS 13 is in some aspects as powerful as the XPS 16 and at the same time weighs only 1.3 kg. It is really easy to carry everywhere. When I bought it I said that it is “thin as a Mac Air and powerful as a Mac Pro, but costs less”. Unfortunately, over time I could not feel very productive on it. Even though I could do everything I needed, I couldn’t comfortably run a heavy IDE or VMs, or play games that require dedicated graphics. It felt like I needed a proper workstation.
Decision making on a new workstation went terribly wrong. I spent around 8 hours comparing options:
I was seriously considering a desktop PC instead of a laptop, but eventually leaned towards powerful laptops that can be easily docked if needed. I was choosing between different Lenovo and Dell models (yep, no Mac). I settled on the Precision 7510 because it is a real workstation: it comes with a Thunderbolt interface, it is highly configurable, and it is a brand I have used for a long time. Another reason for choosing Dell was pricing. Since I was buying at dell.at as a small business, I was able to customize my purchase in a very granular way: I removed unnecessary support and useless stickers, and selected Ubuntu and cheap delivery – something Lenovo was not offering. As for hardware, I chose to reasonably max out the things that I’m not going to upgrade (CPU, GPU) and leave room for other upgrades (RAM, HDD). I didn’t choose the 4K touch display, as I don’t think it makes any sense on 15″.
Below are the specifications of all my Dell laptops:
| | Dell Studio 1535 | Dell Studio XPS 1647 | Dell XPS 13 | Dell Precision 7510 |
|---|---|---|---|---|
| CPU | Intel® Core™ 2 Duo T5850 2.16GHz | Intel® Core™ i7-620M (Prev Gen, 2 Cores, 4 Threads, 4M Cache, up to 3.33GHz) | Intel® Core™ i7-3537U (3rd Gen, 2 Cores, 4 Threads, 4M Cache, up to 3.1GHz) | Intel® Core™ i7-6920HQ (6th Gen, 4 Cores, 8 Threads, 8M Cache, up to 3.80GHz) |
| Display | LCD (1280×800) | 15.6″ FHD Widescreen WLED LCD (1920×1080) | 13.3″ Hi-Def (1080p) True Life WLED Display w/1.3MP | 15.6″ UltraSharp FHD IPS (1920×1080) |
| Optical drive | DVD Super Multi | 8X CD/DVD Burner | – | – |
| RAM | 2GB DDR2-667 | 8GB Shared Dual Channel DDR3-1333MHz (originally 4GB) | 8GB Single Channel DDR3-1600MHz | 16GB (2 x 8GB) DDR4-2667MHz (two more slots available) |
| Storage | 320GB 5400RPM | 256GB SSD (originally 512GB 7200RPM) | 256GB SSD | 256GB M.2 PCIe SSD (I added a second 512GB 7200RPM HDD) |
| GPU | ATI Mobility Radeon™ HD 3450 | ATI Mobility Radeon™ HD 5730 1GB GDDR3 | Intel HD Graphics 4000 | Nvidia Quadro M2000M 4GB GDDR5 |
| Audio | High Definition Audio | High Definition Audio 2.0 with SRS Premium Sound | Wave Maxx Audio | Some Audio |
| Wireless | Dell Wireless 1397 WLAN Mini-Card | Intel® 5300 WLAN Wireless-N (3×3) Mini Card | Killer Wireless-N 1202 for Video & Voice w/ BT 4.0 | Intel® Dual Band Wireless-AC 8260 |
| Battery | 56 WH, 6 cell, Li-Ion | 85 WH, 6 cell, Li-Ion | 47 WH, 6 cell, Li-Ion | 72 WH, 6 cell, Li-Ion |
| History | Bought late 2008, alive and used by Mom for Skype, audio jack bad, battery dead. | Bought Sep 2010, heavily used, became loud, upgraded with RAM and SSD, battery replaced. | Bought Nov 2013, actively used, no upgrades, screen and keyboard replaced because of a tea spill, battery completely healthy. | Bought Dec 2016, using it right now, added a second HDD, planning for more RAM when the time comes. |
I ran benchmark software on the XPS 16, the XPS 13, and the Precision. While the 16 and the 13 were somewhat comparable, the Precision’s speed rocked. CPU speed was 2X to 3X faster depending on the calculation operations (floating point, integer). GPU speed was 14X compared to the XPS 13 and 3X compared to the XPS 16. RAM was 3X of the XPS 16 and 1.5X of the XPS 13. SSD write speed was 2X of both.
Lots of numbers, but I can simply feel the difference. It is a pleasure to use a fast machine. Who knows what my fifth column will look like.
November 27, 2016 AutoMapper, Opinion 15 comments
AutoMapper is a great little library every .NET project is using (well, lots of them). I used it for the first time in 2010 and wrote a blog post about it.
Since that time I have observed a few things:

- `CreateMap` and `Map` are still there and work the same. At the same time performance, testability, exception handling, and feature richness have improved significantly. The last one, in my opinion, is not such a good thing, as it leads to the next point.
- Code put into `AfterMap` or into different kinds of resolvers tends to start containing crazy things. In the worst of those cases, actual business logic was written in resolvers.

I have always been of the opinion:
Less Code – Less Bugs; Simple Code – Good Code.
Having seen this trend with the library, I would like to suggest simplifying its usage by limiting ourselves. Simply:

- If you catch yourself using the `ForMember` method, it may be a case for doing the mapping manually (at least for the specific type) – it will be cleaner and less confusing.
- Stick to the `Profile` class and the `Mapper.Initialize` method. If you still want to have at least some abstraction to avoid referencing AutoMapper everywhere, make it simple.

Here is how I’m using AutoMapper these days:
Somewhere in CommonAssembly, a very simple abstraction (optional):
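A minimal sketch of what I mean (the `IObjectMapper` name is just for illustration):

```csharp
using AutoMapper;

// Optional thin abstraction so that consumers don't reference AutoMapper directly.
public interface IObjectMapper
{
    TDestination Map<TDestination>(object source);
}

public class ObjectMapper : IObjectMapper
{
    public TDestination Map<TDestination>(object source)
    {
        // Delegates straight to AutoMapper's static API - nothing more.
        return Mapper.Map<TDestination>(source);
    }
}
```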
Somewhere in BusinessLogicAssembly, and in any other assembly where you want to define mappings (can be split into as many profiles as needed):
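For example (`Customer` and `CustomerDto` are illustrative types):

```csharp
using AutoMapper;

// Illustrative domain entity and transport object.
public class Customer { public string Name { get; set; } }
public class CustomerDto { public string Name { get; set; } }

// A plain Profile with trivial mappings only - no ForMember tricks.
public class DomainToDtoProfile : Profile
{
    public DomainToDtoProfile()
    {
        CreateMap<Customer, CustomerDto>();
    }
}
```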
Somewhere in the startup code in BootstrappingAssembly (`Global.asax` etc.):
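Roughly like this (a sketch against the AutoMapper 5.x static API):

```csharp
using AutoMapper;

public static class MappingBootstrap
{
    public static void Initialize()
    {
        Mapper.Initialize(cfg =>
        {
            // Register each profile explicitly; assembly scanning is also possible.
            cfg.AddProfile<DomainToDtoProfile>();
        });
    }
}
```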
And here is the usage:
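Something along these lines:

```csharp
var customer = new Customer { Name = "John" };

// Directly through the static API:
var dto = Mapper.Map<CustomerDto>(customer);

// Or through the optional abstraction:
IObjectMapper mapper = new ObjectMapper();
var dto2 = mapper.Map<CustomerDto>(customer);
```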
That’s it. I do not understand why some simple things are made complex.
There is also another advantage of keeping it minimalistic – maintainability. I’m working on a relatively new project that was created from a company’s template; as a result it had an older version of AutoMapper abstracted away. Upgrading it while keeping all the old interfaces would have meant some work, as the abstraction used some of the APIs that changed. Instead I threw away all of these abstractions and upgraded the lib. Next time we upgrade, there will simply be way less code to worry about.
Please let me know if you share the same opinion.
November 16, 2016 git, Tools No comments
Imagine working on the same code base in two disconnected networks. How would you synchronize your repositories using a rudimentary storage device like a USB stick?
Undeniably, for such a synchronization there could be multiple solutions, starting with very primitive manual copying of cloned repositories and finishing with specialized devices and sync processes.
I came up with something intermediate, until the situation with the project setup changes.
The idea is very simple:

1. A USB-sharing device, so that the USB stick can be shared with the press of a button (a physical one in this case).
2. A git bash script that synchronizes the local repository with a bundle file on the stick (see the script below).
3. A task to trigger the sync script when a USB stick with a bundle is connected (I do not have this one yet, but it is the next logical step).

If two repositories were available at the same time, the same script (with modifications) could be used to synchronize them on a schedule or on a trigger event.
Here is the essence of the script, as a simplified sketch (the mount path and branch name are illustrative; the full version is linked below):
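```bash
#!/bin/bash
# Simplified sketch: synchronize the current repository with a bundle
# file on the USB stick. Mount path and branch name are illustrative.

BUNDLE=/media/usb-stick/repo.bundle
BRANCH=master

# Pull in whatever the other network left on the stick.
if [ -f "$BUNDLE" ] && git bundle verify "$BUNDLE"; then
    git pull "$BUNDLE" "$BRANCH"
fi

# Write our changes back to the stick for the other side to pick up.
git bundle create "$BUNDLE" "$BRANCH"
```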
I also made it available on GitHub under the MIT license. Hopefully it comes in handy.
November 14, 2016 Opinion No comments
Just recently I joined a team. We write an intranet web application. There is nothing too special about it, except that it was designed to be implemented as micro-services, whereas de facto, at the moment, it is a classical single .NET MVC application. This happened for a simple reason: meeting the first release deadline.
The design was reflected in how source control was set up: one git repository per service. Unfortunately, this required a number of maneuvers to stay in sync and to push changes, as the team was making scattered changes across multiple repositories. It also made it more difficult to consolidate NuGet packages and other dependencies, as all of them lived in different repositories.
I think that microservices, and the corresponding hard reflection of their boundaries in the form of source code repositories, should evolve naturally. Starting with a single repository sounds more reasonable. If you keep the idea of microservices in your head and nicely decouple your code, nothing stops you from creating new repositories as your service boundaries start to take shape.
Taking this into account, we merged the repositories into one. There was only the question of keeping the source code history. It turns out the history can be easily preserved by employing the `git subtree` command and placing all of the service repositories as subdirectories of the new single repository.
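For example, merging two service repositories into a new one while keeping history looks roughly like this (repository names are illustrative):

```bash
# Create the new single repository with a root commit.
git init merged-repo
cd merged-repo
git commit --allow-empty -m "Root commit of the merged repository"

# Pull each service repository in as a subdirectory, history included.
git subtree add --prefix=serviceA ../serviceA master
git subtree add --prefix=serviceB ../serviceB master
```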
As a result, the team is working much more effectively, as we do not waste time on routine syncing and on checking who did what where.
Conclusion: theoretically, micro-services should be implemented in their own repositories. That’s true, but in practice, for a relatively small and new project with only one team working on it, a single repository wins.
October 30, 2016 Book Reviews No comments
A black swan is a highly improbable event with three principal characteristics: It is unpredictable; it carries a massive impact; and, after the fact, we concoct an explanation that makes it appear less random, and more predictable, than it was. The astonishing success of Google was a black swan; so was 9/11. For Nassim Nicholas Taleb, black swans underlie almost everything about our world, from the rise of religions to events in our own personal lives.
I have listened to the audio version of “The Black Swan” twice: the first time at the beginning of the year, and the second time just recently. The book is philosophical in a way. It is not very easy to fully comprehend the conveyed message, as the author frequently diverts to fictional stories, terms in French, and thinkers that are long dead.
There were two striking statements in the book: “anyone can be a president” and “these people can get a Nobel prize”. Sound familiar? Think of the Trump vs. Clinton presidential race and Bob Dylan receiving the Nobel Prize in Literature, if you are not reading this in autumn 2016.
This does not mean that Nassim Taleb is any sort of predictor or prophecy maker. He himself says that he cannot make predictions; instead he highlights over and over again that rare events which seem improbable occur more frequently than most of us would imagine, and at the same time it is impossible to come up with mathematical models that would somehow calculate probabilities for these events. Unfortunately, we cannot know what we don’t know, therefore the best strategy for any of us is to build robustness to black swan events.
The application of the ideas expressed in the book is very wide, starting with building a financial portfolio consisting of 90% very safe investments and 10% extremely risky ones, thereby exposing yourself to the probability of catching a black swan like Google or Facebook, and ending with applying it to your life by exposing yourself to a variety of endeavors. Be careful here: an event’s consequences are even harder to predict than the occurrence of the event itself.
There is one aspect of the book that I don’t like. Almost throughout the book the author despises other people, imagining them as aggressive apes and suggesting nasty things like putting a rat down someone’s shirt. I do not exclude that he imagines his readers in the same way: silly monkeys reading a higher-caliber philosophical work. This, though, does not disqualify his book from being a really valuable contribution to human knowledge; but, in my opinion, it is only thanks to the black swan event of him benefiting from the 2000 crisis that he became successful and was subsequently able to write this and other books.
This book is definitely worth reading. It may make you look at the world as sequences of improbable events that change everything around. It could also make you way more skeptical about the theoretical modeling suggested by economists and other tie-wearing experts. The book is not an easy read. On the contrary, it requires a lot of attention and thinking. Maybe leave it for a time when you are in a “philosophical” mood.
I could not find an example showing how to use Microsoft.Extensions.DependencyInjection as an IoC container in OWIN self-hosted WebApi, so here is this blog post.
Let’s imagine that you have a WebApi that you intend to self-host using OWIN. This is fairly easy to do. All you will need is the `Microsoft.Owin.Hosting.WebApp.Start` method and then a bit of configuration on `IAppBuilder` (check out `ServiceHost.cs` and `WebApiStartup.cs` in the gist below).
It becomes a bit more complicated when you want to use an IoC container, as OWIN’s implementation takes care of creating controller instances. To use another container you will need to tell the configuration to use an implementation of `IDependencyResolver` (see `WebApiStartup.cs` and `DefaultDependencyResolver.cs`). `DefaultDependencyResolver.cs` is a very simple implementation of the resolver.
In case you are wondering what `Microsoft.Extensions.DependencyInjection` is: it is nothing more than lightweight DI abstractions and a basic implementation (github page). It is currently used in ASP.NET Core and EF Core, but nothing prevents you from using it in “normal” .NET. You can integrate it with more feature-rich IoC frameworks. At the moment it looks like Autofac has the best integration and nice documentation in place. See OWIN integration.
The rest is just the usual IoC clutter. See the gist.
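In a nutshell, the two key pieces could look something like this (a minimal sketch; `ValuesController` is an illustrative controller):

```csharp
using System;
using System.Collections.Generic;
using System.Web.Http;
using System.Web.Http.Dependencies;
using Microsoft.Extensions.DependencyInjection;
using Owin;

// Minimal IDependencyResolver over IServiceProvider.
public class DefaultDependencyResolver : IDependencyResolver
{
    private readonly IServiceProvider serviceProvider;

    public DefaultDependencyResolver(IServiceProvider serviceProvider)
    {
        this.serviceProvider = serviceProvider;
    }

    public object GetService(Type serviceType) =>
        this.serviceProvider.GetService(serviceType);

    public IEnumerable<object> GetServices(Type serviceType) =>
        this.serviceProvider.GetServices(serviceType);

    public IDependencyScope BeginScope() => this;

    public void Dispose() { }
}

// Illustrative controller.
public class ValuesController : ApiController
{
    public IEnumerable<string> Get() => new[] { "value1", "value2" };
}

public class WebApiStartup
{
    public void Configuration(IAppBuilder appBuilder)
    {
        var config = new HttpConfiguration();

        var services = new ServiceCollection();
        services.AddTransient<ValuesController>(); // controllers must be registered

        config.DependencyResolver =
            new DefaultDependencyResolver(services.BuildServiceProvider());

        config.Routes.MapHttpRoute(
            "DefaultApi", "api/{controller}/{id}",
            new { id = RouteParameter.Optional });

        appBuilder.UseWebApi(config);
    }
}
```

With that in place, a call like `WebApp.Start<WebApiStartup>("http://localhost:9000")` (roughly what `ServiceHost.cs` does) brings the whole thing up.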
I hope this blog post helps you with integrating Microsoft.Extensions.DependencyInjection in WebApi hosted via OWIN.
October 4, 2016 EmberJS No comments
Obviously there are general upgrade guides provided by the Ember team and by many fellow bloggers. This post just documents one particular experience of upgrading from Ember 1.7.0 to 1.13.13.
At the moment of this writing, Ember’s latest stable version is 2.8. My team was one of the early adopters of Ember: I believe the team started incorporating it in late 2013, very soon after the 1.0 release. We went live with version 1.7 at the beginning of 2015, and since then we didn’t do any updates, for “penny wise and pound foolish” reasons.
The upgrade itself was a bit of a pain, as it spread over a couple of months. We allocated a few days per sprint and at the same time continued developing new features the old way in other branches. Bad idea.
Another pain was that we had adopted Ember Data while it was still in beta. As a result we have custom code altering the adapter’s and serializer’s behaviour. There are many breaking changes between beta versions of Ember Data, so using it while in beta was a very bad idea.
One of the great things about being on Ember 1.13.13 is that you are effectively on 2.0.0, unless you have deprecation messages in your console. This also means that you can still release your application with some parts not yet completely converted to the new way of doing things. `ember.prod.js` doesn’t generate warnings and works just fine. I really like the way Ember tries to make upgrading easy. Here is a nice write-up on handling deprecations as of 2.3.0.
This list is composed from notes I took, so it is not very well organized and does not contain all of the items we had to fix.
- Fixed `"Ember Data cannot read property 'async' of undefined"`.
- Replaced `pushObject` with `addRecord` to fix “You looked up the relationship on a with id but some of the associated records were not loaded.”
- `Ember.Handlebars.helpers.render.call(this, name, contextString, options)`.
- Used `Ember.set()` whenever there was a `controller.isNew` property with some set.
- `HTMLBars` instead of `Handlebars`, by incorporating `ember-template-compiler`.
- `template = template.replace(/\uFEFF/g, ''); // remove BOM`
- `{async: false}`.
- `MetamorphView`. Fixes this: “Assertion Failed: A fragment cannot be pushed into a buffer that contains content” because of: `view.createChildView(Ember._MetamorphView, {`.
At this point you start to get tons of deprecation messages.
- Accessing `someProperty` from `<App.SomeXyzController:ember2661>`: object proxying is deprecated. Please use `model.someProperty` instead.
- Changed `{{action bubbles=false preventDefault=false}}` to `{{action "ok" "close" "cancel" bubbles=false preventDefault=false}}`.
- “Cannot call `compile` without the template compiler loaded. Please load `ember-template-compiler.js` prior to calling `compile`.” `Ember.HTMLBars.compile(submitHtmlTemplate);` doesn’t produce a correct function to retrieve HTML. Can be solved as in this SO answer.
- Using `@each` at the end of a computed key is deprecated and will not work in Ember 2.0.
- Replaced `Ember.Enumerable.mapProperty` with `mapBy`.
- Replaced `Ember.Handlebars.helper` with `Ember.Helper.helper`.
- `itemController` is deprecated.
I hope this comes in handy for you.