Alex Yakunin

November 11, 2011

I'm back :)

I'm happy to announce I'm returning after an almost year-long absence here ;)

So what's changed? Actually, a lot.

In December 2010 I decided to start a new software project. My primary goal was to build a product that would be attractive to a wide range of people. So after some period of research (I'll tell more about this later) I came to the following ideas:
  • There is no single place on the web where you can compare yourself with other people by the whole set of metrics that are relevant to you. I'm talking about nearly any metric - e.g. your birth date, height, the number of hours you spent at work this month, the highest place you've ever visited, the deepest dive you've made, the number of miles you ran this week, your reputation on Stack Overflow and so on.
  • Most of the ratings on the web are global, but local ratings seem more important in our life. E.g. what does my ~2.5K reputation on Stack Overflow say about me in general? Nearly nothing. But compared with the reputation of developers in my city (Ekaterinburg, 1.5M population) or at some company here, it becomes much more meaningful: Stack Overflow is in English, so local developers rarely answer there. The same can be said about nearly any metric: almost no one remembers their position in the global Stack Overflow rating, but I bet you'd definitely remember being #1 (#2, #3) on Stack Overflow in your city.
  • If we're able to compare an arbitrary set of such metrics for arbitrary groups of people in real time (e.g. your productivity measured by RescueTime + the number of LOCs you committed + the number of tasks closed in the issue tracker), we bring Google Analytics to the world of people's achievements and productivity. This must be a valuable tool for business - at least, I'd definitely use such a tool. And it is attractive for people as well, since they can identify and highlight their local or global successes (e.g. attach them to a CV). Moreover, I'd use such a tool to record and compare my kids' successes.
  • If we're able to record nearly any fact, we can identify the most popular facts in a particular group of people. And this set of facts must nicely describe what's common in this group. E.g. if it's a group of bodybuilders, most of them publish the weight they push, muscle size, etc.; if it's a group of developers, they compare their reputation on Stack Overflow, the number of committed lines of code, the lists of technologies they use, etc. Travelers might count the number of countries they've visited, driving fans - the horsepower of their cars, and so on. Imagine you're visiting some event and instantly see the most popular activities and achievements for the whole group of people there. IMO, that's cool!
Home page of 9facts.com displays the topmost achievements in your country, city and groups you're in, including your friends' group.
So that's what 9facts must finally evolve into - a tool allowing you to share and compare arbitrary (but mainly measurable) facts about yourself and other people. Currently we're just at the beginning - the "About 9facts" page fully describes the current state of the project; the limitations it has now include:
  • Absence of an API allowing third-party apps to push facts with measurable data to it, although there are a few integrated data providers grabbing some facts from Facebook, Twitter and Stack Overflow. Also, you can enter facts manually there - our presentation explains what's so unique about this part.
  • We can't log facts with data sequences (i.e. data change charts).
  • Some essential social features aren't implemented yet - e.g. there are no comments, messages, notifications about your high positions in tops and so on.
  • UI/UX certainly needs to be improved as well - some actions and their implied results are far from obvious.
But the good thing is that it already works, and you can already compare yourself with the people near you (or in your city, country and so on) right now. You can do this in 3 steps:
I'd highly appreciate it if you shared your impressions with us. There are several ways of doing this: create a fact about Alex Yakunin on 9facts, send an email to [email protected] or leave a feature request @ UserVoice.

More information about the current state of 9facts is provided in my post describing the latest update.

So what am I going to write about? We've learned a lot during the last 6 months:
  • 9facts is an independent startup. X-tensive acts as a co-investor here, but part of the money was raised from another investor. So I think I'll be able to tell a story about this if / when we raise the next round.
  • We've built a new team for this project in a pretty short time, so, in particular, we had to quickly establish agile development, coding standards and practices. I can't say we fully succeeded with this, but I'm happy with the result.
  • We are using DataObjects.Net in this project, so it's a sort of "eat your own dog food" experience for me. We use DO in almost all the projects we run, but personally I didn't have any remarkable experience with ASP.NET MVC before 9facts, so now I know quite well how to use DO efficiently in this scenario. Splitting the application into tiers, using federation (sharding), performing schema and data upgrades, using caches, running certain actions periodically - these are just some of the topics I can cover.
  • There is a JavaScript library plus a certain amount of server-side code allowing us to handle AJAX requests (we don't send regular POST requests at all), errors and many other things in a unified fashion. Our ASP.NET MVC handlers processing form update requests (actually, any AJAX requests) usually contain much less code than you usually see in examples handling standard POST requests, although they are much better prepared for real-world scenarios (in particular, error handling) - see the sketch after this list. Take a look at this page's network traffic to understand what I mean. I can add that normally we don't use client-side MVVM (although we use Knockout in a few cases), i.e. the rendering is usually performed on the server side. IMO, that's the most convenient scenario for ASP.NET MVC applications.
  • We're extensively using Task Parallel Library (TPL) and PLINQ, so I can share some experience related to this part as well.
  • Finally, it might be interesting why I decided to develop something new, how I was picking the ideas and so on.
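Here is a minimal sketch of the handler pattern mentioned in the list above - a hypothetical ASP.NET MVC controller of my own, not the actual 9facts code (the names and the JSON payload shape are illustrative; the real library also standardizes the client side):

using System.Web.Mvc;

// A form update arrives as an AJAX POST; the action returns only a small
// JSON payload, while rendering and error display are handled by the shared
// client-side JavaScript library.
public class ProfileController : Controller
{
    [HttpPost]
    public ActionResult Update(string displayName)
    {
        if (string.IsNullOrWhiteSpace(displayName))
            return Json(new { success = false, error = "Display name is required." });
        // ... apply the change here ...
        return Json(new { success = true });
    }
}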
That was a brief list of topics I'm going to expand on during the next months. I hope you'll like this new kind of content on my blog.

November 16, 2010

Right way to buy DataObjects.Net

Today we got a nice order: someone ordered DataObjects.Net using a pretty old coupon code (upgrade from v3.9) providing a 60% discount. I tried to find it on Google, but failed - i.e. there is no info about it on our web sites. On the other hand, the promo code was still enabled, so the discount was applied.

Initially I was thinking about asking him to make a refund and use one of the current coupon codes. Obviously, this idea was really bad - in fact, it was our own mistake, and he just found it (probably, by accident). Alexis Kochetov proposed a better idea: we should give him an additional 10% discount. So we'll prolong his subscription for an additional 2 months :)

Of course, the gap is closed now :)

November 1, 2010

Link to Jon Skeet's post about new "async" and "await" keywords in C# 5

I just found a great post explaining how the new "async" and "await" keywords in C# 5 really work.

Summary: as you likely know, from the compiler's point of view LINQ is actually a language-integrated sequence monad; the new "async" and "await" keywords in C# 5 bring another one - a language-integrated continuation monad.
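To make the analogy a bit more concrete, here is a small sketch of my own (not taken from Jon's post): the code following an await is, roughly, packaged into a continuation chained onto the task - the actual compiler-generated state machine is more involved, of course.

using System;
using System.Threading.Tasks;

class ContinuationSketch
{
    // Some asynchronous operation returning Task<string>.
    static Task<string> DownloadPageAsync()
    {
        return Task.Factory.StartNew(() => "<html>...</html>");
    }

    // With the new keywords: execution is suspended at 'await' and resumed
    // when the task completes.
    static async Task<int> GetLengthAsync()
    {
        string page = await DownloadPageAsync();
        return page.Length;
    }

    // Roughly the same thing expressed with an explicit continuation:
    // "the rest of the method" becomes a callback chained onto the task.
    static Task<int> GetLengthWithContinuation()
    {
        return DownloadPageAsync().ContinueWith(t => t.Result.Length);
    }

    static void Main()
    {
        Console.WriteLine(GetLengthAsync().Result);            // prints the page length
        Console.WriteLine(GetLengthWithContinuation().Result); // prints the same value
    }
}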



September 1, 2010

When (0+x)*1 != x, or dealing with legacy data

I just remembered that at the beginning of summer Alex Ustinov told me a nice story about legacy data import. At that time he was fighting with importing legacy data from some old .DBF database using SQL Server Data Transformation Services.

He said, "I tried to pull out the data column directly “as is”, but failed - the data type of the column value returned by the provider sometimes differed from the expected one. Probably, that was the result of some bug in the data provider I used, or... Anyway, I immediately remembered that in order to convert a number to an integer data type in VB, you can simply add 0 (zero) to it. So I did this, and discovered that the data import was almost fully fixed: the new script was capable of importing all the rows but one! I repaired this by multiplying the expression by 1 :)".

I instantly imagined the following code:
var target = (0 + source) * 1; // My friend, believe me, this is necessary!

It's ridiculous, of course, but in fact such solutions are quite frequent when we speak about legacy code. It's easier to work around the issue than to understand what kind of bug is there (in the DBF file itself, or in the data provider - who knows?), taking into account that there is almost no issue-related information about either. All this stuff was born before the Internet became really popular - precisely, a million years ago.

P.S. I'm retelling this relying only on my own memory, so there may be some terminological mistakes. Thank god, I didn't deal with VB long enough to forget it :)

August 17, 2010

Concurrency problem :)

Imagine we have a concurrent integer counter that doesn't support an "Increment and Read" operation, but supports two separate operations: "Increment" and "Read". Can it be used to generate sequences of unique integer numbers concurrently, and if so, how?

To be precise, I want to write a program, involving only such a counter for concurrency control, that, being executed concurrently, would produce a sequence of unique numbers in each thread. I.e. I want to ensure that no number is shared among these sequences.
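To pin down the constraint, here is a minimal C# sketch of such a counter (my own illustration, not part of the original puzzle). Note that the obvious "Increment, then Read" sequence does not solve the puzzle by itself: another thread may increment between the two calls, so two threads can read the same value.

using System.Threading;

// The counter from the puzzle: it can be incremented and it can be read,
// but there is no single atomic "Increment and Read" operation exposed.
public class Counter
{
    private int value;

    public void Increment()
    {
        Interlocked.Increment(ref value); // the returned value is NOT exposed
    }

    public int Read()
    {
        return Thread.VolatileRead(ref value);
    }
}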

The answer is here (although I'm not absolutely sure it is correct), but I'd suggest you solve the puzzle on your own first.

Initially it seems the problem is far from practical - nearly any implementation of a concurrent counter supports an "Increment and Read" operation. But it looks like we've found one that doesn't :)

June 10, 2010

Highly recommended application: PivotalTracker

Take a look at it - it's simply a great tool for agile developers.

We started to use it a few weeks ago, and even though there is no good result from the planning and prediction standpoint yet (at least for the DataObjects.Net project), it is really useful anyway. Its main benefit for me is that I always know what's important right now / what tasks I should spend my time on. When you take part in a set of projects and have a certain amount of correspondence, it isn't always easy to do this. Finally, it's quite easy to re-arrange the stories there and see what happens.

PivotalTracker rules.

May 15, 2010

DataObjects.Net v4.3 RC1 with Visual Studio 2010 and .NET 4.0 support is available

See this post for details.

P.S. We've finally decided to publish all the posts related to DataObjects.Net in its own blog - this must definitely be more convenient for the people interested in it. The idea of spreading such posts across personal blogs was deeply wrong :( So from now on I'll be blogging here about everything except DataObjects.Net itself.