Tuesday, January 20, 2015

Enterprise Software & Teams

I can hardly shake off the feeling that software development is often seen as just a household activity, just routine work. It is much more than this. Its potential for innovation is probably greater than that of any other field of engineering. The software could be as small as two lines of code extending already-written functionality, or as huge as a product distributed across different geographical locations and data centres, integrated with multiple systems hosted on various platforms, and, last but not least, backed by a huge team handling operations effectively and efficiently without hampering business continuity.

I started using a few good tools to enhance my experience of working on software. A typical software development setup may include the following tools to improve the developer's experience of working on the product. Let's get into the details of these tools.

  1. Integrated Development Environment (IDE): It's a tool that helps us create our product ecosystem. If you are a Windows fan, then Microsoft Visual Studio .NET would be a good choice. An IDE helps you work on the product seamlessly: you can navigate the files, modify them, build the product, get the binaries, deploy to an application server like IIS, and much more. I am not a Linux fan, hence I do not know much about Linux development tools.
  2. IDE Extensions: These are small plugins that enhance the features of the IDE. They are written to automate the minor, routine activities that the developer performs daily. Extensions help us work on the product while reducing the chances of errors and failures, and they can generate a lot of code for you with just a mouse click.
  3. Servers: I use a build server, an application server, and a database server while working on the product. These servers help me distribute my application load across machine boundaries, thereby giving me the flexibility to change or upgrade the product. Sometimes I use an FTP server to upload my deployment binaries.
  4. Cloud Storage: I use cloud storage for the product's content, which essentially means I store all the static resources in the cloud. It is more economical for me to go with this option because there is every chance that my application may grow over a period of time, and being in the cloud makes it convenient to keep my product available 24x7.
  5. Profilers: I prefer profilers to profile my product. They give us the ability to find bottlenecks or memory leaks while running the application. These are very mature tools, and it is a great feeling working with them. They give us the advantage of finding bottlenecks at a very early stage, which helps us deploy a high-quality production build.
  6. Emulators: These are used to simulate an environment that is separate from the standard computer, e.g. a tablet or phone, an ATM kiosk, or something as complex as a flight cockpit.
  7. Messenger & Mail client: These are helper tools, though not mandatory, for staying in touch with your office friends!


The tools form a great portion of what we invest in product development. Apart from that, we have teams!

A software team can be defined as a group of highly skilled and educated members who can deliver good value to the product. Just as the team is very important for product development, so is its composition. A team is said to be healthy when all the skills required to execute the product are present within it and the members have a good rapport with each other. The concept of back-up teams is also a good idea; their purpose is to step in if the main team is not able to hit the bull's-eye. The team can be distributed or co-located, based on the requirements. The communication channel is an important aspect of the team, and so are its recreation activities within the office premises. The motto of the team is to grow individually as well as a group, personally as well as professionally.

The process used to execute software development and maintenance is another good aspect to look at. Generally, software is developed and maintained in small chunks, and hence the feature definitions keep changing. A prototype of the product can be built easily and evolved over a period of time into a fully functional release. The process needs to be agile in the true sense. Agility is required to decrease the time to market and the total cost of ownership, and to enable teams to handle change requests faster than they ever did before. Working closely with their partners, teams also gain a good understanding of the partner's business and its prospective growth.

Since there is a human factor involved, it becomes very important for us to look after the team's growth and development. Good recreation activities, hangout time, and hobby time are good ways to relax the team away from work. It is an individual choice what form that recreation takes. It's all about good time spent together, even if it is only for a few moments!


There are a lot of changes we see around us, and the software industry is no exception. We see a range of tools available that can make a developer's life easy, and we can certainly opt for a validly licensed copy. At the same time, the standard of education has also risen, so the industry can reap the talent of fresh engineers who have the capability and interest to build their careers in software. More great stuff is yet to come, and we should be well prepared for it!

Friday, December 5, 2014

Customers are friends!
Yes, you read it correctly. Our customers are among our best friends. I have been working with many clients across the globe, from different countries and continents (which is only natural, because I belong to a global industry), and I have found one great characteristic while working closely with my customers and partners: they are friends!
The art of making friends and working for them with a joyous heart makes them feel very close to us. As customers, their primary aim is to build a good rapport with the team, an unknown team to some extent. I used the word "unknown" because the team may be from a different culture and nationality, and hence there is some X-factor in it. Building a good rapport is certainly the trick of a veteran, but anyone who has the potential to make a difference can achieve it. I want to make explicit that this blog post is purely my own understanding of customers and is independent of any group or organization as a whole.
When I was in Bangalore, I happened to visit a seminar on ClearCase. For those who do not know, ClearCase is software for maintaining versions of files and documents. The seminar was free for a few privileged customers, and the arrangements were awesome. No doubt the central idea was to promote usage of the product, but I was very happy with the way their team handled their customers. Their main guests were vendors of a competing product. The first good thing about them was that they handled customer queries very proactively (no talking behind the scenes) and looked passionate about answering them. I wondered: we work in such a competitive world, and still there was no negative campaigning against their competitors' products. I kept thinking about it and concluded that they must have been looking for a few critics of their product, and hence must have invited them.
Customers are friends, and we realize this when they try correcting us at some point of time. It not only helps us grow professionally, but we also get the feeling of being corrected on a positive note. The discussion can be very casual, over a cup of tea in a meeting or while playing a few games with them. The essence of the discussion lies in a few good and positive remarks during this time. When our customers choose us as a vendor, they not only give us the responsibility for their product and business, they also become part of our daily lives. It could be as basic as having lunch with them or cracking an intelligent joke in a meeting, or as advanced as giving them good suggestions that help them revise their product roadmap.
There are a few advantages to getting close to the customers. When we are successful in building trust with them, they may help us grow along with them. They will try to build relationships that may be far better than what we could have achieved using external industrial certifications and accreditations. They certainly need a helping hand, and gradually they feel that we are part of their team, and the relationship grows well! All the negatives that can hamper the business can be eradicated, and we channel ourselves towards a positive relationship; that is where our real success lies. Our real success lies in the success of our customers and in the ability to improve their business. At the same time, we become an integral part of their team, work very closely with them, and gradually realize that we are capable of taking decisions on their behalf, which makes us their partner in the true sense.

Sunday, March 17, 2013


Rackspace Cloud - What lies beneath?

Nowadays, the term 'Cloud' no longer refers only to the cloud that brings rain in the monsoon season!  It also refers to the software technology that brings a lot of benefits over the traditional, conventional methodologies of hosting your software applications.  I consider myself fortunate to have grabbed the opportunity to work on Cloud technologies and to have explored the cloud offerings in their entirety.  Generally speaking, most Indian software companies work on providing consultation and services to their business partners. There are very few organizations in India that have the privilege to make decisions for their business partners (clients).  This is fortunate as well as unfortunate in its own sense.  It can be viewed as fortunate because the organizations can generate regular income by serving their IT partners. It is unfortunate because these companies are highly dependent on their clients to make decisions. They hardly get a chance to explore any technology to its deepest possible level and come up with their own proposals, because their clients have already done the background work and hand over a checklist which has to be followed.  In my view, it restricts the thought process.  Well, that's a separate topic of discussion. :)

As I said, I was fortunate to grab the opportunity to explore cloud technologies in their entirety only because of my personal craving to explore the word 'Cloud'.  I was part of a team that used Rackspace Cloud for its daily deployments and maintenance.  Sometimes I wonder: why the word 'Cloud'?  I thought a lot about it and finally got a convincing answer from my own mind: a cloud can be seen from everywhere, from any part of the world, hence the creators might have chosen the word 'Cloud'!  This is analogous to the way we expect our cloud-based applications to behave.  The owner of the application expects it to be accessible from any part of the world and to be up and running most of the time, without much effort spent on maintenance.  I am writing this post to enable our community to make the most of cloud technologies. I will be referring to Rackspace Cloud unless explicitly mentioned otherwise.

For those who do not have any background on Rackspace Inc., it is a firm that provides cloud infrastructure to its customers.  It has great features embedded in its control panel application to administer and monitor cloud-based applications.  Just like any other technology, the cloud also follows specific sets of RFCs.  An RFC (Request for Comments) becomes the input for any provider implementing a new technology, so that consistency is maintained between the implementations of different vendors.  The taxonomy and terminology may differ a bit in name, but they help us achieve the same functionality across vendors.  Some vendors may take a patent or register the name of a specific implementation to uniquely identify their brand, e.g. the design and shape of the grilles near the bonnet of a car are patented by the manufacturer so that anyone can identify the manufacturer by looking at them!

The cloud can be viewed as a large set of Virtual Machines (VMs) spread across the world, using millions of gigabytes of space and many processors.  It truly works on the concept of virtualization, wherein the storage is spread across many computers around the world.  CPU is allocated on a need basis, and so is hard-disk space.  Hence the main units of billing are the CPU usage and the hard-disk space used, known virtually as Computing Cycles (CC) and Storage. We will discuss these in more detail towards the end of this post.

Let's jump back to the Rackspace Cloud.  I assume the reader of this post has basic knowledge of deploying a .NET-based application on a local IIS server.  The Rackspace Cloud provides many features that a deployment engineer can make use of, the main ones being Cloud Hosting, Cloud Files, Cloud Servers, and Load Balancers.  Let's discuss them one by one.

  1. Cloud Hosting:  This feature enables us to create an instance of our website.  When we want to host our website on the cloud, we select this option.  The choice depends on the web server you would like to use, e.g. for .NET-based applications you may use Windows/IIS, while for PHP/Java you may go for Apache.  The deployment engineer needs to make an appropriate choice of the platform the application targets.  The engineer then creates the instance of the website and a published copy of the website.  The published copy contains the binaries (DLLs) and the views (.aspx pages).  Rackspace recommends uploading the contents of the published folder via FTP into a specific folder named 'Content'.  Once the Content folder is stuffed with the binaries, it takes some time for the cloud engine to reflect the changes.  Meanwhile, if you are an IIS user, you should restart the application pool using the 'Rebuild Application' option available in the Rackspace Control Panel console.  This option force-restarts the application pool so that the new binaries are loaded into the process and the changes get reflected on the website sooner.
     
  2. Cloud Files:  This feature enables the storage of files in the cloud.  Rackspace allows logical grouping of files into what are known as Containers (Amazon S3 calls them Buckets).  Containers are a virtualization over the physical hard drives, meaning a single container can spread across multiple disk drives.  A container also serves as a unit of isolation, meaning the contents of one container cannot be overwritten or overridden by the contents of another; they are independent of one another.  The cloud vendors also provide an API key to manipulate the files programmatically: a developer can upload, download, view, and delete files using it.  Programmatic access follows all the basic rules of cloud access, including the concept of containers (buckets in Amazon S3).  Cloud Files works on the principle of a CDN (Content Distribution Network), which enables the cloud engine to generate a unique path for every file uploaded to the cloud.  The file can then be accessed from anywhere in the world over the internet using its "cdn" path.
     
  3. Cloud Servers: This feature enables the user to create instances of servers in the cloud.  Conventionally, vendors keep servers on their own premises.  That setup needs additional effort to maintain the server, and the organization may need to draft backup plans in case the server crashes.  Rackspace frees us from all these worries.  This facility enables the user to create server instances in the cloud and use them round the clock; the overheads are taken care of by Rackspace.  Rackspace provides several options for creating a server instance, on any platform (all flavours and versions of Windows and Linux).  There are several pre-defined server images available in the Control Console that facilitate easy and fast creation of new server instances.

  4. Cloud Load Balancers: This feature enables the user to create load balancers for their servers.  I need to explore this feature in more detail and will update this post once I gain enough competency in this area.
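The programmatic access to Cloud Files mentioned above can be sketched as a raw HTTP call, since the file API is REST-based. In the sketch below, the storage URL, container name, file name and token are all illustrative, not real endpoints; in practice the API key is first exchanged for an auth token, which is then sent in a Swift-style X-Auth-Token header:

```csharp
using System;
using System.IO;
using System.Net;

class CloudFilesUpload
{
    static void Main()
    {
        // Illustrative storage URL and token; both would come back from
        // the vendor's authentication call in a real deployment.
        string objectUrl = "https://storage.example.com/v1/ACCOUNT/my-container/logo.png";
        string authToken = "TOKEN-FROM-AUTH-SERVICE";

        var request = (HttpWebRequest)WebRequest.Create(objectUrl);
        request.Method = "PUT";
        request.Headers["X-Auth-Token"] = authToken;  // auth header for the storage API

        byte[] payload = File.ReadAllBytes(@"C:\site\logo.png");
        request.ContentLength = payload.Length;
        using (Stream body = request.GetRequestStream())
        {
            body.Write(payload, 0, payload.Length);   // push the file bytes
        }

        using (var response = (HttpWebResponse)request.GetResponse())
        {
            // A "Created" status means the object is stored and reachable
            // from anywhere via its CDN path.
            Console.WriteLine(response.StatusCode);
        }
    }
}
```

Downloading or deleting a file is the same pattern with the GET or DELETE verb against the same object URL.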


The metrics used by Rackspace for billing are Computing Cycles, Storage, and the number of server instances.  These are the basic metrics used by Rackspace to generate the invoice for its clients and users. Let's discuss these terminologies in detail.

  1. Computing Cycles:  The cloud works entirely on the concept of Virtual Machines.  These VMs use several CPUs (processors) that can be spread across different machines. Hence, there is a need to virtualize the notion of CPU usage. This is known as Computing Cycles (CC): the total amount of CPU cycles consumed by the user's tasks, summed across all the available CPUs.
  2. Storage:  The cloud uses the concept of virtual storage to identify the amount of storage a specific deployment of the application uses.  It also includes the amount of storage used by Cloud Files.
  3. Server Instances:  The total number of servers created by the user, along with the platform information of each server.

The above information can be accessed by the user at any point of time, and the invoice is generated based on usage. There are several plans available from Rackspace, including pay-as-you-go and unlimited usage.  The unlimited-usage plan gives 500 GB of Storage and around 19,000 Computing Cycles (CC) per month (these numbers are not final and are likely to change as decided by Rackspace from time to time).  The cloud vendor guarantees 99.99% uptime and zero maintenance overhead, with a great disaster-recovery mechanism. This matches my experience working with Rackspace.

Cloud - Manage at one place, access anywhere!

Tuesday, February 5, 2013

|| Kyuki Gyan hi aapko apna asli hak dilata hai (Because only Knowledge can enlighten you) ||

Gyan - a four-letter word - your eternal voice.  Gyan, or knowledge, enables us to achieve something great. Gyan - to know where we want to go. Gyan - to communicate. Gyan - to achieve the highest abode. Gyan - to make ourselves feel confident. Gyan - to ease the hurdles of life.  With the help of knowledge even the toughest challenge seems very easy; however, the process, the journey of gaining knowledge, may be tough in its own right.

For the folks who do not know, the punch line "kyuki Gyan hi aapko apna hak dilata hai" has been taken from the biggest Indian reality show, "Kaun Banega Crorepati" (the remake of "Who Wants to Be a Millionaire", an American series). The show is hosted by the biggest superstar of the Indian film industry, Amitabh Bachchan. If I take the privilege of giving a brief about him, I would say only one thing: even if he did not speak at all in a film, the expressions in his eyes would tell the entire story. He is a maestro.  As for the show itself, it puts your general knowledge to a tough test and awards you big prize money if you answer the questions correctly. Everyone loves him and his show, including me.  I watch the show almost regularly, except on the days I need to stay late at the office talking to my client to earn my bread and butter!  I miss it on those days because my TV is not yet equipped with a set-top box to record his shows. We all love him.

Gyan is not a modern word; its definition has not changed since we grew up hearing it. I am writing this blog to promote the acquisition of Gyan in my field, Software Engineering.  Software development is among the top fields that require innovation and a lot of knowledge of computers; it is a field that creates computer programs to ease your day-to-day activities.  This blog post mainly deals with the importance of acquiring knowledge in the software field and putting it to use in the real world.  I would be happy to cite my personal experiences and the methodologies that I followed during my career to achieve respectable knowledge in the software industry.

Starting with my college days: the first program that I wrote in the C/C++ language (I guess that was the most popular programming language during 2004-05) was to find the prime numbers between 1 and 10000.  Though this program makes no sense in the contemporary real world and carries no business value, it was an opportunity for me to put my mind into words. The program had a lot of defects which I couldn't foresee before writing it.  Only when I wrote it did I realize what went wrong, where, and how.  It gave me an opportunity to fix the defects which my brother had already pointed out when he tested it!  The naah-nah and yeah-yeah went on for a very long time.  I really wanted to sustain the arguments my BigB (elder brother) pulled me into, but in vain. Yes, it was almost in vain: I could not deliver what he was expecting from me.  A good-quality prime number finder??  I then worked very hard to get things resolved.  Finally I saw him showing me his right thumb as a token of success, and I took a deep breath of relief.  Gyan enabled me to enrich my program. The knowledge gained from fixing defects in my 50-line program helped me achieve what I was expecting. I stuck to the problem and was able to bring it to closure: a happy ending!  I had the flexibility to search Google for an answer, but unfortunately the code I found was in VB, and I did not know VB at that time.  I also did not know about the freely available code converters that could convert ready-made VB code to C++.  So I wrote it from scratch and published it. While working on this program, I not only learnt how to find prime numbers but also gained supplementary knowledge that I could apply to other programs written at a later point of time.  Knowledge is about putting in a lot of effort to achieve great things.
I remember the words of my grandma saying "It might be very easy to appease Lakshmi (the goddess of Wealth) but very very difficult to appease Saraswati (the goddess of Knowledge)".  It requires perseverance, diligence and focus.

The process of writing C++ code was not enough, and I faced a new challenge: writing the same program in assembly language (for the 8086 microprocessor).  Oh God!  The instruction codes would not fit in my mind, and I was fed up with all the JC, JE kinds of instructions.  Gyan again pitched in. I engrossed myself in learning the new (yet old) programming language for the sake of converting my program into assembly. I am pleased to see that modern-age compilers do it for free now!

Like others, I also wanted to reap the fruits of my knowledge almost instantly, but it never happened.  It might be because the knowledge I had initially was superficial or inaccurate.  Gradually, as I progressed along the path of life, I found that the knowledge I had gained early on helped me solve many questions and puzzles; the only thing I had missed was polishing it.  I found this very interesting and started acquiring more and more.  Really, we never know when we will get the chance to apply the knowledge we acquired in the past.  The best way to retain it is to keep practicing it till it gets a comfortable seat in your mind. Knowledge in any form should be welcomed without analyzing the degree of success we can achieve by using it. Sometimes we put a lot of effort into gaining knowledge and get frustrated when we cannot reap the fruits instantly.  Never mind!  Knowledge never goes in vain; it will be used somewhere, at some point in the rest of your life, and can fetch you better fruits.  We all know the story of the two woodcutters who earned their living by cutting wood in the forest. One of them yielded better results because he used to sharpen his axe every day.  It was the knowledge of how the axe works, and of what is needed to make it work better, that helped that woodcutter get better results.

The next question that comes to my mind is: how do I keep myself interested in gaining knowledge? The answer is to keep craving to polish your knowledge; there is always scope to polish it, in whatever form you have it. Studies suggest that a person may tend to stop acquiring knowledge when he gets bored. Similarly, organizations may not succeed in their goals and vision if their employees complain that they are getting bored. The best way to keep ourselves motivated is to ask ourselves a question: "What did I learn today?"  The answer will surely make us more active, and our attitude towards the journey of acquiring knowledge will change drastically.  Sometimes we get bored of our routine life because we fail to notice the interesting aspects of it.  When I wake up and realize it is the same Monday that comes every week, with the same status-update calls in the evening, I tend to lose my enthusiasm for reaching the office.  The situation is different when I realize that I am going to use a new technology or a new algorithm today to solve the problem at hand.  This can also be explained in terms of science: when we are excited, our body tends to release more adrenaline, which gives us a natural thrust towards doing something creative.

These were a few instances of Gyan playing an important role in keeping us motivated and free from bad happenings.  The process of achieving something great is lengthy and may take more effort than we estimate. The process of acquiring Gyan never ends; it keeps on growing. The modern world is equipped with lots of technology that seems to be growing at a very rapid pace, and Gyan is the only mechanism that can lead us to explore every aspect of our day-to-day life.

I took some very fundamental examples from my life of learning.  I feel that learning never ends, and in this I second many great, learned spiritual preachers.  We know that we need knowledge to unlock the miseries and mysteries of the darkness of life.  The enlightenment of life can happen only with the proper blend of knowledge and experience.  Life tends to give you its best when we know the importance of knowledge and oppose ignorance.  The barriers of life and society can be eradicated only with the help of true knowledge.  Coming back to KBC, I heard Amitabh Bachchan say "Gyan kisi ki jaagir nahi hai.. woh to sirf bhatko ko raah dikhata hai" (meaning knowledge is not anyone's property; it simply shows the way to those who have lost theirs). Yes, kyuki Gyan hi aapko apna hak dilata hai!

Sunday, January 6, 2013


REST-based WCF services – and the masters of the universe!

The WCF framework has been around for quite a while now. Microsoft has amended the WCF framework to expose WCF services as REST-based services using special attributes. A REST service can be accessed as easily as any URL. Let's have a look at some practical applications of REST-based services. An explanation of the basics of REST is out of the scope of this post; however, the information is readily available on external websites.

Creating REST based service using Visual Studio.NET
The creation of a REST-based service is very similar to that of a WCF service. A WCF service can be made REST-based by applying the [WebGet] attribute to a WCF service operation. WebGet takes a parameter named UriTemplate that defines the URL format used to access the operation.
The REST-based service uses webHttpBinding for the service endpoint.

Server:
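A minimal server-side sketch in C#; the contract, operation and URL-template names here are illustrative, not from any real service:

```csharp
using System;
using System.ServiceModel;
using System.ServiceModel.Web;

// Illustrative contract; [WebGet] maps the operation onto an HTTP GET URL.
[ServiceContract]
public interface IGreetingService
{
    [OperationContract]
    [WebGet(UriTemplate = "greet/{name}", ResponseFormat = WebMessageFormat.Json)]
    string Greet(string name);
}

public class GreetingService : IGreetingService
{
    public string Greet(string name)
    {
        return "Hello, " + name + "!";
    }
}

class Program
{
    static void Main()
    {
        // WebServiceHost wires up webHttpBinding and the web behavior for us.
        var host = new WebServiceHost(typeof(GreetingService),
                                      new Uri("http://localhost:8000/"));
        host.Open();
        Console.WriteLine("Service is up. Try GET /greet/World");
        Console.ReadLine();
        host.Close();
    }
}
```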


Client:
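On the client side, no WCF plumbing is strictly required; a plain HTTP GET against the service's UriTemplate is enough. The URL below is illustrative:

```csharp
using System;
using System.Net;

class Client
{
    static void Main()
    {
        // WebClient issues a plain HTTP GET; the service replies with the
        // serialized (JSON) response packet.
        using (var client = new WebClient())
        {
            string response = client.DownloadString("http://localhost:8000/greet/World");
            Console.WriteLine(response);
        }
    }
}
```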



REST Service: “I have the power of the universe!”
A REST-based service can seamlessly integrate any client with the server. The client could be a .NET client or a smartphone application. For this post, let's discuss how a smartphone client can integrate with the REST-based endpoint of a WCF service. There are various platforms available for programming smartphone applications; the most widely used among them are iOS, Android and Windows Phone. These platforms provide a wide range of libraries to connect to web-service endpoints.  A typical phone application sends a request to the service and receives data from the server in a serialized format (JSON or XML) known as the response packet. The phone application deserializes the JSON/XML packet and displays the data. The entire scenario is a very good example of a disconnected application involving a client and a server.
The phone application can also perform CRUD operations using the GET/POST verbs on the REST-based endpoint. A typical scenario looks as follows:

1.  GET (Used to fetch the records from the database):
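A hedged sketch of such a fetch from a .NET-style client; the endpoint URL and the Customer record type are assumptions for illustration (the server would expose a matching [WebGet] operation returning JSON):

```csharp
using System;
using System.Net;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Json;

[DataContract]
public class Customer
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Name { get; set; }
}

class GetExample
{
    static void Main()
    {
        // Illustrative endpoint; GET fetches one record as JSON.
        var request = (HttpWebRequest)WebRequest.Create("http://localhost:8000/customers/42");
        request.Method = "GET";
        using (var response = request.GetResponse())
        {
            // Deserialize the JSON response packet into a typed object.
            var serializer = new DataContractJsonSerializer(typeof(Customer));
            var customer = (Customer)serializer.ReadObject(response.GetResponseStream());
            Console.WriteLine(customer.Name);
        }
    }
}
```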


2.  POST (Used to insert the record into the database)
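On the server, a POST operation is marked with [WebInvoke(Method = "POST")] instead of [WebGet]; the client side can then be sketched as below, with an illustrative URL and JSON body:

```csharp
using System;
using System.IO;
using System.Net;
using System.Text;

class PostExample
{
    static void Main()
    {
        // Illustrative endpoint; POST sends a new record as a JSON body.
        var request = (HttpWebRequest)WebRequest.Create("http://localhost:8000/customers");
        request.Method = "POST";
        request.ContentType = "application/json";

        byte[] body = Encoding.UTF8.GetBytes("{\"Id\":0,\"Name\":\"Asha\"}");
        request.ContentLength = body.Length;
        using (Stream stream = request.GetRequestStream())
        {
            stream.Write(body, 0, body.Length);
        }

        using (var response = (HttpWebResponse)request.GetResponse())
        {
            Console.WriteLine(response.StatusCode);  // OK/Created on success
        }
    }
}
```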


A REST-based service is a very easy way for client applications written on different platforms to communicate with the server and receive data from it. This communication becomes possible through serialization of the data. It is analogous to accessing a web service using SOAP request and response packets: the web service uses XML serialization, whereas phone applications widely use JSON serialization.  REST-based services have gained ground in server programming because they are easier to integrate with phones and hand-held devices. Now it is very easy for your phone to answer questions like "find me the Thai restaurants near my area", or to write a few reviews of a movie from your phone which can then be seen on the website.  There is a lot more to come in this space!

Stay connected!

Saturday, July 14, 2012

Selecting a high-end software development computer

Computers can be tagged as multipurpose devices; nowadays they are used in every profession. There have also been lots of changes in the way computers are built, and their overall architecture evolves very rapidly. Someone from a non-IT background asked me about the ideal configuration of a computer he could own, and my answer was "it depends". The requirements and the usage scenario matter a lot when selecting a computer or a notebook. For example, a journalist may need a computer only to keep track of notes and upcoming unpublished articles; a basic configuration is enough to satisfy his needs. On the other hand, a software developer may need an advanced configuration, because he is expected to run a whole lot of programming software, some of it in the background. Once we narrow down our usage scenarios, it becomes very easy to select a machine from an authorized reseller. Let's discuss each component of the computer and decide upon the ideal configuration. We will also look at some of the key terminologies used by hardware vendors to market their products.

Microprocessor (Category: Performance): As we all know, it is the heart of the computer. It coordinates the operations of all the hardware devices present on the mainboard. For a developer, a sensible minimum would be around 3 GHz per core, because development tools are getting more feature-rich by the day and consume a lot of CPU cycles to give their users a smooth experience. The processor's L3 cache also contributes to speed to some extent. The number of cores is just as important as the individual core speed, because many programs run at the same time: Windows services in the background, real-time anti-virus scanning, Outlook for email, some music, the developer's tools, and so on. These programs perform better, and more predictably, when the scheduler can spread them across separate cores. A processor with simultaneous multithreading (Intel's Hyper-Threading) performs even better: more than one hardware thread can run on the same CPU core, utilizing it more fully. If the development scenario requires virtualization, choose a processor with hardware virtualization support (Intel VT or AMD-V); these provide better performance, reliability, and security when working with virtual machines. Processors are also categorized as 32-bit or 64-bit. A 64-bit processor is required when programs need a large address space, e.g. opening a 3 GB log file from a production environment, attaching large files to an email, or loading lots of plugins that extend the Visual Studio IDE.

This gives us an ideal development-machine processor configuration: 3 GHz, x64, 4 cores/8 threads, VT-enabled.

Physical memory – RAM (Category: Performance): A development machine needs more RAM than a non-development machine, because it runs many programs at once, including virtual machines. I would put 8 GB as the bare minimum for a high-end development computer. The memory speed is equally important so that the RAM does not hold back the processor: an 800 MHz module wastes more CPU cycles on stalls than 1066, 1333, 1600, 1866, or 2133 MHz modules. A related metric is the bus-to-core ratio, which relates the speed of the memory bus to the clock generated by the processor; for a developer machine, keep it at the maximum compatible value. Older SDR memory transferred data on only one edge of the clock signal; DDR (double data rate) memory transfers on both the rising and falling edges, which is where the "DDR" in DDR2 and DDR3 comes from. DDR3 outperforms DDR2 not because of the clock edges, which both use, but because it runs at higher clock rates with a deeper prefetch and a lower operating voltage. A development machine is noticeably more responsive running its software on DDR3 RAM.
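The effect of those module speeds can be made concrete with the standard theoretical-bandwidth arithmetic (transfers per second times bus width); the numbers below assume a single 64-bit memory channel:

```csharp
using System;

public static class MemoryBandwidth
{
    // Theoretical peak bandwidth in GB/s: transfers per second * bus width in bytes.
    public static double PeakGBs(double megaTransfersPerSec, int busWidthBits)
        => megaTransfersPerSec * 1e6 * (busWidthBits / 8) / 1e9;

    public static void Main()
    {
        // DDR3-1333 on a 64-bit channel: 1333 MT/s * 8 bytes ~ 10.7 GB/s,
        // versus ~6.4 GB/s for an 800 MT/s module.
        Console.WriteLine(PeakGBs(1333, 64));
        Console.WriteLine(PeakGBs(800, 64));
    }
}
```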

This gives us a RAM configuration of DDR3, 8 GB, 1333 MHz or faster.

Hard drive (Category: Performance): Capacity depends on how much software the development machine is expected to store, but the drive also contributes greatly to the machine's performance. Based on their architecture, drives fall into two types: HDD and SSD. An HDD stores data on rotating magnetic platters, and the data is read back by heads sensing the magnetic flux as the platters spin. An SSD has no moving parts; it stores data in flash memory chips. SSDs are roughly an order of magnitude faster than HDDs at random access, but cost several times more per gigabyte. During operation, the major bottleneck is often reading data from the disk, for instance on page faults. An SSD speeds up these fetches considerably and makes the machine noticeably more responsive during software development.

This gives us a hard drive configuration of a 120 GB SSD.

Notebook screen (Category: Display): The screen should be non-reflective for software development, otherwise it cannot be used comfortably in natural light. Non-glossy screens do render colors in a duller shade, but I feel this matters least in a development scenario, because software development is mostly about writing code rather than building a user interface. This may not hold true for content writers or graphics designers, who may have to opt for glossy screens.

This gives us a screen configuration of a 15.6” matte-finish display.

Graphics processing unit (Category: Display): The GPU matters least in a software development machine, so select the default option.

Keyboard (Category: Operation): The keyboard is an equally important component when selecting a development machine. Make sure the key layout matches the one you are using right now, e.g. that the Delete or Insert key has not moved, otherwise you will spend extra effort relearning the location of each key. A backlit keyboard helps if you are going to work in the dark. A separate numeric pad may or may not be present, as it is seldom used during development. Another important requirement is the function keys: they should not share keys with the multimedia buttons, otherwise you have to press Fn along with the function key while debugging or applying any shortcut that uses them.

Speakers (Category: Operation): These matter only if you are a music fan. Select the default, as there are rarely other options.

Number of USB ports (Category: Operation): The number of USB ports matters a lot, because many developers habitually use an external mouse and keyboard, a USB modem, USB headsets, a printer or scanner, and USB device chargers. It is advisable to get as many ports as possible; the ideal count is 4-5.

The excellence of a development machine lies in its individual components, so they should be picked with utmost care. The developer should avoid being swayed by marketing tag lines promising the fastest computer on the planet. Instead, look at the individual components, compare them across brands, and select the machine that suits your requirements. It is also true that the most advanced computer of today will be outdated three years down the line and will need an upgrade or a new purchase. I hope this post covers everything to look for while purchasing a new development machine.

Friday, July 6, 2012

Multithreading & parallel processing

Multithreading is the capability, exposed through the programming language and the operating system, that lets a developer do parallel processing in software. You can listen to music while coding! The operating system creates a process for each task to perform, and each process is scheduled to run on the processor. A process can have one or more threads, and the execution logic can be divided across multiple threads for concurrent execution. For example, a data-access thread can fetch data from the database while the UI thread populates the results on screen, reducing the user's wait time. Programming languages provide many APIs to create and manage threads. In this blog post, let's discuss threads and the paradigm of parallel programming.


In computer science terminology a thread of execution is the smallest unit of processing that can be scheduled by an operating system. A thread is a lightweight process. The implementation of threads and processes differs from one operating system to another, but in most cases, a thread is contained inside a process. (ref Wikipedia)

In terms of a program's structure, there are two kinds of thread: the main thread and worker threads. The program starts in the main thread. Worker threads are created from the main thread to share its workload, and each worker thread is joined back to the main thread so that its result can be consumed there. The figure below depicts this scenario in a multithreaded application.



The figure shows the main thread spawning worker threads to execute code in parallel. The worker threads execute their logic independently of each other and finally join the main thread, which then proceeds with further execution. The main thread is blocked until both worker threads have joined. On a multi-core machine, the worker threads can be allocated separate CPU cores and each run at full core speed; this is the main reason a multithreaded application can outperform a single-threaded one. The sample C# code for a multithreaded application is as shown below:
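The original listing did not survive in this archive; below is a minimal reconstruction consistent with the description that follows: two threads named `thread` and `thread1`, each bound to a worker method and started with `Start()`. The `completed` counter is my own addition so the result is observable:

```csharp
using System;
using System.Threading;

public static class MultithreadDemo
{
    static int completed;

    static void Worker()  { Console.WriteLine("worker running");  Interlocked.Increment(ref completed); }
    static void Worker1() { Console.WriteLine("worker1 running"); Interlocked.Increment(ref completed); }

    public static int Run()
    {
        completed = 0;

        // Create two worker threads, each bound to a worker method.
        Thread thread  = new Thread(Worker);
        Thread thread1 = new Thread(Worker1);

        // Start() begins execution; the order in which the workers run is non-deterministic.
        thread.Start();
        thread1.Start();

        // The main thread blocks here until both workers have joined back.
        thread.Join();
        thread1.Join();
        return completed;
    }

    public static void Main() => Console.WriteLine("workers completed: " + Run());
}
```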



The main thread creates two threads, viz. thread and thread1. Each has a worker method associated with it, and each begins executing when Start() is called on it. The sequence in which the threads start and complete is non-deterministic; it depends on the operating system's scheduler and the thread management done by the CLR.

.NET framework methods to operate on a thread:

  1. Start - Starts the thread execution
  2. Suspend - Suspends the thread execution
  3. Resume - Resumes the execution of a suspended thread
  4. Abort - Terminates the thread

Note that Suspend, Resume, and Abort have since been deprecated in .NET, because suspending or aborting a thread at an arbitrary point can leave locks and shared state inconsistent; cooperative cancellation is the recommended replacement.

Race condition:

The non-deterministic execution of threads can result in what is often referred to as a race condition. In a race condition, the final output varies depending on the sequence in which the threads execute, so every run of the program may produce a different output. Race conditions can be avoided by using synchronization techniques between the producer and consumer threads.

Inter-thread and inter-process communication:

As discussed in the sections above, a thread is an independent unit of execution. A worker method is expected to have all the data it needs to execute independently and flawlessly. However, there are scenarios where one worker thread depends on data generated by another, so there needs to be a communication channel between them. This mechanism is known as inter-thread or inter-process communication. Wikipedia explains it as: "In computing, inter-process communication (IPC) is a set of methods for the exchange of data among multiple threads in one or more processes". There are various types of IPC methods: message passing, synchronization, pipes, shared memory, etc.

  1. Message passing: The mechanism that lets threads exchange messages with each other. The messages should follow a predefined schema so that consumer threads can interpret them. A typical example of message passing is MSMQ, which stores messages in XML format; the producer and consumer must agree on that format so the consumer can read the messages the producer generates.

  2. Synchronization: The mechanism by which threads coordinate with each other so that every thread uses a fresh and accurate copy of the data, avoiding discrepancies in the final output.

  3. Pipes: This methodology sequences thread execution and passes the result of one thread as input to the next.

  4. Shared memory: This methodology reserves an area of physical memory where threads can spool data to be shared with other threads. The memory area needs to be protected so that only one thread enters it at any point in time; such protected regions are often referred to as critical sections.
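As an in-process analogue of MSMQ-style message passing, here is a hedged sketch using `BlockingCollection<T>` (available since .NET 4) as the message channel between a producer and a consumer thread; the names and the integer "messages" are my own illustration:

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

public static class MessagePassingDemo
{
    public static int Run()
    {
        // A thread-safe queue acts as the message channel between the threads.
        var channel = new BlockingCollection<int>();
        int sum = 0;

        var consumer = new Thread(() =>
        {
            // GetConsumingEnumerable blocks until messages arrive and
            // ends when the producer marks the channel complete.
            foreach (int message in channel.GetConsumingEnumerable())
                sum += message;
        });
        consumer.Start();

        // Producer: post three messages, then signal that no more will come.
        channel.Add(1);
        channel.Add(2);
        channel.Add(3);
        channel.CompleteAdding();

        consumer.Join();   // Join guarantees the consumer's writes are visible here.
        return sum;
    }

    public static void Main() => Console.WriteLine("consumed total: " + Run());
}
```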

Let's discuss synchronization and shared memory concepts in details.

Thread synchronization: Thread synchronization is the mechanism by which a thread suspends or resumes its processing based on the state of other threads. It helps the thread operate on the correct copy of the data generated by the other threads. This is achieved by ensuring that only one thread enters the critical section and operates on the data while the other threads wait (block) for it to exit. On exiting the critical section, the thread signals the waiting threads so that one of them gets the chance to enter and operate on the data. There are several mechanisms for thread synchronization, viz. mutex, semaphore, and monitor.

Types of thread synchronization:

Thread synchronization helps to maintain thread safety. The various methods to achieve it are as follows:

  1. Mutex: A mutual-exclusion object used to stop other threads from entering the critical section. A thread acquires the mutex on entering the critical section and releases it on exiting; the other threads wait on the mutex and get their chance to enter once the owning thread releases it. Conceptually, a mutex behaves like a boolean flag with ownership semantics, though real implementations block waiting threads rather than busy-wait (a spinlock is the busy-waiting variant).

  2. Semaphore: A generalization of the mutex that maintains an available count of resources. Because the number of resources that can be allocated is predefined, over-allocation and the resulting races are prevented. These are known as counting semaphores.

  3. Monitor: In concurrent programming, a monitor is an object or module intended to be used safely by more than one thread. The defining characteristic of a monitor is that its methods are executed with mutual exclusion: at each point in time, at most one thread may be executing any of its methods. This mutual exclusion greatly simplifies reasoning about the implementation of monitors compared to reasoning about parallel code that updates a data structure. (ref: Wikipedia)
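The counting-semaphore idea above can be sketched with .NET's `SemaphoreSlim`: five threads compete for a section that admits at most two at a time. The bookkeeping (`current`, `maxObserved`) is my own addition to make the limit observable:

```csharp
using System;
using System.Threading;

public static class SemaphoreDemo
{
    public static int Run()
    {
        // A counting semaphore admitting at most 2 threads into the section at once.
        var semaphore = new SemaphoreSlim(2);
        int current = 0, maxObserved = 0;
        object gate = new object();

        Thread[] threads = new Thread[5];
        for (int i = 0; i < threads.Length; i++)
        {
            threads[i] = new Thread(() =>
            {
                semaphore.Wait();               // blocks if 2 threads are already inside
                lock (gate)
                {
                    current++;
                    if (current > maxObserved) maxObserved = current;
                }
                Thread.Sleep(20);               // simulate work inside the section
                lock (gate) { current--; }
                semaphore.Release();
            });
            threads[i].Start();
        }
        foreach (var t in threads) t.Join();
        return maxObserved;                     // never exceeds the semaphore count
    }

    public static void Main() => Console.WriteLine("max concurrent threads: " + Run());
}
```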

.NET framework implementation of the thread synchronization and thread safety

  1. Lock: This keyword ensures that only one thread at a time enters the critical section. A lock statement is internally compiled into calls on the Monitor class, and it operates on a reference-type object used as the lock token. The sample code may look as follows:
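The original snippet is missing here; a minimal sketch of the pattern, with the names `_sync` and `_counter` being my own:

```csharp
using System;
using System.Threading;

public static class LockDemo
{
    private static readonly object _sync = new object();  // lock token (reference type)
    private static int _counter;

    public static int Run()
    {
        _counter = 0;
        Thread[] threads = new Thread[4];
        for (int i = 0; i < threads.Length; i++)
        {
            threads[i] = new Thread(() =>
            {
                for (int j = 0; j < 1000; j++)
                {
                    // Only one thread at a time executes this critical section;
                    // the compiler expands lock into Monitor.Enter/Exit.
                    lock (_sync)
                    {
                        _counter++;
                    }
                }
            });
            threads[i].Start();
        }
        foreach (var t in threads) t.Join();
        return _counter;   // always 4000; without the lock, increments could be lost
    }

    public static void Main() => Console.WriteLine(Run());
}
```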



  2. Events: Events are signals communicated between two threads. An event has two states: signaled and un-signaled. A thread that waits on an un-signaled event blocks until another thread signals it by calling Set(); being released typically means the previous thread has left the critical section and the waiter may proceed. There are two kinds of synchronization events in .NET:
    1. AutoResetEvent: When a thread calls Set(), exactly one waiting thread is released, and the event then returns to the un-signaled state automatically. Set() must be called again to release the next waiting thread.
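The original snippet is missing here; a minimal sketch of one thread handing a result to another through an `AutoResetEvent` (the field names and message are my own):

```csharp
using System;
using System.Threading;

public static class AutoResetDemo
{
    static readonly AutoResetEvent _ready = new AutoResetEvent(false); // starts un-signaled
    static string _message;

    public static string Run()
    {
        var producer = new Thread(() =>
        {
            _message = "data ready";
            _ready.Set();            // releases exactly one waiter; the event then
                                     // resets to un-signaled automatically
        });
        producer.Start();

        _ready.WaitOne();            // the consuming thread blocks here until signaled
        producer.Join();
        return _message;
    }

    public static void Main() => Console.WriteLine(Run());
}
```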



    2. ManualResetEvent: When a thread calls Set(), the event stays signaled and releases all waiting threads; it remains signaled, letting any later waiters proceed immediately, until some thread explicitly calls Reset() to return it to the un-signaled state.
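A small sketch of the contrast with AutoResetEvent: one `Set()` on a `ManualResetEvent` releases every waiting thread at once (names are my own illustration):

```csharp
using System;
using System.Threading;

public static class ManualResetDemo
{
    public static int Run()
    {
        var gate = new ManualResetEvent(false);  // starts un-signaled
        int released = 0;

        Thread[] waiters = new Thread[3];
        for (int i = 0; i < waiters.Length; i++)
        {
            waiters[i] = new Thread(() =>
            {
                gate.WaitOne();                  // all three block here
                Interlocked.Increment(ref released);
            });
            waiters[i].Start();
        }

        // One Set() releases ALL waiting threads; the event stays signaled
        // until Reset() is called explicitly.
        gate.Set();
        foreach (var t in waiters) t.Join();
        return released;
    }

    public static void Main() => Console.WriteLine(Run());
}
```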


  3. Monitor: The monitor is used via the Enter() and Exit() methods of the Monitor class in .NET. The critical section is placed between Enter() and Exit() so that the monitor ensures only one thread enters it at any given point in time.
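The original snippet is missing here; a minimal sketch of Enter()/Exit() guarding a shared balance, with the Exit() placed in a finally block so the lock is released even if the critical section throws (names are my own):

```csharp
using System;
using System.Threading;

public static class MonitorDemo
{
    static readonly object _sync = new object();
    static int _balance;

    static void Deposit(int amount)
    {
        Monitor.Enter(_sync);        // only one thread at a time past this point
        try
        {
            _balance += amount;      // critical section
        }
        finally
        {
            Monitor.Exit(_sync);     // always release, even if the section throws
        }
    }

    public static int Run()
    {
        _balance = 0;
        var t1 = new Thread(() => { for (int i = 0; i < 1000; i++) Deposit(1); });
        var t2 = new Thread(() => { for (int i = 0; i < 1000; i++) Deposit(1); });
        t1.Start(); t2.Start();
        t1.Join(); t2.Join();
        return _balance;             // always 2000 thanks to mutual exclusion
    }

    public static void Main() => Console.WriteLine(Run());
}
```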


Thread pooling:

Thread pooling is the mechanism by which a thread is allocated from a list of available threads to execute user work items. The ThreadPool class in .NET lets the framework spawn and schedule threads for individual work items, with the pending work items essentially kept in a queue (FIFO). The scheduling of work items is handled by the runtime and the operating system, and there is no guarantee that a thread is scheduled immediately after a work item is added to the queue. A thread returns to the pool once it completes its current work item, so that it can be scheduled again later.
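A hedged sketch of queueing work items to the pool; the `CountdownEvent` is my own addition so the sample can wait for all items to finish, since the pool gives no completion guarantee on its own:

```csharp
using System;
using System.Threading;

public static class ThreadPoolDemo
{
    public static int Run()
    {
        int processed = 0;
        using var done = new CountdownEvent(5);

        for (int i = 0; i < 5; i++)
        {
            // Queue a work item; a pool thread picks it up when one becomes free.
            ThreadPool.QueueUserWorkItem(state =>
            {
                Interlocked.Increment(ref processed);
                done.Signal();
            });
        }

        done.Wait();       // block until all five queued items have run
        return processed;
    }

    public static void Main() => Console.WriteLine("items processed: " + Run());
}
```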


Writing the thread-safe code:

Thread-safe code is code that does not allow other threads to corrupt the data used by the current thread. Code is thread-safe only if it safely manipulates data in shared memory without hampering the execution of other threads. There are numerous ways to achieve thread safety in a multithreaded application. Let's discuss one straightforward approach: reentrant code. For other methodologies, please refer to the section ".NET framework implementation of the thread synchronization" above.

Reentrant code: Reentrant code receives from the main (parent) thread all the data it needs to run independently. The thread maintains its own copy of the data instead of using global data, which avoids depending on state that other threads can modify. Reentrant code can execute on multiple threads simultaneously while still producing consistent results, because its data is stored locally on each thread's stack rather than in global variables.
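A small sketch of the idea: the worker below touches only its parameters and locals, so two threads can run it concurrently with no synchronization and still get deterministic results (names and numbers are my own illustration):

```csharp
using System;
using System.Threading;

public static class ReentrantDemo
{
    // Reentrant: works only on its parameter and locals, never on shared state.
    static long SumUpTo(int n)
    {
        long sum = 0;                      // lives on this thread's stack
        for (int i = 1; i <= n; i++) sum += i;
        return sum;
    }

    public static long[] Run()
    {
        long[] results = new long[2];      // each thread writes a distinct slot
        var t1 = new Thread(() => results[0] = SumUpTo(100));
        var t2 = new Thread(() => results[1] = SumUpTo(1000));
        t1.Start(); t2.Start();
        t1.Join(); t2.Join();
        return results;   // deterministic regardless of scheduling: 5050 and 500500
    }

    public static void Main()
    {
        var r = Run();
        Console.WriteLine(r[0] + " " + r[1]);
    }
}
```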

Deadlock:

A system is said to be in deadlock if each thread is waiting for another to release a non-preemptible resource and there is a circular wait among these threads. It is critical to detect deadlock and prevent it from happening, because the system can go into an endless wait and never recover. A detailed discussion of deadlock is outside the scope of this blog post.

Multithreading on single-core and multi-core machines:

A multithreaded application does not yield much benefit on a single-core machine, because all the code runs on the one core, and there is overhead in managing the threads, handled by the CLR and the operating system, e.g. context switching and restoring state from the previous run. Thread execution on a single-core machine is based on time slices: each thread runs on the processor for a given time span before the next thread is scheduled. A multithreaded application performs better on a multi-core machine, which can schedule the threads on different cores and thereby achieve parallel processing in the true sense.


Usages of the multithreading concept:

Most software available today, especially games and enterprise products, makes heavy use of multitasking. Such software needs a high-quality hardware configuration, including multicore processors.

Future of the multithreading:

Nowadays multithreading has become a basic necessity of almost any software. Processors have evolved greatly as manufacturers put more cores on the die, which lets software writers use multithreading extensively in their products to increase responsiveness.