Monday, May 24, 2010

Parallel Computing and .Net

Since the launch of .NET 4.0, a term that has come into the limelight is parallel computing. Does parallel computing provide real benefits, or is it just another concept or feature? And is .NET really going to make use of it in applications? These are the questions I am going to discuss in this post. To know more about parallel computing and its application to the SQL Server database (code-named Madison), refer to my post on that topic.

Introduction ... What is Parallel Computing?

Parallel computing is a form of computation in which many calculations are carried out simultaneously, operating on the principle that large problems can often be divided into smaller ones, which are then solved concurrently ("in parallel"). There are several different forms of parallel computing: bit-level, instruction-level, data, and task parallelism. Parallelism has been employed for many years, mainly in high-performance computing, but interest in it has grown lately due to the physical constraints preventing frequency scaling. As power consumption (and consequently heat generation) by computers has become a concern in recent years, parallel computing has become the dominant paradigm in computer architecture, mainly in the form of multicore processors.


The benefit of parallel computing is overall speed of execution. As you may have noticed over the past few years, processors aren't getting any faster, but the number of CPU cores per system is increasing. Parallel programming is the means by which you can take advantage of this kind of upgrade, by splitting large jobs into smaller tasks that can be handled concurrently by separate cores.
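As a taste of what that looks like, here is a minimal sketch using the new Parallel.For overloads in .NET 4 (C#). Parallel.For and its localInit/localFinally overload are real .NET 4 APIs; the array of numbers and the square-root work are just stand-ins of mine for a real job:

using System;
using System.Threading.Tasks;

class ParallelForDemo
{
    static void Main()
    {
        const int n = 10000000;
        double[] data = new double[n];
        for (int i = 0; i < n; i++)
            data[i] = i;

        // Parallel.For splits the index range into chunks and runs the
        // chunks concurrently on the available cores. This overload gives
        // each worker a private running total (localInit/localFinally)
        // so the workers do not fight over one shared variable.
        object gate = new object();
        double total = 0;
        Parallel.For(0, n,
            () => 0.0,                                      // per-worker local total
            (i, loop, local) => local + Math.Sqrt(data[i]), // loop body
            local => { lock (gate) total += local; });      // merge once per worker

        Console.WriteLine("Sum of square roots: {0}", total);
    }
}

On a quad-core machine this typically runs several times faster than the equivalent sequential loop, though the exact gain depends on how evenly the work splits.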

Parallel Computing and .Net

Many personal computers and workstations have two or four cores (that is, CPUs) that enable multiple threads to be executed simultaneously. Computers in the near future are expected to have significantly more cores. To take advantage of the hardware of today and tomorrow, you can parallelize your code to distribute work across multiple processors. In the past, parallelization required low-level manipulation of threads and locks. Visual Studio 2010 and the .NET Framework 4 enhance support for parallel programming by providing a new runtime, new class library types, and new diagnostic tools. These features simplify parallel development so that you can write efficient, fine-grained, and scalable parallel code in a natural idiom without having to work directly with threads or the thread pool. The following illustration provides a high-level overview of the parallel programming architecture in the .NET Framework 4.



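To make the "natural idiom" claim above concrete, here is a small sketch (C#, .NET 4) of the two class-library entry points most developers meet first: the Task type and PLINQ. The string and the number range are placeholder workloads of my own, not from any official sample:

using System;
using System.Linq;
using System.Threading.Tasks;

class TaskAndPlinqDemo
{
    static void Main()
    {
        // Task is the new unit of work in .NET 4. The runtime schedules
        // tasks onto the thread pool, so you never touch threads directly.
        Task<int> lengthTask = Task.Factory.StartNew(() =>
        {
            // Imagine an expensive computation here.
            return "hello parallel world".Length;
        });

        // PLINQ: adding AsParallel() to an ordinary LINQ query lets the
        // runtime partition the source across the available cores.
        int[] evenSquares = Enumerable.Range(1, 1000)
            .AsParallel()
            .Where(x => x % 2 == 0)
            .Select(x => x * x)
            .ToArray();

        Console.WriteLine("Task result: {0}, PLINQ count: {1}",
            lengthTask.Result,   // Result blocks until the task finishes
            evenSquares.Length);
    }
}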
CPU performance growth as we have known it hit a wall two years ago. Most people have only recently started to notice. You can get similar graphs for other chips, but I'm going to use Intel data here. The following chart shows the history of Intel chip introductions by clock speed and number of transistors. The number of transistors continues to climb, at least for now. Clock speed, however, is a different story.


A good video explaining the benefits and use of parallel computing in .NET is below:

Another good video on the topic is below:

Views from Intel on parallel computing and .NET are as follows:


For more reading on the topic, refer to the MSDN article. For examples showing the use of parallel computing in .NET, refer to the MSDN examples article.

To know more about parallel computing and its application to the SQL Server database (code-named Madison), refer to my post on that topic.

Do keep me updated with your views and suggestions on the post.


Friday, May 14, 2010

Cloud Computing and .Net

With the release of .NET 4.0, cloud computing has become a buzzword in the IT space. In this post I am going to describe briefly what cloud computing is and where it stands relative to .NET and Microsoft technologies. For more about the types of clouds in cloud computing, read my other article. For an example of an implementation of cloud computing by Google, read my articles on Google Cloud Print and Google Cloud Connect. Also read my other article on Cloud Computing and Open Source.

Introduction

Cloud computing is a general term for anything that involves delivering hosted services over the Internet. One can also say that Cloud computing is Internet-based computing, whereby shared resources, software and information are provided to computers and other devices on-demand. These hosted services are broadly divided into three categories: Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS). The name cloud computing was inspired by the cloud symbol that's often used to represent the Internet in flowcharts and diagrams.

A cloud service has three distinct characteristics that differentiate it from traditional hosting. It is sold on demand, typically by the minute or the hour; it is elastic -- a user can have as much or as little of a service as they want at any given time; and the service is fully managed by the provider (the consumer needs nothing but a personal computer and Internet access).

It is a paradigm shift following the shift from mainframe to client–server that preceded it in the early 1980s. Details are abstracted from the users who no longer have need of, expertise in, or control over the technology infrastructure "in the cloud" that supports them. Cloud computing describes a new supplement, consumption and delivery model for IT services based on the Internet, and it typically involves the provision of dynamically scalable and often virtualized resources as a service over the Internet. It is a byproduct and consequence of the ease-of-access to remote computing sites provided by the Internet. A good video explaining the cloud computing basics is as follows:


Microsoft and Cloud Computing

From the perspective of Microsoft technologies, cloud computing is the technology that is going to be the backbone of most applications that run on the internet. Microsoft and other competitors, such as Yahoo, Amazon, Google, and IBM, have been building cloud-computing infrastructure and new software at a rapid pace to service the large number of potential users. Microsoft’s business now depends on an ever-expanding network of massive data centers: hundreds of thousands of servers, petabytes of data, hundreds of megawatts of power, and billions of dollars in capital and operational expenses. Because these data centers are being built with hardware and software technologies not designed for deployment at such massive scale, many of today’s data centers are expensive to build, costly to operate, and unable to provide all the services needed for emerging applications—resilience, geo-distribution, composability, and graceful recovery.

A good video explaining cloud computing from the perspective of .NET is below:

Two broad factors are driving cloud computing development at Microsoft. The first is the shift by Microsoft and the software industry to delivering services along with their software. The term “services” encompasses a broad array of Internet delivery options that extend far beyond browser access to remote Web sites. At one end are Web 1.0 applications—Hotmail®, Messenger, search, and online commerce sites—and Web 2.0 applications—social networking, for example. An emerging suite of more sophisticated applications, such as business intelligence and rich games, is improved fundamentally when local clients are connected to services. Such connections enable entirely new features such as a new generation of immersive, interactive games; augmented-reality tools; and real-time data analysis and fusion. To provide services, a company must have a large number of computers housed in one or more data centers.

The second factor driving this research is the way cloud services and their support infrastructures are constructed. Today, they are assembled from vast numbers of PCs, packaged slightly differently, connected by the same networks used to deliver Internet services. Building data centers using standard, off-the-shelf technology was a great choice in the beginning. It let the Internet boom race ahead without the need to develop new types of computers and software systems. But the resulting data centers and software were not designed as integrated systems and are less efficient than they should be. One common analogy is that if one built utility power plants as we build data centers, we would start by going to Home Depot and buying millions of gasoline-powered generators.


Many researchers have seen an opportunity to make major improvements in the way data centers and cloud services are built, but this type of research and technology transfer is difficult because the efforts often cross many research disciplines. Effective research requires changes to both hardware and software, and the resulting prototypes must be constructed and tested at a scale difficult for small teams. For this reason, Microsoft is taking an integrated approach, drawing insights and lessons from Microsoft’s production services and data-center operations, and partnering with researchers and product teams worldwide.



A good video explaining more about Azure is below:

To pursue more research in this area, Microsoft has created a research organization called Cloud Computing Futures (CCF).

The commodity components and handcrafted software currently used to build cloud services introduce costly inefficiency into Microsoft’s business. Designs based on comprehensive optimization of all attributes offer an opportunity to create novel solutions that produce fundamental improvements in efficiency:

  • Creating new hardware and software prototypes.
  • Advancing the holistic design philosophy.
  • Innovating with instrumentation and measurement, data acquisition, and analysis.
  • Engaging Microsoft product groups and outward-facing properties.


The CCF goal is to reduce data-center costs fourfold or more while accelerating deployment and increasing adaptability and resilience to failures, and to transfer these ideas into products and practice. To date, the team has focused its attention on four areas, though its agenda spans next-generation storage devices and memories, new processors and processor architectures, system packaging, and software tools:

Low-power services: The computers (“servers”) used to support cloud services are some of the fastest, most power-hungry computers built. The common wisdom has been to use the fastest computers because the workload is potentially huge and purchasing, installing, maintaining and operating computers is a complex task, so the fewer the machines, the better. But other computers, such as laptops, are far more energy-efficient, as measured in operations per joule, and can complete a unit of work with far less electricity and less cooling. These computers are not as fast as servers, though, and more of them are required to deliver the same service.






CCF has built two server clusters using low-power Intel Atom chips and is conducting a series of experiments to see how well they support cloud services and how much their use can reduce the power consumed by those services. For example, power-efficient computers have low-power states, such as a laptop’s sleep and hibernate modes, that greatly reduce power consumption. The team has built an intelligent control system called Marlowe that examines the workload on a group of computers and decides how many of them should be asleep at any time to reduce power consumption while still meeting the service’s acceptable level of performance.
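The post does not describe Marlowe’s internals, so the following is only a toy C# illustration of the general idea stated above: keep just enough servers awake for the measured load plus some headroom, and sleep the rest. Every name and number in it is hypothetical:

using System;

class SleepControllerSketch
{
    // Toy stand-in for the idea behind a system like Marlowe: size the
    // awake pool to the measured load, sleep everything else.
    // All figures below are made up for illustration.
    static int ServersToKeepAwake(double requestsPerSecond,
                                  double requestsPerSecondPerServer,
                                  double headroom /* e.g. 0.25 = 25% spare */)
    {
        double needed = requestsPerSecond / requestsPerSecondPerServer;
        return (int)Math.Ceiling(needed * (1.0 + headroom));
    }

    static void Main()
    {
        int cluster = 100;                          // hypothetical cluster size
        int awake = ServersToKeepAwake(4000, 80, 0.25);
        Console.WriteLine("Keep {0} awake, sleep {1}",
            awake, Math.Max(0, cluster - awake));
        // Prints: Keep 63 awake, sleep 37
    }
}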

In addition, they have worked with the Hotmail® team to evaluate the utility of low-power servers for the Hotmail® service. These experiments—the Cooperative Expendable Micro-Slice Servers prototype—have shown that overall power consumption can be reduced compared with standard servers while still delivering the same quality of service.

Improved networks: The networks that connect the computers in data centers use the same hardware and software as the rest of the Internet. It is great technology, but many of the design decisions that make it possible to transmit traffic across the globe to a vast, rapidly changing collection of computers are inappropriate for a cloud-service computing infrastructure consisting of a large, but fixed, collection of computers in a single room. Data-center networks are costly and impose many constraints on communications among data-center services, making writing cloud-service software far more difficult.

CCF has been working with researchers from Microsoft Research on several approaches to data-center networking. The most mature of these is Monsoon, which uses much of the existing networking hardware but replaces the software with a new set of communications protocols far better suited for a data center. This work will not only lead to more efficient networks, but by relaxing the constraints of existing networks, it also will open new possibilities to simplify data-center software and to build more robust platforms.

Orleans software platform: The software that runs in the data center is a complicated, distributed system. It must handle a vast number of requests from across the globe, and the computers on which the software runs fail regularly—but the service itself should not fail, even though the software is continually changing as the service evolves and new features are added. Orleans is a new software platform that runs on Microsoft’s Windows® Azure™ system and provides the abstractions, programming languages, and tools that make it easier to build cloud services.

Future cloud applications: To test the CCF hardware prototypes and the Orleans software platform, we are exploring future application scenarios that go beyond our current cloud workloads. These scenarios integrate many ideas from across Microsoft in areas such as computer vision, virtual reality, and natural-language processing.

The positioning of Microsoft products with respect to cloud computing can be summed up in the following image:


Following is a slide that explains what to keep in mind while converting an existing ASP.NET application to Windows Azure in order to use cloud computing. It highlights quite a few good points.





Benefits of Cloud Computing

There are some clear business benefits to building applications using cloud computing. A few of these are listed here:

Almost zero upfront infrastructure investment: If you have to build a large-scale system it may cost a fortune to invest in real estate, hardware (racks, machines, routers, backup power supplies), hardware management (power management, cooling), and operations personnel. Because of the upfront costs, it would typically need several rounds of management approvals before the project could even get started. Now, with utility-style computing, there is no fixed cost or startup cost.

Just-in-time Infrastructure: In the past, if you got famous and your systems or your infrastructure did not scale, you became a victim of your own success. Conversely, if you invested heavily and did not get famous, you became a victim of your failure. By deploying applications in the cloud with dynamic capacity management, software architects do not have to worry about pre-procuring capacity for large-scale systems. The solutions are low risk because you scale only as you grow. Cloud Architectures let you relinquish infrastructure as quickly as you acquired it (in minutes).

More efficient resource utilization: System administrators usually worry about procuring hardware (when they run out of capacity) and about better infrastructure utilization (when they have excess and idle capacity). With Cloud Architectures they can manage resources more effectively and efficiently by having applications request and relinquish only the resources they need, on demand.

Usage-based costing: Utility-style pricing allows billing the customer only for the infrastructure that has been used. The customer is not liable for the entire infrastructure that may be in place. This is a subtle difference between desktop applications and web applications. A desktop application or a traditional client-server application runs on the customer’s own infrastructure (PC or server), whereas in a Cloud Architectures application, the customer uses third-party infrastructure and gets billed only for the fraction of it that was used.

Potential for shrinking the processing time: Parallelization is one of the great ways to speed up processing. If a compute-intensive or data-intensive job that can be run in parallel takes 500 hours to process on one machine, with Cloud Architectures it is possible to spawn 500 instances and process the same job in 1 hour. Having an elastic infrastructure available gives the application the ability to exploit parallelization cost-effectively, reducing the total processing time.
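As a back-of-the-envelope check (my own note, not part of the original claim): with N instances the ideal wall-clock time is T/N, so 500 hours ÷ 500 instances = 1 hour, and under usage-based costing the machine-hours billed (500 of them) stay roughly the same either way. Real jobs fall short of this ideal to the extent that some portion of the work cannot be parallelized (Amdahl's law) or coordination overhead creeps in.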

Read my other article on parallel computing and .NET for more.

Status as of 2010

As of 2010, the state of the cloud market and vendor strategy is well described in the following illustration (for more details, refer to this article).



A brief overview of cloud vendors and their current status follows. Note that it is not exhaustive; it is only meant to give a handy overview of the cloud market.


If you want to try out cloud computing as a demo, many vendors provide a free cloud computing service. Here is a link to one such vendor, RightScale. Another is CloudSigma.

Further Readings

For more about the types of clouds in cloud computing, read my other article.
 
A complete list of cloud platform providers is maintained here; refer to it for the full list.

Also, as nothing comes for free :) one would like to know how much Windows Azure will cost us. For a complete, detailed price list of the various Windows Azure services, refer to the pricing page.

Further, many friends have asked if Windows Azure can support Java applications too. The answer is YES, which is good news for Java developers. The following image makes it clearer:



Windows Azure supports Java applications too; for more, refer to this MSDN starter kit. Also see an open-source project named windowsazure4j, which aims to provide a Java software development kit for Windows Azure and Windows Azure Storage.


For an example of an implementation of cloud computing by Google, read my articles on Google Cloud Print and Google Cloud Connect. For Amazon's implementation, read my Amazon Cloud Drive and Player article. Also read my other article on Cloud Computing and Open Source.

Keep me updated with your views and thoughts on the topic of cloud computing and .NET.

Future looks awesome!

These days we often come across things that are upcoming in the market and are the latest in technology. This post is compiled from various new things that I came across while surfing the net. Just relax and admire...



Enjoy the new technology facilities introduced into the market.
New Building Designs in Korea
 


A Computer with 3 Screens


Apple Curved Screen

Table Computer


A Transparent Lighter-Shaped Mobile Phone



The Weirdest Computers ever from Samsung

A mobile Phone/Computer with Expandable Screen


A Compass Touch Screen Telephone



A New Mouse Design

Sony’s Bendable Screens


A side Lamp Computer



Beautiful Faucet Design

A New Bathtub

Clothes that save energy during the day and give light during the night


Multipurpose Remote Control

Future Kitchen

Intelligent Furniture

Electronic Paper for easy correction


 A Bicycle for the whole family

Foldable Office


A new camera from Samsung

A new MP3 player

A mobile Cooking Station

New Fish Tank Designs

Foldable TV Screens


Do keep me updated with your views!
