Monday, December 26, 2011

What does a Google Tablet mean?


“In the next 6 months, we plan to market a tablet of the highest quality” – Eric Schmidt

The above move from Google had to come sooner or later. Most of us will agree that 2011 was the year when Google pushed really hard to bring out a competitor to the hugely successful iPad. Has Google been successful? Let’s look at some numbers to find out.

According to a recent report published by IDC (International Data Corporation), Apple’s iPad leads the tablet market with a share of about 61.5%, while Google’s Android tablets took second place with 32.4%, Samsung holding the highest share among the Android manufacturers at 5.6%. So, a group of big names like Samsung, Motorola, HTC, Asus etc. who backed Android couldn’t quite match up to Apple’s iPad, which continues to dominate the tablet market even today. From the start of 2011, when everybody witnessed the much-awaited XOOM tablets from Motorola, to the end of 2011, when everybody saw the latest Galaxy Tabs from Samsung, Google was never able to produce a worthy competitor. Why?

The answer mainly lies in the ownership of hardware and software. As I said in my previous posts, Apple is always in control of both of the components that make up its tablet: hardware and software. So, it knows precisely how to produce a tablet whose software takes maximum advantage of the underlying hardware. This is what was lacking in Google’s tablet strategy. Google has always produced the software (without much insight into how cohesively it would work with the hardware), which was then modified extensively by the Android tablet manufacturers who put it onto their devices. Google never really worked closely with any of the manufacturers as far as tablets are concerned, when it ideally should have, precisely because it is not responsible for the hardware side of those devices. The reason people can see a great phone like Google’s Galaxy Nexus today is that Google has left no stone unturned in its desire to produce a phone that takes the user experience to the next level. It has coordinated really well with partner companies like Texas Instruments, Samsung etc., and the result is now there for everybody to see.

So, does it now make sense for Google to produce its own tablet? Absolutely. Many experts have long believed that the stock ROM offered by Google is better than modified ROMs like TouchWiz and Sense in terms of general zippiness. There is some truth in these opinions, as evidenced by the benchmark scores the devices get and the kind of lag that some people experience on tablets produced by the manufacturers. The latest evidence comes in the form of stock Android 4.0 (Ice Cream Sandwich) running on the Galaxy Nexus. Thankfully, Google has realized this and is now wasting no time in bringing to market what Eric Schmidt called "the tablet of the HIGHEST quality". Go Google..!!

Stay tuned..!!

Tuesday, December 13, 2011

LCD TVs and LED TVs: Are they really different? (Contd.) - 1


Yesterday, we discussed the truth behind LED TVs. Today, we’ll go further into the various categories of ‘LED backlit LCDs’ available in the market.

Currently, ‘LED backlit LCDs’ are available in the market in 2 varieties: ‘Edge-Lit’ and ‘Full-Array’. In an ‘Edge-Lit LED backlit LCD’, LEDs (Light Emitting Diodes) are present in the entire perimeter (periphery) of the television. The backlighting of the screen is achieved with the help of what are called ‘Light Guides’. These ‘Light Guides’ direct the glow towards the center of the screen.

The following are the advantages of these kinds of televisions:
  1. They are very thin (as much as 40% thinner) when compared to ‘CCFL backlit LCDs’
  2. They consume much less power (as compared to the ‘CCFL backlit LCDs’)
  3. They are also very much lighter in weight (In fact, most of them could be wall mounted)
  4. They can produce a bright image with very nice colors and deep blacks

The following are the disadvantages:
  1. The entire screen may not be lit uniformly (The edge of the screen may be brighter than the middle of the screen)
  2. True blacks may not be achieved consistently across the entire screen

In fact, both varieties, ‘Edge-Lit’ and ‘Full-Array’ LED backlit LCDs, have the capacity to produce deep blacks, as the LEDs can simply be turned off when no color is being reproduced on the screen.

Now, for the ‘Full-Array LED backlit LCDs’: In these kinds of televisions, several rows of LEDs are placed behind the entire surface of the screen.

The following are the advantages of these kinds of televisions:
  1.  They are thinner (as compared to the ‘CCFL backlit LCD’ variety)
  2. They consume much less power (as compared to the ‘CCFL backlit LCD’ variety)
  3. They are lighter in weight (Again, as compared to the ‘CCFL backlit LCD’ variety)
  4. They can produce ‘True Deep Blacks’ (as compared to both ‘CCFL backlit LCD’ and ‘Edge-Lit LED backlit LCD’)
  5.  The brightness and colors are better (as compared to both ‘CCFL backlit LCD’ and ‘Edge-Lit LED backlit LCD’)

The following are the disadvantages of these kinds of televisions:
  1. ‘Blooming Effect’ (described below) affects the picture quality a little bit
  2. Slightly thicker and heavier (as compared to the ‘Edge-Lit LED backlit LCD’)

The reason a ‘Full-Array LED backlit LCD’ can achieve truer blacks is that whenever blacks have to be reproduced on a significant portion of the display, an entire section of LEDs can be turned off without affecting the display properties of the other LEDs. This property is widely known by the name ‘Local Dimming’. So, local dimming leads to better blacks but also leads to an effect called ‘Blooming’. If a bright color (LEDs turned on) is being displayed inside a black section (LEDs turned off - Local Dimming), the bright color creeps slightly into the adjacent black portion (like a halo). This effect is called ‘Blooming’. But hey, it is only ever so slightly noticeable.
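If you like thinking in code, here is a toy sketch of the local dimming idea just described. It is my own simplification for illustration (the zone layout and function name are made up), not any manufacturer’s actual algorithm: each backlight zone is driven by the brightest pixel it has to show, so zones that are entirely black get their LEDs switched off.

def zone_backlight(frame, zone_rows, zone_cols):
    """frame: 2D list of pixel brightness values (0.0 = black, 1.0 = full white).
    Returns a zone_rows x zone_cols grid of backlight levels, one per LED zone."""
    height, width = len(frame), len(frame[0])
    zh, zw = height // zone_rows, width // zone_cols
    zones = []
    for zr in range(zone_rows):
        row = []
        for zc in range(zone_cols):
            # Drive each zone's LEDs by the brightest pixel inside it; a zone
            # that is entirely black gets 0, i.e. its LEDs are turned off.
            brightest = max(frame[y][x]
                            for y in range(zr * zh, (zr + 1) * zh)
                            for x in range(zc * zw, (zc + 1) * zw))
            row.append(brightest)
        zones.append(row)
    return zones

# A mostly black frame with one bright spot: only the zone containing the spot
# stays lit (and can 'bloom' into the neighbouring dark areas); the rest go dark.
frame = [[0.0] * 8 for _ in range(8)]
frame[1][6] = 1.0
print(zone_backlight(frame, 2, 2))   # [[0.0, 1.0], [0.0, 0.0]]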

So, there are 4 kinds of ‘LED backlit LCDs’:
  1. ‘Edge-Lit LED backlit LCD’ without ‘Local Dimming’ 
  2. ‘Edge-Lit LED backlit LCD’ with ‘Local Dimming’
  3.  ‘Full-Array LED backlit LCD’ without ‘Local Dimming’
  4.  ‘Full-Array LED backlit LCD’ with ‘Local Dimming’ (Usually, the best of the lot)

I’ll reiterate: there is no ‘True LED TV’ available commercially in the market right now. A 'True LED TV' can be defined as follows:

Each and every individual pixel should be capable of being independently brightened or turned off. For this to be achieved, an LED has to be present for each and every pixel. Currently in the market, a ‘Full-Array LED backlit LCD’ with ‘Full HD’ (a resolution of ‘1920 * 1080’) has only around 2000 LEDs at most. To qualify as a ‘True LED TV’, it would need 2073600 LEDs, one per pixel.
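To put that gap in perspective, here is the simple arithmetic (the 2000-LED figure is the rough upper bound quoted above):

full_hd_pixels = 1920 * 1080
print(full_hd_pixels)            # 2073600 LEDs needed for a 'True LED TV'
print(full_hd_pixels // 2000)    # roughly 1036 pixels sharing each LED in today's full-array sets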

So, go get your TV right now.

Stay Tuned..!!

Monday, December 12, 2011

LCD TVs and LED TVs: Are they really different?


Nowadays, you can see a lot of people spending hours on end shopping for the perfect television to occupy the center-stage of their respective living rooms. With the ubiquitous presence of the ever-confusing marketers of the television-manufacturing companies, the decision is becoming all the more difficult. So, let us understand the technologies before you splurge your hard-earned money on them.

Earlier, you used to find televisions with only one kind of technology. They were called ‘CRT TVs’ (Cathode Ray Tube). With the help of electron guns, coils and a fluorescent screen, the CRT TV was able to produce a viewable image. Now, there were problems with this technology:
  1. A number of components were used to manufacture the television which made the processes of production, repair and maintenance difficult
  2. The individual components were also bulky which made the television quite heavy  
  3. Some of these televisions had a problem of ‘Image burn-in’ (Permanent discoloration of areas on electronic display) and a problem of ‘Image Loss’ at the boundaries of the display
  4. These televisions produced noticeable flicker at low refresh rates
  5. They consumed high power and generated a lot of heat

To overcome the drawbacks of this technology, manufacturers started producing LCD (Liquid Crystal Display) televisions. An LCD is a flat-panel display that makes use of the light-modulating properties of liquid crystals. Now, these liquid crystals do not emit light by themselves, so a backlighting source is usually required. To understand this, you need to take note that there are two kinds of LCDs: ‘Passive Matrix LCDs’ and ‘Active Matrix LCDs’. In ‘Passive Matrix LCDs’ like ‘Alarm Clocks’ and ‘Calculators’, where the power consumption is low, a backlighting source is usually not required. Contrast this with ‘Active Matrix LCDs’ (like televisions), where some sort of backlighting mechanism is required. Now, this backlighting has traditionally been achieved with a ‘Cold Cathode Fluorescent Lamp’ (CCFL) in an LCD television. So, the ‘CCFL backlit LCD’ television overcomes the problems of the CRT in the following fashion:
  1. They don’t use phosphor. So, the problem of ‘Image burn-in’ is eliminated
  2. They can be configured to run at high refresh rates. So, the problem of flicker is eliminated
  3. When compared to the CRT television, the components used are lighter in weight, so the heaviness of the television is reduced
  4. They consume less power and generate less heat while lighting the entire screen uniformly
  5. There is no ‘Image-Loss’ at the boundaries and the entire screen is viewable

Subsequently, manufacturers found that they could improve upon these televisions too by changing the backlighting mechanism. They found that they could bring about the following changes:
  1. Reduce the power consumption even further
  2. Reduce the thickness of the display even further
  3. Increase the image quality even further
  4. Increase the brightness of the display even further
  5. Reduce the weight of the display even further

The manufacturers used LEDs (Light Emitting Diodes) as a backlighting source instead of the conventional CCFL sources and were able to achieve all the above-stated objectives. So, this gave birth to the ‘LED backlit LCD’ televisions. Currently, if you go shopping for an LCD television, you won’t find a true LED TV. The only things you will find are ‘LED backlit LCDs’. So, don’t expect to see a radically different picture altogether as compared to a ‘CCFL backlit LCD’, although the images produced by the ‘LED backlit LCD’ televisions are in fact much better. Even these televisions can be further classified into 4 different categories. I'll explain these categories in the next post.

Stay Tuned..!!

Tuesday, December 6, 2011

AH-IPS, LTPS and ASV: Decoding the Smartphone Display


Well... if you have read my first two posts on this blog, you will by now be familiar with most of the display technologies being used today. Here are 3 more:

AH-IPS
This display technology was developed by LG. It is an abbreviation for 'Advanced High performance In-Plane Switching' LCD. It is being touted as the real competitor to Samsung's 'Super AMOLED' display. In a competition held by 'Intertek', LG's AH-IPS display actually defeated Samsung's Super AMOLED on 2 parameters: color accuracy and power efficiency. You can find this display in the recently released LG Nitro HD.

LTPS
This is an abbreviation for 'Low Temperature Poly Silicon' LCD. One of the important characteristics of such displays is that the drive circuits are integrated directly onto the glass surface, which reduces the number of component parts on the outside substrate. This enhances the display's durability. The reduced size of the TFT section leads to a crisper display and also to lower power consumption. You can find this display in the Lenovo LePad S2005.

ASV
This is an abbreviation for 'Advanced Super View' display. The important characteristic of this display is that it offers excellent viewing angles and competes with the In-Plane Switching (IPS) LCDs. You can find this display in the Meizu MX.

Tuesday, November 29, 2011

Benchmark Tests for Android Devices: Browsermark, Neocore, AnTuTu and Vellamo


Let's get to know the remainder of the benchmark tests that can gauge the performance of your smartphones/tablets.


Browsermark
Rightware, recently spun off from Futuremark, has introduced the 'Browsermark' benchmark test in order to compare the browsers of various smartphone devices. The test measures a browser's performance in JavaScript and HTML rendering, and the result is reported as a single numeric score. If you want a reference point for your device, among the smartphones, the 'Samsung Galaxy Nexus' currently has the highest reported Browsermark score of 98272.


GL 2.1
The 'GL Benchmark' is a 3D benchmarking program designed to test how well your phone can reproduce 3D scenes and images. So, this benchmark test is actually a test of the strength of the GPU of your smartphone/tablet device. Currently, the PowerVR SGX 543 MP2 is the leader in the GPU department and its strength is reflected in the GL Benchmark tests. Please note that the GL Benchmark test is a combination of several other benchmark tests whose results are reported in 'FPS' (frames per second) and 'ms' (milliseconds).

Neocore
Neocore is another GPU benchmarking test, which benchmarks OpenGL ES 1.1 graphics performance. The results of this benchmark test are reported in FPS. You can download it from here: Neocore for Android

AnTuTu
AnTuTu runs a full suite of tests covering 'Memory Performance', 'CPU Integer Performance', 'CPU Floating Point Performance', '2D/3D Graphics Performance', 'SD card read/write speed' and 'Database IO' performance. A total score is reported once you run this benchmark. If you want a reference point for your device, the "Asus Eee Pad Transformer Prime", with a score of 12872, has the highest score to date.

Vellamo
Vellamo, a benchmark test originally developed by Qualcomm, is a mobile web-browser benchmark that provides a holistic view into browser performance by measuring each component systematically, providing results for CPU and memory, scrolling, JavaScript, HTML 5, canvas rendering speed and network access. So, it is similar to the 'Rightware Browsermark' test.

So, those are some of the benchmark tests that you need to be aware of. Click on the links below to download the benchmarking applications or visit the sites for testing your smartphone and/or tablet.

-> Linpack
-> JavaScript (Not an Application)
-> Browsermark (Not an Application)
-> GL 2.1
-> Neocore
-> AnTuTu
-> Vellamo

Stay Tuned..!!

Monday, November 28, 2011

Benchmark Tests for Android Devices: Quadrant, Linpack, Nenamark and Sunspider Javascript


A few days ago, we got to know the basics of benchmark tests. Let's see what they are in detail:

Quadrant

'Quadrant' is a product of 'Aurora Softworks'. It is essentially a series of tests performed on a mobile device which benchmark the CPU, memory, I/O and 2D/3D graphics. The 'Standard Quadrant' is free for Android users, who can download the application from the 'Android Market' and run it on their devices. The benchmark provides an overall score which can be compared with the benchmark scores from other devices. If you overclock your CPU, this will obviously be reflected in a better benchmark score. If you want a reference point for your device, the 'Samsung Galaxy Note' has one of the highest (probably the highest) Quadrant scores: 3624 (without overclocking).

Linpack

The Linpack benchmark is a measure of a system's floating point computing power. Introduced by Jack Dongarra, it measures how fast a device can solve a dense N by N system of linear equations. This benchmark was originally designed to run on supercomputers in the 1970s. So, you can imagine how advanced the device in your hand is today. You can download this application from the 'Android Market' and check the strength of the CPU in your device. The results are designated in MFLOPS (Millions of Floating Point Operations per Second).
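To get a feel for what Linpack actually does, here is a rough, illustrative sketch in Python (not the official benchmark, and the operation count is only the classic Linpack approximation): it times the solution of a dense N x N linear system and converts the time into MFLOPS.

import time
import numpy as np

def linpack_like(n=1000):
    # Build a random dense N x N system and time how long solving it takes.
    a = np.random.rand(n, n)
    b = np.random.rand(n)
    start = time.time()
    np.linalg.solve(a, b)                      # LU factorisation + triangular solves
    elapsed = time.time() - start
    flops = (2.0 / 3.0) * n**3 + 2.0 * n**2    # classic Linpack operation count
    return flops / elapsed / 1e6               # millions of floating point ops per second

print(f"~{linpack_like():.0f} MFLOPS")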

Nenamark 1

The first iteration of the 'Nenamark' benchmark test was designed to test the strength of the GPU (The above two tests, Quadrant and Linpack, measure the strength of the CPU). Nenamark 1, designed to run around 10-15 fps (frames per second), uses programmable shaders for graphical effects such as reflections, dynamic shadows, parametric surfaces, particles and different light models to push the GPU to the limits. Results are designated in FPS.

Nenamark 2

Nenamark 1 was released a year ago and since then, smartphones have become a lot more capable, with frame rates crossing 60 fps. So, Nenamark 1 was no longer deemed very effective for testing such devices, and hence Nenamark 2 was conceived. Results are designated in FPS.

JavaScript

'SunSpider JavaScript' is a benchmark that aims to measure JavaScript performance on tasks relevant to the current and near-future use of JavaScript in the real world, such as encryption and text manipulation. In other words, the test simulates real-world usage of JavaScript on websites. The results are reported in milliseconds (ms). If you want a reference point for your device, the 'Samsung Galaxy Nexus' has one of the best scores: 1879 ms.
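Just to illustrate the "time taken, lower is better" style of scoring (in Python rather than JavaScript, and nothing like the real SunSpider suite), here is a tiny text-manipulation micro-test that reports its result in milliseconds:

import time

def timed_ms(task):
    start = time.perf_counter()
    task()
    return (time.perf_counter() - start) * 1000.0

def text_task(n=20000):
    # Build, join, upper-case and re-split a pile of strings.
    words = [f"word{i}" for i in range(n)]
    return " ".join(words).upper().split()

print(f"text manipulation: {timed_ms(text_task):.1f} ms")   # lower is better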

Just remember that the higher you score on 'Quadrant', 'Linpack', 'Nenamark 1' and 'Nenamark 2', the better. The lower you score on 'JavaScript' (it is measured in milliseconds), the better.

We'll talk about the rest of the benchmark tests in a future post. Stay Tuned..!!

Sunday, November 27, 2011

Comparing Ultrabooks: Asus Zenbook, Toshiba Portege, Lenovo Ideapad and Acer Aspire


So, let's discuss the other 'Ultrabooks': The 'Toshiba Portege Z830 / Z835' and 'Lenovo Ideapad U300S'.

Parameter | Toshiba Portege Z830 / Z835 | Lenovo Ideapad U300S
Thickness | 0.31 - 0.6 inches | 0.6 inches
Weight | 1.12 Kg | 1.315 Kg
Battery Life | 6 - 7 Hrs | 6 - 7 Hrs
Storage | 128 GB SSD | 256 GB SSD
Processor | Core i7 / i5 / i3 | Core i7 / i5
Pricing | Starting From 929$ | Starting From 1095$
Display Panel | 13.3 inches | 13.3 inches
Resolution | 1366 * 768 | 1366 * 768
Sound | Waves MaxxAudio 3 | SRS Premium Surround Sound
Connectivity | 1 USB 3.0, 1 USB 2.0, HDMI, VGA | 1 USB 3.0, 1 USB 2.0, HDMI


The battery-life times quoted here are based on real-world testing done by many enthusiastic tech specialists and not the manufacturer-quoted battery-life times (which are obviously higher). So, here are some of the categories in which the ultrabooks slugged it out:

Best Audio: Asus Zenbook
Bang & Olufsen ICEpower is a Danish research and development company specializing in audio applications. They have worked closely with Asus and have integrated fantastic sound performance inside the Zenbook. Previously, the audio performance in one of Asus's other laptops, the NX90JQ, was also highly appreciated. So, it is a no-brainer that the best performance in the audio category had to go to the Asus Zenbook. The audio performance in the other ultrabooks is certainly good for everyday listening, but as I said earlier, the standout performer is the Asus Zenbook.

Best Connectivity: Toshiba Portege Z830
The Portege Z830 just manages to steal the crown in this category from the Asus Zenbook. Both of them come with a similar set of connectivity options. But the Asus has mini HDMI and mini VGA connectors (adapters are supplied in the box) along with a single USB 3.0 and a single USB 2.0 connector, whereas the Toshiba Portege Z830 has a single USB 3.0, two USB 2.0, an HDMI and a VGA connector. So, with an extra USB 2.0 connector and regular-sized HDMI and VGA connectors, which do not require any adapter to be plugged in, the Toshiba Portege Z830 out-does the rest of the competition.

Best Price: Acer Aspire S3
The 'Acer Aspire S3' is priced the lowest among the ultrabooks. Although you get fewer features when compared to the other ultrabooks, it certainly satisfies all the mandatory eligibility criteria for being an ultrabook. That is enough for the Acer to walk away with the prize in this category.

Best Battery-Life: Toshiba Portege Z830
In real-world testing, the 'Portege Z830' has managed to beat the other ultrabooks in battery life, although not by a huge margin. It has sacrificed some performance, which is lower than that of the Ideapad U300S or the Asus Zenbook, to achieve this feat.

Best Display: Asus Zenbook
While all the displays contain the latest technology (being LED back-lit LCDs), the 'Asus Zenbook' offers a better resolution. In fact, it offers 1600 * 900 along with 1366 * 768 resolution. The 1366 * 768 resolution is offered in the 11.6 inch zenbook while the 1600 * 900 resolution is offered in the 13.3 inch zenbook. So, the best-display crown goes to the Zenbook.

Best Touchpad & Keyboard: Lenovo Ideapad U300S
Step aside from the specifications and you have to consider this factor, as there are people who type lengthy reports and spend a considerable amount of time resting their palms, typing away to glory. The device has to make sure that the user doesn't feel uncomfortable over long stretches and that the touchpad is neither under-responsive nor over-responsive, and this is where the Ideapad U300S scores.

Best Performance: Asus Zenbook
Yes. The Asus Zenbook offers the best performance of all the ultrabooks, with the Ideapad U300S coming in at the second position. The reason for this is that the Zenbook contains a high-performing SATA III SSD that not only makes the laptop boot up in a very short time but also allows it to transfer data in and out of the device at lightning speeds.

So, there you have it. All 4 ultrabooks currently in the market, compared. If you still think that you don't have enough options, wait for the 'Consumer Electronics Show' in January 2012. A barrage of new ultrabooks is expected to be showcased there, and they will soon make their way into the market.

Thursday, November 24, 2011

Benchmark Tests for Android Devices: Quadrant, Linpack, Nenamark

Of late, you would have been hearing about benchmark tests like ‘Quadrant’, ‘Nenamark’, ‘Linpack’, ‘Browsermark’ etc. being performed on smartphones and tablets, and avid tech enthusiasts eagerly comparing how all these latest devices stack up against each other. In this article, we’ll explore what these benchmark tests are, why they have become popular, and which benchmark tests matter.

  1. What are benchmark tests?
    • In computing, a benchmark is the act of running a set of computer programs in order to assess the relative performance of a system. So, benchmarking on smartphones and tablets is usually associated with assessing the performance characteristics of the devices' hardware. But that doesn't mean there aren't any software benchmark tests. So, please understand that these tests are performed only to compare the relative performance of the devices and in no way can they be used to conclude how smooth or how fast the actual user experience will be.
  2. Why have these benchmark tests become popular for smartphones and tablets?
    • Each and every smartphone and tablet that is being released into the market today has almost the same set of components: a CPU (Central Processing Unit / Microprocessor / Core), a GPU (Graphics Processing Unit), an instruction set, RAM (Random Access Memory), a display with a particular resolution, internal flash memory etc. Almost all the benchmark tests can run successfully on these devices and gauge the performance of these built-in components. The relevance of some of these tests gains significant weight when we factor in the ‘Operating System’ that runs on the devices.
  3. What are the various benchmark tests that are relevant to the smartphones and tablets?
    • The following tests are relevant:
      •  Quadrant
      • Linpack
      • Nenamark 1
      • Nenamark 2
      •  Javascript
      • Browsermark
      • GL 2.1
      • Neocore
      • AnTuTu
      • Vellamo
We’ll explore in detail what these tests are and how each of them conveys more information about the inner characteristics of the device in future posts.

Monday, November 21, 2011

Comparing Ultrabooks: Acer Aspire S3 and Asus Zenbook

It is an open secret that the success of the 'Macbook Air' prompted Intel to explore this category further and to convince its business partners to do the same. Acer and Asus took the opportunity of being the first entrants to this market. Acer came out with its 'Aspire S3' model and Asus has made its launch with the 'Zenbook'. This is how they compare with each other as far as technical specifications are concerned:

Parameter | Acer Aspire S3 | Asus Zenbook
Thickness | 0.51 - 0.6 inches | 0.1 - 0.7 inches
Weight | 1.406 Kg | 1.4 Kg
Battery Life | 4 - 5 Hrs | 5 - 6 Hrs
Storage | 20 GB SSD + 320 GB HDD | 128 / 256 GB SSD
Processor | Core i5 | Core i7 / i5 / i3
Pricing | Starting From 899$ | Starting From 999$
Display Panel | 13.3 inches | 11.6 inches / 13.3 inches
Resolution | 1366 * 768 | 1366 * 768 / 1600 * 900
Sound | Dolby Home Theatre | Bang & Olufsen ICEpower
Connectivity | Two USB 2.0, HDMI, VGA | 1 USB 3.0, 1 USB 2.0, mini HDMI, mini VGA

So, as you can see, 'Ultrabooks' have some very good specifications and also offer mostly comparable performance. The specifications list is just to show you the kind of power contained in something so slender and light. There are 2 other ultrabooks currently available in the market: the Toshiba Portege Z835 and the Lenovo Ideapad U300S. We'll discuss the same parameters for these ultrabooks and then study their reviews to find out which one is the best among them.

Stay Tuned..!!

Sunday, November 20, 2011

Ultrabooks: Features & Specifications

Let's learn about a new category of devices making their way into the market. They are called 'Ultrabooks'. 'Ultrabooks' are primarily a product of the initiative being undertaken by Intel. Intel has put down the specifications that these new devices have to adhere to:

  1. Thickness
    • Less than 20 mm (0.8 inches)
  2. Weight
    • Less than 1.4 Kg
  3. Battery Life
    • Should be at least 5 hours
  4. Storage
    • Use flash-based SSDs (Solid State Drives) instead of HDDs (Hard Disk Drives).
  5. Processor
    • Use Intel's low-power (CULV) processors (explained below)
  6. Pricing
    • Around 1000$ US (50,000 INR)
The above parameters 1, 2, 3 and 6 are obvious and self-explanatory. Let's explore parameters 4 and 5.
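Before we do, here's a toy sketch of how those criteria fit together as a checklist. This is purely my own illustration, with hypothetical field names and a hypothetical laptop, not Intel's official validation process:

ULTRABOOK_CRITERIA = {
    "max_thickness_mm": 20,      # parameter 1
    "max_weight_kg": 1.4,        # parameter 2
    "min_battery_hours": 5,      # parameter 3
    "storage": "SSD",            # parameter 4 (flash-based storage)
}

def qualifies_as_ultrabook(laptop):
    # Pricing (parameter 6) is only a rough "around $1000" guideline, so it is
    # left out of the hard check here.
    return (laptop["thickness_mm"] <= ULTRABOOK_CRITERIA["max_thickness_mm"]
            and laptop["weight_kg"] <= ULTRABOOK_CRITERIA["max_weight_kg"]
            and laptop["battery_hours"] >= ULTRABOOK_CRITERIA["min_battery_hours"]
            and laptop["storage"] == ULTRABOOK_CRITERIA["storage"])

# Hypothetical numbers, just to exercise the check:
print(qualifies_as_ultrabook({"thickness_mm": 17, "weight_kg": 1.3,
                              "battery_hours": 6, "storage": "SSD"}))   # True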

Parameter - 4 (Storage)

For the past many years, all Personal Computers have come with Hard-Drives (Hard Disks). The technical definition of a Hard-Disk or HDD (Hard Disk Drive) is as follows:

"HDD is a Non-Volatile Random Access Data Storage Device. It features rotating rigid platters on a motor-driven spindle with a protective enclosure"

Basically, what the definition tries to tell you is that there are flat circular disks mounted on top of one another. These are called platters. They store the actual data. The spindle (in the middle of these disks) rotates the platters and the data is read and written with the help of read/write heads. So, when you want to access data, these heads help you. The important things to take away are that
  1. The HDDs have moving parts (Platters, Read/Write Heads)
  2. In HDDs, the speed of read/write access is limited by the speed of read/write heads and also the speed at which the platters are rotated
  3. The HDDs are noisy (due to the moving parts)
The Solid State Drives (SSDs) overcome the limitations and problems that are traditionally faced with the HDDs.
  1. SSDs do not have any moving parts. Hence they generate no noise at all.
  2. The read/write performance is dramatically enhanced. For example, 'Ultrabooks' having SSDs can boot up an operating system within seconds and transfer data between themselves and other devices at very fast speeds as compared to the HDDs (see the quick read-speed sketch after this list).
  3. They produce very little heat and consume very little power as compared to the HDDs.
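If you want to see the read-speed difference for yourself, here is a crude sequential-read timer. The file path is a placeholder, and a fair SSD-versus-HDD comparison needs care with operating-system caching, but it conveys the idea:

import time

def sequential_read_mb_per_s(path, block_size=1024 * 1024):
    # Read the file from start to finish in 1 MB chunks and time it.
    start = time.perf_counter()
    total_bytes = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(block_size)
            if not chunk:
                break
            total_bytes += len(chunk)
    elapsed = time.perf_counter() - start
    return (total_bytes / (1024 * 1024)) / elapsed

# print(sequential_read_mb_per_s("some_large_file.bin"))   # path is hypothetical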
Parameter - 5 (Processor)

As you might know, Intel's latest processors are the 2nd Generation (Sandy Bridge) Core i3, Core i5 and Core i7 processors. These processors are built for normal laptops, whose performance and power requirements are entirely different from those of the 'Ultrabooks'. 'Ultrabooks', being very thin and very light, needed a separate computing platform which can deliver better battery life while not overly compromising on performance. To meet such requirements, Intel has developed the CULV (Consumer Ultra Low Voltage) platform, which uses less power and gives comparable performance relative to the traditional processors.

Many manufacturers including Asus, Lenovo, Toshiba and Acer have dived into this category. Expect a slew of 'Ultrabooks' to be announced by other leading manufacturers at next year's 'Consumer Electronics Show (CES)' scheduled to take place in January. We'll discuss more about the 'Ultrabooks' already released into the market in future posts.

Stay Tuned..!! 

Saturday, November 19, 2011

Milking the Galaxy: Journey from Galaxy S to Galaxy S II


I bet that reading the title would have made your mind wander off to the ‘Milky Way’ galaxy in which our planet Earth is a teeny-tiny part. Well, let me bring you back to it. I am talking about the ‘Galaxy’ range of smartphones and how Samsung’s marketing and operation teams worked together in a cohesive fashion to generate mind boggling revenues for the company.

It all started when Samsung announced the ‘Galaxy S II’ smartphone at the ‘Mobile World Congress’ in Barcelona in February 2011. Samsung declared that the smartphone would be released in world markets starting from its homeland in April and yes, true to its word, the ‘Galaxy S II’ has been released in almost every major world market to date and it has been a resounding success everywhere.

Almost all of you know that the smartphone has redoubtable specifications and has gained the approval and praise of almost every tech enthusiast in the world. But if you view the whole ‘Galaxy S II’ exercise from the business point of view, you will see the real picture.

Understand that there are two technologies that are most prevalent in the market today: GSM and CDMA. While the GSM versions of the phones work on different carriers by just changing the SIM card, the CDMA versions are locked down to a specific carrier. Samsung originally released the ‘Galaxy S II’ bearing the model number GT-I9100 with the ‘Android’ operating system, a GSM version. On May 9, 2011 they announced that they had received 3 million pre-orders for the smartphone.

Samsung was very quick to foresee that it wouldn’t be able to meet the huge cumulative demand from all the markets. There is a very good reason for it. As you would have already gathered from my previous posts, the ‘Galaxy S II’ uses two important components manufactured by Samsung itself. The display, ‘Super AMOLED Plus’, is manufactured by Samsung, and the SoC (System on Chip), i.e. the Exynos 4210, is also manufactured by Samsung. So, Samsung took the help of Nvidia. It launched a Tegra 2 powered version (GT-I9103) of the ‘Galaxy S II’ and named it the ‘Galaxy R’, also called the ‘Galaxy Z’ in Sweden. This European version of the ‘Galaxy S II’ didn’t get a ‘Super AMOLED Plus’ screen and instead settled for an ‘SC-LCD’ display and a reduced 5 megapixel camera instead of the 8 megapixel camera that the original came with. But it was also priced lower as compared to the ‘Galaxy S II’ GT-I9100 version.

In order to compete well in the lower-priced smartphone segment, Samsung released another version called the ‘Samsung Galaxy W’. This doesn’t have a dual-core processor but instead settles for a single-core 1.4 GHz Qualcomm processor with an ‘SC-LCD’ screen and a 5 megapixel camera. It was also priced lower as compared to the original ‘Galaxy S II’ version in order to compete well.

In many of the world markets, 4G technologies were already in place and Samsung also saw an opportunity there. The originally released ‘Galaxy S II’ did not have LTE (Long Term Evolution, a 4G technology) baked into it. So Samsung released variants of the ‘Galaxy S II’ called the ‘Galaxy S II LTE’ and the ‘Galaxy S II HD LTE’ on August 28, 2011. While the ‘Galaxy S II LTE’ has a 4.5 inch Super AMOLED display, the ‘Galaxy S II HD LTE’ sported a 4.65 inch Super AMOLED 720p display.

Samsung saved the Canadian and American markets for the last. In America, carriers (service providers) tie up with mobile companies and sell carrier-specific versions at subsidized rates. Normally the contracts have a duration of 2 years, after which the user gains ownership of the phone. Samsung tied up with the ‘Sprint’, ‘T-Mobile’ and ‘AT&T’ carriers and offered the ‘SPH D710’, ‘SGH T989’ and ‘SGH I777’ respectively. All of these are variants of the ‘Galaxy S II’ smartphone. The ‘T-Mobile’ version was released with a 1.5 GHz dual-core Qualcomm processor instead of the Exynos CPU, as the Exynos CPU wasn’t compatible with T-Mobile’s network. There are other differences between the models but they are minor. Samsung was able to bring the entire ‘United States of America’ under its purview with this release strategy. Currently, Samsung is also selling the ‘Galaxy S II Skyrocket’ for ‘AT&T’ (SGH-I727), which is an improvement over the version that the carrier originally got. Similar to the strategy followed in the US, Samsung also hooked up with carriers like ‘Rogers’, ‘Bell’ and ‘Telus’ to cover the Canadian market. Samsung also launched a ‘Texas Instruments OMAP 4430 SoC’ powered ‘Galaxy S II’, which most people feel might be due to a shortage of ‘Exynos 4210’ SoCs. The ‘OMAP 4430’ chip offers relatively similar performance as compared to the ‘Exynos 4210’ chip.

The point to be noted is that Samsung has successfully been able to penetrate all the major world markets with the myriad variants of the ‘Galaxy S II’ smartphone while not compromising majorly on any of the original characteristics that had made the phone such a popular device. If we consider the BCG matrix, when the ‘Galaxy S II’ was released in April, it was a ‘STAR’ and it has now become a ‘CASH COW’ which Samsung is looking to milk. This also reaffirms the relevance of a popular proverb: ‘Make hay while the sun shines’. :)

As we approach the end of another eventful year, one of the Samsung representatives has said that the world will witness the successor to the ‘Galaxy S II’ which, as you might have already guessed by now, is the ‘Galaxy S III’ (to be released at the Mobile World Congress in Barcelona in 2012). Samsung has surely had a hell of a lot of fun with the ‘Galaxy S II’ this year. Hopefully it will deliver again with the ‘Galaxy S III’.

Stay Tuned..!!

Friday, November 18, 2011

Gorilla Glass, Wi-Fi Direct, DLNA: Smartphone Features

Let's get to know about some more terms today.


  1. Gorilla Glass
    • In order to provide a tough cover glass for electronic devices, which not only provides damage resistance but also has a thin form factor, ‘Corning’, a world leader in specialty glass and ceramics, developed ‘Gorilla Glass’. It is an alkali-aluminosilicate thin sheet glass engineered specifically to be thinner, lighter and more damage resistant. Today, most of the high-end smartphones use this glass to shield their screens from scratches while not hampering the quality of the displays. 
  2. Wi-Fi Direct
    • ‘Wi-Fi Direct’ is a technology that allows Wi-Fi devices to talk to each other without the need for any wireless access points (hotspots). Recently, there was news that 'Wi-Fi Direct' has been included as part of the DLNA specifications.
  3. DLNA
    • DLNA or Digital Living Network Alliance is a non-profit collaborative trade organization established by Sony in June 2003, and has more than 250 member companies in the mobile, consumer electronics, and PC and service provider industries. Many electronic devices including TVs, Smartphones, and Portable Media Players are ‘DLNA Certified’ which means that they can interconnect with each other effortlessly. For example, if your TV and smartphone are ‘DLNA Certified’, then you can display photos from your smartphone on your Television.
Stay Tuned..!!

Thursday, November 17, 2011

Samsung & Apple: Are they Competitors or Co-Operators?


Do you know what has been powering Apple’s iPhones, from the original iPhone released in 2007 to the latest iPhone 4S in 2011? SoCs manufactured by Samsung. Not only for the iPhone, but also for Apple’s Macbooks, Samsung is one of the most critical component suppliers, as it supplies parts like DRAM (Dynamic Random Access Memory) modules and SSDs (Solid State Drives). But do you know who has been suing ‘Samsung’ all over the world for the past few months? Apple. So, are they Competitors or Co-operators?? Let's find out.

Before Samsung, Apple already had a long list of lawsuits filed against Android device manufacturers like Motorola (who actually was the first to file a lawsuit against Apple), HTC etc. as it believed that the technologies being used by the devices seemed to infringe a number of Apple patents.

I believe that the actual fillip for Apple to file a patent lawsuit came when Samsung came out with the ‘Samsung Galaxy S2’, which was touted by the reviewers of many gadget websites as the best Android smartphone available at that point of time, and they foresaw that it would sell in the millions. Also, many tech enthusiasts believed that the Galaxy Tab 10.1 was one of the Android tablets that had come closest to dethroning the iPad in the tablet category.

Apple had filed a host of patent infringement claims: 7 utility patents, 3 design patents, claims over several iOS system application icons, and a host of trade dress registrations covering the packaging that each of Samsung's devices comes with. I am not going to detail what each and every patent infringement case was, but some of the bottom-line things that you need to take note of are as follows:
  • Apple believed that Samsung’s Galaxy range of phones like the Galaxy S 4G and the Nexus S were awfully similar looking to the iPhone.
  • The way Samsung had modified the vanilla Android OS with its ‘TouchWiz’ skin to display application icons also had an overwhelming resemblance to the way Apple was displaying icons on both the iPhone and the iPad.
  • Apple also had a problem with the way Samsung had packaged its phones. It said that the packaging was also very similar to the way in which its iPhones were packaged.
On a funnier note, there was a recent incident where a judge held the iPad 2 (Apple’s tablet) in one hand and a Galaxy Tab 10.1 (Samsung’s tablet) in the other and asked Samsung’s group of lawyers to distinguish the two devices from a distance of 10 feet. Samsung’s lawyers did pick their own device correctly, but not before an awful lot of time had passed. This shows that there is a lot of similarity between the iPad 2 and the Galaxy Tab 10.1, and when you have a look at their external design and specifications it is not hard to fathom why. This might be the reason why Apple has been able to get a temporary injunction on Samsung selling the Galaxy Tab 10.1 in Australia and the Galaxy Tab 7.7 in Germany.

Samsung has also filed some lawsuits against Apple citing that Apple’s iPhone violates patents related to wireless communication: 3G. Samsung claims that Apple can’t sell a 3G compatible device without using Samsung’s Technology.

The irony is that even as Apple and Samsung are trading blows against each other, Apple has reportedly awarded Samsung the contract to supply the next-gen quad-core A6 CPUs which are most probably going to be used in the iPhone 5 (or whatever the next generation iPhone is going to be called) and the iPad 3. Apple had reportedly considered other options like the ‘Taiwan Semiconductor Manufacturing Company’ (TSMC) but had found that TSMC hadn’t quite stabilized its foundry well enough to manufacture the A6 CPU.

So, are they Competitors or Co-Operators? The Question still remains..!!

Stay Tuned..!!

Wednesday, November 16, 2011

Accelerometer, Proximity Sensor, HDMI Mirroring: Smartphone Features

Today, let's get to know the various features that today's smartphones come bundled with.

  1. Proximity Sensor
    • A proximity sensor is a sensor that is able to detect the presence of nearby objects without any physical contact. This is possible through the emission of electromagnetic radiation while looking for a change in the field or return signal. In some of the HTC phones, the proximity sensor is used very well: when the phone rings and you turn the phone over, it automatically goes into silent mode. The other famous application of the ‘Proximity Sensor’ is in car bumpers, where the distance to nearby cars is sensed in order to facilitate parking.
  2. Accelerometer
    • In the context of mobile phones, the accelerometer was primarily used as a tilt sensor for tagging the orientation of photos taken with the built-in camera. Thankfully, it has been extended to other applications like messaging, where tilting the phone switches the keyboard layout between portrait and landscape orientation (see the sketch after this list). The presence of an accelerometer also allows people to play games with amazing ease, especially racing games, where tilting the mobile phone can make cars move from right to left or left to right.
  3. HDMI Mirroring
    • Most smartphones today come with an HDMI port, a mini-HDMI port or a micro-USB port. An HDMI cable is used to connect the smartphone and the HDTV (most of the HDTVs being manufactured today also come with an in-built HDMI port). After successfully connecting the smartphone, you can view content (like games and video) that is being played on the smartphone on the big-screen TV. This feature is called HDMI Mirroring. If the phone has a micro-USB port and doesn't have an HDMI port, an MHL (Mobile High-definition Link) adapter can be used to connect the smartphone and the HDTV.
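Here is the accelerometer sketch promised above: a simplified illustration (my own, not any phone vendor's actual code) of how a raw accelerometer reading can be turned into a screen orientation. Gravity pulls along whichever axis points down, so comparing the x and y components tells us whether the phone is upright or on its side.

def orientation(ax, ay, az):
    """ax, ay, az: acceleration in g along the phone's x (right), y (up) and z (out of the screen) axes."""
    # Whichever of the x / y components carries more of gravity decides the orientation.
    if abs(ax) > abs(ay):
        return "landscape-left" if ax > 0 else "landscape-right"
    return "portrait" if ay > 0 else "portrait-upside-down"

print(orientation(0.02, 0.98, 0.10))   # held upright        -> portrait
print(orientation(0.97, 0.05, 0.20))   # tilted on its side  -> landscape-left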
Stay Tuned..!!


Tuesday, November 15, 2011

The Truth about the 'Core'


So, do you absolutely have to have a dual-core phone to enjoy the best experience on a smartphone? The question might be a one-liner but the answer is not quite simple.

You need to take note of the fact that most of the dual-core phones being launched in the market right now are actually running the same operating system: Android. Is it a coincidence?? Probably not. To know the reason, you need to understand the companies a little more deeply.

Most of you opine that Apple’s phones are cool. I know many of you use it for the kind of oohs and aahs that it draws from the surrounding crowd rather than truly knowing why you need it. But the kind of image that Apple has developed in the minds of consumers has a good grounding.

Apple is responsible for both of the components of its smartphone and tablet devices: the ‘Hardware’ and the ‘Software’. Yes!! Apple manufactures its own hardware (though it procures individual components from other manufacturers, it is ultimately responsible for its own hardware) and also the software: the iOS operating system. The software isn’t open-source. It is proprietary and controlled only by Apple. So, Apple, in control of both the components that make up its smartphones, knows how to design its software to complement its hardware and how to design its hardware to complement its software. So, it is able to offer a complete package and hence you can feel that in all of its phones. Usually, you see zero lag while operating Apple’s phones, as the software is optimized for the underlying hardware, and that hasn’t warranted a more powerful smartphone. Hence you did not find a dual-core processor in an Apple phone (until the iPhone 4S) when other manufacturers were going hammer and tongs releasing dual-core smartphones. In summary, you have only 1 kind of software, i.e. iOS, and it only runs on one kind of hardware, i.e. Apple’s devices. This approach works for Apple as they have full control over the final output. They have generated, are generating, and will generate huge profits with such phones, but the market reach is going to be less.

Now, take the example of Android. It is an open-source operating system and the stock build of Android is delivered by Google. Google is only responsible for the software part; manufacturers are free to customize ‘Android’ to suit their own phones. Samsung has its own skin called ‘TouchWiz’, HTC has its own skin called ‘Sense’, Motorola has its own skin called ‘Motoblur’ and so on. These different flavors of Android are not optimized extensively for the underlying hardware. In fact, ‘Froyo’ (Android 2.2) and early versions of ‘Gingerbread’ (Android 2.3) don’t even recognize the true power of dual-core smartphones. It is only the latest version, Android 2.3.5, which can actually take advantage of the dual-core power being offered by the smartphones. The other build of Android, ‘Honeycomb’ (Android 3.0), also supports dual-core but that operating system is for tablets. The latest build of Android, ‘Ice Cream Sandwich’ (for both smartphones and tablets), will hopefully have plenty of dual-core optimizations.

But, there are problems for ‘Android’ which is actually making it compulsory for manufacturers to make dual-core smartphones:
  • The software is being run on low-power hardware (some low-powered smartphones) which is allowing lag to creep in.
  • Very few manufacturers use the vanilla version that Google releases; most develop their own versions with heavy customization which, according to most of the experts, actually slows down the performance of ‘Android’. This is also leading to the problem of ‘Fragmentation’, due to which ‘Android’ does not present a consistent user interface across all the smartphone devices.

In summary, there are several versions of the software (the various ‘Android’ flavors) and several kinds of hardware (from the various smartphone manufacturers). This is not to say that such phones are bad. Manufacturers are releasing their own flavors of Android because they are adding extra features to make their phones unique. For example, you can find beautiful 3D transition effects in ‘HTC Sense’ that are unavailable in the stock builds that Google releases. In fact, such beautiful effects are even absent in the iPhone. But it is just that, with such a wide variety of hardware and software, ‘Android’ is not able to utilize single cores fully, which is pushing manufacturers to go for dual-core smartphones.

Similar to iOS, Microsoft’s OS for mobiles, Windows Phone, also doesn't mandate a dual-core smartphone. Microsoft is not exactly responsible for the hardware on which its software runs, but it has stringent requirements for the underlying hardware and all the companies which use this OS have to adhere to those requirements. It has also designed its OS in such a way that it does not necessitate a dual-core smartphone for providing a high-quality end user experience. Eventually, Windows Phone will also get dual-core processors, but that will be mostly for providing extra features.

In all the mobile operating systems, ‘Applications’ or ‘Apps’ as they are being popularly called, are the core. Absence of applications is what led to the downfall of ‘Web OS’ which is another great operating system. Thankfully, developers are also realizing the fact and they are developing games that can actually take advantage of the dual-core power of the smartphones. So, you won’t be able to play HD games on your single-core smartphones as they are being designed keeping in mind the raw power of dual-cores. Also, features such as shooting 1080p video @ 30 fps and the ability to connect smartphones to HD Televisions and play 1080p content are also only available on dual-core smartphones.

So, if you are going to use a mobile phone and do not value ‘Geeky’ features like those listed above, single-core smartphones will be more than enough to satisfy all your wants. If you want proof, you can check out the latest Nokia Windows smartphone: the ‘Lumia 800’. It is powered only by a single-core 1.4 GHz Qualcomm processor but it has a very fluid and gorgeous user interface. But, if you do buy a dual-core Android smartphone, you know that it is not going to go to waste. :)

Stay Tuned!!!

Monday, November 14, 2011

Quad-Core Smartphones: Exynos 4212, OMAP 4470 details

If you have been following my blog closely, I said that I would discuss the 'OMAP 4470' in another post in conjunction with the 'Nvidia Tegra 3' (in 'The 'Core' Enigma (Contd.) - 1' post). As we are done with the 'Tegra 3' let's now get down to the 'OMAP 4470' processor.

As you already know by now, I had mentioned that 'Texas Instruments' manufactures 3 SoCs: OMAP 4430, OMAP 4460, and OMAP 4470. Well, it actually manufactures another SoC: OMAP 4440, but it is only a minor upgrade from the OMAP 4430. So, why is 'OMAP 4470' special??

The 'OMAP 4470' uses an architecture which is very similar to the one found on the 'Tegra 3'. The 'OMAP 4470' is actually a dual-core processor (2 ARM Cortex A9s) but in addition to these 2 cores, it contains 2 other cores (2 ARM Cortex M3s). While the Cortex A9s are clocked at a maximum frequency of 1.8 GHz, the Cortex M3s are clocked at a maximum frequency of 266 MHz. The Cortex M3s kick in when less CPU-intensive tasks are running, like 'Playing Music', 'Editing a document' or 'Checking E-Mail'. But once you start playing 'HD games' on the smartphone device (inside which the SoC is present), the Cortex A9s make their presence felt. Very impressive architecture indeed..!!! :D

As far as the GPU is concerned, Texas Instruments went with the leader in the GPU space: 'Imagination Technologies'. I have already mentioned that 'Imagination Technologies' currently produces some of the best GPUs in the market. The GPU inside the 'OMAP 4470' is the 'PowerVR SGX 544' (surprisingly, this is a single-core GPU). Supposedly, it offers a tremendous improvement in GPU performance as compared to the other OMAP SoCs. Now, as with the CPU, there is an additional 2D graphics core that kicks in when less graphics-intensive tasks are performed; the 'PowerVR SGX 544' actually kicks in only when applications which demand better graphics performance are run. So, there is a by-pass for the GPU too. Wonder what all crazy steps are being taken by SoC makers for saving battery life. :-|

Anyways, the 'OMAP 4470' is actually being touted as a competitor to the 'Tegra 3' and, given its architecture, it's hard to see why it won't offer performance similar to the 'Tegra 3'. And don't worry, Samsung is out to compete with both of them with the 'Exynos 4212', and Qualcomm won't be far behind with its development of the 'Snapdragon S4'.

With this post, I will be shutting up on CPUs and GPUs :). In the next post, I will tell you whether you absolutely need to have a 'Dual-Core' smartphone to enjoy its features completely.

Quad-Core Smartphones: Nvidia Tegra 3 Details


So, we have talked about ‘Single-Core’ and ‘Dual-Core’ smartphones and tablets. You might be thinking ‘These are soooo Yesterday!! Tell us something that we don’t know’. Well, I’ll oblige by telling you that 2012 will be the year of ‘Quad-Core’. Yes!! You heard it right!!! Most of the market leading SoC manufacturers will come out with their ‘Quad-Core’ creations in the next year and these SoCs will have performance that is multiple times their previous generations’.

If you have followed this blog closely, you might have seen the mention of two devices: the ‘Asus Eee Pad Transformer Prime’ and the ‘HTC Edge’. Both will be sporting an ‘Nvidia Tegra 3’ (Project Kal-El) CPU which is, as you rightly guessed, a quad-core processor. I have also highlighted the display panels used in both of them in ‘The Display Dilemma’ blog posts below: a ‘Super IPS+’ panel in the ‘Transformer Prime’ whereas an ‘SLCD-2 optically laminated display’ in the ‘HTC Edge’. So, they both have industry-leading displays and processors. So expect these to sell like hot cakes. The ‘Transformer Prime’ will be hitting stores next month and the ‘HTC Edge’ will make its appearance in Q1 2012.

So, let’s get down to the actual dirt: the ‘Nvidia Tegra 3’. The ‘Nvidia Tegra 3’ is, as you rightly expected, a successor to the highly successful ‘Nvidia Tegra 2’ and ‘Tegra 3’ has received a significant performance boost over the ‘Tegra 2’. Nvidia has detailed that the Tegra 3’s processor is 5 times as fast as the Tegra 2’s and the Tegra 3’s GPU is 3 times as fast as the Tegra 2’s. So, let’s find out how this is possible!!

Tegra 3 has got 4 ARM Cortex A9s (a step up from the 2 ARM Cortex A9s in Tegra 2) and therefore gets its name as a ‘Quad-Core’ SoC. Nvidia actually had a surprise for the general audience when it announced the ‘Tegra 3’ SoC. It not only had the above-mentioned 4 ARM Cortex A9s (which can run at a maximum clock speed of 1.3 GHz when running simultaneously and at a maximum clock speed of 1.4 GHz when only one of them is active), but also had another ARM Cortex A9 clocked at a lower frequency of 500 MHz. So actually, ‘Tegra 3’ has got not 4 but 5 cores. The reason for including an additional ‘Companion Core’ (ARM Cortex A9 at 500 MHz) was simple. Normally, when you increase the cores on the SoC, it tends to draw more power, which leads to loss of battery life. Most basic tasks like ‘Playing Music’, ‘Editing a document’, ‘Checking e-mails’ etc. do not require the power delivered by 4 cores. So utilization of 4 cores in such scenarios would be pointless. Nvidia thought that they could take care of these tasks with the ‘Companion Core’ and let the other ‘Quad Cores’ come into the picture only when the applications demanded it. For example, if you are trying to watch an HD video or trying to play a game on your TV by connecting your smartphone or tablet to it, the performance of the ‘Quad Cores’ is required. So in such scenarios, the ‘Companion Core’ would quietly hand over control to the other 4 cores. So, as you would have got by now, at any point of time only one of them would be active: either the ‘Companion Core’ or ‘The other Quad Cores’. This type of functioning has been branded ‘Variable Symmetric Multi Processing (vSMP)’ by Nvidia. I remember using such terms to impress my professors while taking exams in MBA. ;-)
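If it helps, here is a very rough sketch of that vSMP idea in code. It is my own simplification for illustration (the threshold and the load scale are made up), not Nvidia's actual scheduler: light workloads run on the low-clocked companion core alone, and the four main cores take over only when the load demands it.

def active_cores(workload_load):
    """workload_load: rough CPU demand, from 0.0 (idle) to 4.0 (all four main cores busy)."""
    COMPANION_THRESHOLD = 0.5   # below this, the 500 MHz companion core is enough
    if workload_load <= COMPANION_THRESHOLD:
        # Light tasks (music playback, e-mail): only the companion core is powered.
        return ["companion core @ 500 MHz"]
    # Heavy tasks (HD video, gaming): the companion core hands over to the quad cores.
    return [f"main core {i} @ 1.3 GHz" for i in range(4)]

print(active_cores(0.2))   # e.g. playing music -> ['companion core @ 500 MHz']
print(active_cores(3.5))   # e.g. HD gaming     -> the four main cores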

Right. The other major difference between the quad-core processor cores in Tegra 3 and the dual-core processor cores in Tegra 2 is that each of the cores (in the quad-core cluster, not the Companion Core) is equipped with a ‘Media Processing Engine’ (MPE). This MPE is absent in the dual-core CPU of Tegra 2. The MPE will not only allow users to watch video in virtually any format but will also support voice recognition.

Nvidia claims that it has been able to significantly improve the battery life of the smartphone or tablet using such an architecture. The other important thing to be noted is that even the GPU has 12 cores. But the actual meaning of a ‘Core’ in a GPU is different from that of a ‘Core’ in a CPU. The only thing to be remembered is that the GPU in ‘Tegra 3’ is clocked at a higher frequency and delivers more performance than the GPU in ‘Tegra 2’, which will allow you to play games, that would have needed a graphics card in your laptop or desktop, with consummate ease.

So that is it about the CPU and the GPU of the Tegra 3 SoC. At the time of writing this post, I have also got to know that 'Lenovo' and 'Acer' are also planning to come out with their Tegra 3 offerings very soon. Looks like 'The Transformer Prime' won't be alone after all. Stay tuned..!!!