Sharp’s press conference has just finished at CES 2013 and it included one of the most important technologies of 2013 for mobile devices and power consumption: IGZO.
As CPU, GPU and comms power requirements fall, the screen becomes a very big, power-hungry issue. IGZO screen technology could really help extend in-use battery life and that’s why we’re watching it closely.
The adoption of Micro-USB as a charging port standard on smartphones was a breakthrough for the consumer but we’re still left with the same problem on our other portable electronics. Example: I just spent 10 minutes searching for a power adaptor for a tablet here in the office. What if we could do the same for portable electronics as we did for smartphones? Power over USB (or USB Power Delivery as it’s officially known) looks like it could be the answer.
We’ve been banned from photography and video here at the first Haswell tech talk at IDF 2012, which means there’s bound to be some interesting slides up on screen. Here are my notes from the session – they focus on very deep features of the architecture.
Intel have just presented details of a Wireless Charging solution for Ultrabooks at IDF Beijing. They see a big potential market and look to have made good progress on a hardware/software solution that takes into account power considerations and regulatory requirements. Patents have been applied for!
Early in January I put forward an article which highlighted the differences between the ‘ultra low voltage’ CPUs you get in Ultrabooks and the ‘low voltage’ CPUs you get in many laptops. I gave some comparison figures for two devices in different usage scenarios by measuring ‘system’ power drain, and it was only in the high-end tests that we saw the ULV processor being significantly more efficient. In this article I continue the testing and compare the LV and ULV cores directly. The results are below.
Measuring ‘system’ drain on two different systems isn’t the most scientific of tests so a discussion broke out in the comments about how we could measure a true difference in efficiency between ULV and LV processors and whether it could be possible to run low-voltage processors at slower clockrates and get the same efficiency as a ULV processor.
The theory says ‘No.’ If you run a CPU at the same frequency but with a higher voltage, the power usage goes up.
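The theory above comes from the classic CMOS dynamic-power relation, P ≈ C·V²·f. A minimal sketch of the effect, with made-up capacitance and voltage figures (not measured values for any real Intel part):

```python
# Illustrative sketch of the dynamic-power relation P = C_eff * V^2 * f.
# C_EFF and the voltages below are assumptions for illustration only.

def dynamic_power(c_eff, voltage, freq_hz):
    """Classic CMOS dynamic power: P = C_eff * V^2 * f (watts)."""
    return c_eff * voltage**2 * freq_hz

C_EFF = 1e-9  # effective switched capacitance (farads), illustrative

# Same 1.3 GHz clock, two different supply voltages:
p_ulv = dynamic_power(C_EFF, 0.9, 1.3e9)  # ULV-style lower voltage
p_lv  = dynamic_power(C_EFF, 1.1, 1.3e9)  # LV-style higher voltage

# At the same frequency, the higher voltage costs (1.1/0.9)^2 ~ 1.49x the power.
print(f"ULV-style: {p_ulv:.2f} W, LV-style: {p_lv:.2f} W, ratio {p_lv/p_ulv:.2f}x")
```

The V² term is why matching a ULV chip’s clockrate on an LV chip doesn’t match its efficiency: the LV part still runs at the higher voltage.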
Giving battery life figures for any modern notebook will always be a difficult task due to the huge dynamic range of the mobile platforms used in notebooks today. Take the Samsung 900X1B I’m testing at the moment. It’s not an Ultrabook but it’s built on the same platform to the same dimensions, and it will run at anything between 2.8W drain and over 10x that figure. In this article I’ll give you some results from some fairly detailed testing I completed today. In summary, the Samsung 900X1B has excellent battery life for its size, weight and battery capacity. It’s efficient.
The Samsung 900X1B runs on an Intel Core i3-2357M platform. It has a nominal 1.3GHz clock speed but can speed-step down to 800MHz. There’s an Intel HD 3000 graphics unit, a video encode/decode unit and, of course, lots of busses, components and connectivity. There’s also an 11.6″ 1366×768 screen with a relatively powerful back-light. The battery capacity is 42Wh. [I recently corrected our database, which showed an incorrect 40Wh.]
The tests I did today were aimed at finding how low the platform can idle and what sort of drain you can expect some of the components to add, all the way up to maximum power usage.
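Given the 42Wh battery, drain figures map straight onto runtime. A quick sketch using the article’s 2.8W idle floor plus assumed mid- and heavy-load figures (the 10W and 28W numbers are illustrative, not measurements):

```python
# Runtime estimate from battery capacity and average system drain.
# CAPACITY_WH matches the 42Wh figure above; the 10W and 28W drain
# values are assumptions for illustration.

CAPACITY_WH = 42.0

def runtime_hours(drain_watts):
    """Hours of runtime at a constant average drain."""
    return CAPACITY_WH / drain_watts

for drain in (2.8, 10.0, 28.0):  # idle floor, mid load, ~10x idle
    print(f"{drain:5.1f} W -> {runtime_hours(drain):5.1f} h")
```

This is why the dynamic range matters so much: the same machine spans roughly 15 hours down to about an hour and a half depending on load.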
The two PDFs focus on the benefits of high-performance graphics and multiple cores in mobile computing. While I’m yet to be convinced that I need 1080p decoding and gaming graphics on my mobile computer, I do see that improved user interfaces and acceleration of some elements of web page and web application processing are beneficial. After reading the reports I’ve also come away with positive thoughts about multicore computing as a way to save battery life. The theory is simple – higher clockrates need higher voltages, and power rises sharply with voltage, so running two cores at a lower clock to complete the same task can result in power savings.
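The theory can be sketched as a back-of-envelope comparison: one core at 2GHz versus two cores at 1GHz finishing the same perfectly parallel job. The voltages and capacitance are illustrative assumptions, not figures for any real chip:

```python
# Back-of-envelope: 1 core @ 2 GHz vs 2 cores @ 1 GHz for the same
# perfectly parallel workload. Per-core power P = C_eff * V^2 * f.
# All constants below are assumptions for illustration.

C_EFF = 1e-9  # effective switched capacitance per core (F), illustrative

def power(voltage, freq_hz, cores=1):
    """Total dynamic power across identical cores."""
    return cores * C_EFF * voltage**2 * freq_hz

# Assume the lower clock allows a lower supply voltage.
p_one_fast = power(1.2, 2e9, cores=1)  # 1 core, 2 GHz at 1.2 V
p_two_slow = power(0.9, 1e9, cores=2)  # 2 cores, 1 GHz at 0.9 V

# Same total cycles per second, but the V^2 term favours the slower pair.
print(f"1x2GHz: {p_one_fast:.2f} W, 2x1GHz: {p_two_slow:.2f} W")
```

Note the hidden assumption doing all the work here: the job must actually parallelise. A single-threaded task on the slower cores just runs longer, which is exactly the caveat discussed below.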
In podcast 63 at Meetmobility, Al Sutton of Funky Android, an Android consulting company, highlighted why he thought Honeycomb would appear on phones. His theory is based on the fact that Honeycomb is the first version of Android to be built with multicore platforms in mind, and superphones will therefore benefit. The Dalvik environment that applications run in is multicore-aware and will attempt to use multiple cores to speed up (and lower the power cost of) jobs that run in it. That feature alone could help every application running on Android without any programming changes in the application. With smartphones heading in the multicore direction, Honeycomb brings advantages, and unless there’s a new multicore-aware version in the 2.x branch, Honeycomb could be the way to go for multicore smartphones.
So why don’t silicon experts Intel use multiple cores in their Moorestown platform? The platform runs up to 1.8GHz, I understand, so wouldn’t it be better to run two cores at, say, 1GHz? Silicon cost, size and complexity are probably in the equation, and there’s probably a marketing advantage in a higher clockrate, but you would think that if this more-cores-at-lower-clock theory were true, Intel would be doing it too, considering how badly they want to get into smartphones. Perhaps it is because much of the software out there isn’t truly multi-threaded and the advantages are limited. Where a program runs on multiple cores at a lower clockrate but only utilises one, the operation takes longer, the system can’t get into an idle state as quickly, and the power used is far higher. Just leaving the WiFi and screen on for a little extra time will negate any potential advantage.
It’s complex stuff but my feeling right now is that multiple cores are going to bring advantages. We’ll see, in time, if the Honeycomb-for-superphones theory is correct and we’ll see if Intel goes that route for Moorestown and Medfield too.
For the last two nights I’ve been testing the ‘standby’ battery life on the Toshiba AC100. [Unboxing and overview video here] On the first test the battery was at about 30% capacity. I closed the lid and expected to have plenty of battery life left in the morning. When I woke up, the AC100 was dead. On the second test the battery was again at 30%. This time I turned the WiFi off before closing the lid. In that scenario I’d expect next to no drain at all. Again, when I woke up 7 hours later, the device was dead. Something’s wrong.
Looking at the battery information I’m seeing something strange.
Can you spot the issue on my WiFi-only AC100?
Yup, somehow the 3G subsystem is draining power which is really quite strange considering I don’t have 3G on this device. Have they left the 3G radio on the device and just removed the SIM slot? Have they forgotten to turn the 3G off in the firmware? Does ‘cell standby’ actually mean something else? I can’t imagine another subsystem in the AC100 that would take more power than the screen and Wifi. On my Android phone here, cell standby is taking only 9% of the power. When the firmware contains strings like ‘eng/test-keys’, commonly found on test builds, you’ve got to wonder what’s going on.
I’ll have to raise a support issue on this with Toshiba Europe.
Note: 12mins later, the graph was still the same. Cell Standby is taking 77% of the battery drain. Going to ‘airplane mode’ doesn’t appear to help.
Note: 30 mins later and ‘cell standby’ is up to 81%.
Anyone else experiencing the same on their AC100? (Is there anyone else out there with an AC100?)
Update: Just to be clear – active battery life is around the 6hrs mark (50% screen, wifi on) so there’s no problem with that. I’ve also found a lot of threads on forums that question the ‘cell standby’ measurement. One response says it’s a known issue in Android 2.1. Currently manually measuring screen-off drain.
In a third test last night I went to bed with about 60% drain. I woke up with 20% left – and the screen on. Something is turning the screen on and causing the drain. Have now done a factory reset to remove any of my sideloaded apps that may be turning the screen on. I’ll do another overnight test tonight.
Update: 15:35 – 31 August.
With a fresh factory reset I’ve been testing the battery life over the last few hours.
With screen off, wifi on, idle, no USB subsystem, no SD card, I’m seeing 6 mins per 1% battery drain. That’s really not that good – 2.4W average drain. I’m expecting more like 1W.
With screen off, AIRPLANE MODE, no USB subsystem, no SD card, idle, I’m seeing 13 mins per 1% battery drain. That’s 1.14W drain, which is terrible for an ARM system. A smartphone with screen off and airplane mode would take about 20–50mW. Remember, the AC100 effectively has smartphone internals, so when you turn the screen off there should be no difference (I’ve turned the USB host subsystem off and removed the SD card to take those out of the equation.) Something is sapping over 90% of the battery – which brings us back to the cell subsystem which, after these tests, was taking 84% of the power according to ‘battery status’ under Android. At this stage I’m tempted to pull it apart. Will I find a surprise 3G module inside?
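Working backwards from the 2.4W and 1.14W figures, these minutes-per-1% readings imply a battery of roughly 25Wh; that assumed capacity is used in this sketch of the conversion:

```python
# Convert "minutes per 1% battery" readings into average drain in watts.
# CAPACITY_WH is an assumption inferred from the figures in the post,
# not a manufacturer specification.

CAPACITY_WH = 25.0

def drain_watts(minutes_per_percent):
    """Average drain implied by draining 1% of capacity in the given time."""
    wh_per_percent = CAPACITY_WH / 100.0
    return wh_per_percent / (minutes_per_percent / 60.0)

print(f"wifi on : {drain_watts(6):.2f} W")   # close to the ~2.4 W measured above
print(f"airplane: {drain_watts(13):.2f} W")  # close to the ~1.14 W measured above
```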
Update 2: 31st August.
I won’t be doing any more review work on the AC100 until I get to the bottom of this power issue, because it’s a huge problem that takes away the main reason to have the device in the first place. ARM-based devices do a good job at ‘always on’. Take the Archos 5, for example. It’s a Cortex-based Android device, and just 30 minutes ago I checked some stats on it. It’s been sitting on my desk in a screen-off, wifi-off state for 4 days and 8 hours and, get this, it has a battery that’s less than half the size of the battery in the AC100. Not only that, there’s 45% of the battery left. That’s under 50mW of drain – 20x less than the AC100. There’s the problem with the AC100!
Update 3: 5th Sept.
Toshiba Germany tell us that Froyo will be delivered in 6 weeks (Mid October) for the AC100. We have also reported the details of the above issue directly to the German product manager.
If you’re into the discrete component level of MIDs, you might want to check out this press release from NEC.
NEC Electronics America, Inc. today announced that it will highlight its power management IC (PMIC) technology at the International Consumer Electronics Show (CES®), January 8 – 11, 2009. The live demonstration will feature NEC Electronics’ PMIC solution for Intel’s next-generation Mobile Internet Device (MID) platform, codenamed Moorestown.
NEC were involved with the power management ICs on Menlow, so this is nothing new for them. For us, it might be a chance to see a Moorestown demo if someone can get themselves over to the NEC booth and register for a private demo.
I recall seeing this solution a while back on JKKMobile, but Paul from MoDaCo has now got one for testing. It’s a 47Wh battery pack for the HTC Shift from Mugen which should allow about 5hrs of online time, thus solving one of the HTC Shift’s biggest problems. It’s a tidy solution but at nearly $240, it’s a huge amount to pay for 5hrs of computing time.