2304 x 1440

The rumored 12″ retina MacBook with a newfangled trackpad and fanless design is as exciting a rumor as the iPad mini was. But there’s one thing I’m questioning about the reports: the 2304 x 1440 resolution. That would be 1152 x 720 logical pixels, the smallest logical resolution of any Apple display in recent history. Even the original PowerBook G4 was 1152 x 768; only 1024 x 768 displays have less room. The panels in the current 13″ and 15″ retina models are both in the 220-ish dpi range, and at 12″ this proposed resolution would be too.

The non-retina 11″ Air actually has the highest dpi of any of Apple’s non-retina products, and doubling that from 132 to 264 would still be far below iPhone / iPad mini territory (although I would love a laptop at that dpi too). That also doesn’t account for the increase in size from 11 to 12 inches, which is especially interesting considering just how much bezel the 11″ has today. Dialing up just a tad from there to 283 dpi would give the same 2880 x 1800 resolution as the 15″, which would definitely eke this product into “pro” territory and, most importantly, offer a “looks like 4K” scaling mode, which would be nice since my 20/10 eyes have no trouble with 1080p points at 12″. The 3840-wide mode on the 15″ works out to around 300 dpi virtual, so a physical 283 dpi would still be lower than that virtual resolution.
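To sanity-check the numbers, here’s a quick back-of-the-envelope calculation in Python (assuming an exactly 12″ diagonal, which is itself part of the rumor):

    import math

    def dpi(width_px, height_px, diagonal_in):
        # Pixels along the diagonal divided by the diagonal length in inches.
        return math.hypot(width_px, height_px) / diagonal_in

    # Candidate 12" panels (hypothetical)
    for w, h in [(2304, 1440), (2560, 1600), (2880, 1800)]:
        print(f'{w}x{h} @ 12": {dpi(w, h, 12):.0f} dpi')

    # Shipping retina Macs for comparison
    print(f'13" rMBP 2560x1600: {dpi(2560, 1600, 13.3):.0f} dpi')
    print(f'15" rMBP 2880x1800: {dpi(2880, 1800, 15.4):.0f} dpi')

That puts 2304×1440 at about 226 dpi, right in line with the shipping retina panels, and 2880×1800 at the 283 dpi mentioned above.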

So, predictions from most to least likely for resolution:

  • 2 × 1280×800 = 2560×1600
  • 2 × 1440×900 = 2880×1800

I think the extra “inch” will be used to make the laptop taller and 16:10 instead of 16:9. I don’t expect any resolution we haven’t seen before because the math works out so well with the scaled modes we currently have.

My original 11″ Air and original 15″ retina are both fine machines in perfect working order, but I will continue to dream of a machine that combines the best of both worlds (the retina 13″ is what I’d call a ‘compromise’). I hope I’m not too disappointed this fall.

HDMI Dongles

As Amazon and Apple prepare new TV boxes that likely just spit out HDMI, there’s of course speculation that their form factor will be an HDMI stick like the Chromecast.

There are a few problems with this concept and this form factor in general:

  • Power over HDMI isn’t here yet, so an ugly USB or other additional power cable is necessary.
  • Being behind the TV requires Bluetooth control. While IR is awful for many valid reasons, at least it doesn’t need to be paired. And yes, I do expect a real remote from an Apple device. The 6-way remote is what makes the AppleTV easier than a Mac Mini or an iOS device hooked up via HDMI.
  • Too easy not to fit. The original iPod Shuffle blocked adjacent ports or didn’t stick far enough into recessed USB ports. Even if the device isn’t too fat to be a problem, and the HDMI male end sticks out far enough for all TVs, it could still be too long. I have sets that barely fit the old Monoprice HDMI cables with ferrite cores on them (and those are flexible).
  • Apple likes their products to be seen. They want your visitors to see the black puck (in a prominent location because it needs to receive IR) and ask what it is. It can’t do that behind the TV.
  • Apple doesn’t need to shrink the puck so much that they can’t pack an A7X into it.

I’m still of the belief that the reason MFi Bluetooth controllers with the extended layout require player 1-4 LEDs is that games are coming to the AppleTV. At retina iPad resolution, the A7X is already driving games at sizes greater than 1080p (albeit with simpler assets than console games), and it’s probably at most two generations away from supporting a 4K rendering context (again, with simpler assets than console games… for now).

The AppleTV doesn’t need to get smaller. It needs to get more powerful and maintain a $99 or $199 price point. If Apple does go in the dongle direction, I’d say that will be the final nail in the coffin for an AppleTV app store.

More Emergency Power

Eton has apparently announced, but is not yet shipping, new emergency products that are upgrades of sorts to the ones I’ve tried so far.

This summer I tried the Eton FRX2 and FRX3, emergency hand-crank and solar radio / LED flashlight / USB charger combos, and my biggest complaint with both models was the tiny (at least user-replaceable) NiCad battery. The not-yet-shipping new models (FRX4 and FRX5), with a similar form factor, aim to solve this with lithium-ion batteries. I couldn’t find a capacity listed anywhere, but Eton claims they can give a 50% charge to “most smartphones”, whatever that means. (The FRX2 and FRX3 don’t technically support USB charging unless you’re actively using the crank.) The MSRPs are also around double… I hope it’s worth it.

Although I just use a 12V lead-acid scooter battery and clamp on what I need with alligator clips, exposed wiring and alligator clips aren’t for everyone. The reason I’ve been following Eton’s products is that they’re something I can potentially recommend to my family. Even with my family using only LED flashlights and lanterns, after the first four days of power loss from Hurricane Sandy, replacement batteries were getting impossible to come by, and my power stayed out for another week.

In other news, we’ve added a Prius c to our lineup of cars. The advantages of a hybrid drivetrain in emergency situations were meaningful: not burning gas idling for an hour in gas lines (yes, that happened), no worries about draining the bigass battery with 200 watts of devices plugged into it, and the fact that a 9.5 gallon tank is 450+ miles of range… I couldn’t resist getting another. Unfortunately, neither my apartment nor my job can accommodate charging an EV or plug-in hybrid, and a straight EV would be quite screwed in a two-week outage anyway.

Update:
The FRX4 has a 1,000 mAh battery and the FRX5 a 2,000 mAh one. Some iOS device capacities:

Device         mAh
iPhone 5S       1,440
iPad Mini 1     4,440
iPad Mini 2     6,471
iPad 2          6,944
iPad Air        8,827
iPad 4         11,560

So the bigger FRX5 should have no problem recharging a single iPhone on a daily basis on solar power alone. All other combinations of Eton and Apple devices will require cranking (not that that’s necessarily the end of the world). Alkaline AAA batteries (of which it accepts three) are typically 860-1,200 mAh, so three are 2,580-3,600 mAh total, but I would avoid wasting non-rechargeable cells on that; keep your AAAs for your Maglites.
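Some quick math on what those capacities buy you (a Python sketch; the 80% transfer efficiency is my guess, and comparing mAh directly glosses over the voltage conversion between the pack and the device battery):

    PACKS = {"FRX4": 1000, "FRX5": 2000}  # mAh, per the update above
    DEVICES = {"iPhone 5S": 1440, "iPad Mini 1": 4440, "iPad Air": 8827}  # mAh

    EFFICIENCY = 0.8  # assumed charging losses

    for pack, pack_mah in PACKS.items():
        for device, dev_mah in DEVICES.items():
            fraction = pack_mah * EFFICIENCY / dev_mah
            print(f"{pack} -> {device}: {fraction:.0%} of a full charge")

By that estimate a full FRX5 holds roughly one iPhone 5S charge, while even it would only put about 18% into an iPad Air.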

Brief: Thoughts on SmartWatches

I’ve worn face watches ever since fourth grade: cheap sub-$50 ones that became increasingly difficult to replace the batteries on. When one of my watches finally lasted only about three years, I decided to try using only my iPhone (then a 3G) for checking the time. I stayed that way until January of this year, when heated speculation about “wearable computing” from Apple made me decide to get used to wearing something on my wrist again. A month later, I’ve found I’m checking my wrist for things it can’t tell me: the current weather, and the Find Friends location of someone I’m expecting. I actually get slightly annoyed when I have to pull my phone out of my pocket to check these things. So, is there room for some sort of device on your wrist with more info than a watch? Probably. But it would have to have superb typography (which is why I wear a face watch), probably e-ink with Paperwhite-style lighting, and be really, really small, which is why I’ve never even entertained the existing products on the market. The screen of the (6th?) generation iPod nano “watch” is the right size, but there’s too much device around it, though not beneath it: the screen is about the size of the Shuffle, but zero bezel on all sides is probably impossible, and without the clip it’s already thinner than my watch. Still, I’d rather see a display similar in quality to the second-generation Kindle Paperwhite than some magical LCD that works in direct sunlight without blowing out the battery.

Don’t Worry, Apple isn’t done showing us 4K

My perfectly functioning 2000 G4 Cube, 2003 12″ PowerBook G4, dead original iPod, original 2010 11″ MacBook Air, iPhone 5S, and 2013 Mac Pro

I finally received my 2013 Mac Pro (I guess I shouldn’t say “finally” when most of you won’t get yours until February). Not content with the 4K offerings from Dell and Sharp, I tried it on my Seiki 55″ UHDTV, then went back to my existing 24″ LED Cinema Displays.

Even after enabling HiDPI modes using Quartz Debug, System Preferences (and the EyeFriendly App Store app) would only see the TV as 3840×2160. I had to use the trusty RDM tool to run the display at 1080p HiDPI. When I did, by the way, I sat there for 40 minutes looking at my vacation photos for what I’d argue might be considered the “first” time. Wow. 4K is far from “unnecessary”.

This is only half the reason I think Apple has more 4K to show us. The real reason is that when I plug the first-generation retina 15″ into ANY display, RDM gives me ALL of the “More Space” resolutions, from 1024×640 HiDPI to 1920×1200 HiDPI, at 16:10. It also offers 1080p HiDPI, 4096×2160, and 2048×1080 HiDPI, plus two more interesting ones. The first is 2048×1280 (low DPI only), a 16:10 resolution not quite as high as the 2560×1440 of the 27″ displays, but not as low as 1920 wide either. The second is 4096×2560, which at 60 Hz would be 15,099,494,400 bits per second in 24-bit mode; displays are moving toward 30 bits, though, where it would unfortunately be 18,874,368,000 bits per second, above DisplayPort 1.2’s 17.28 Gbps. Both 3840×2400 (my preferred resolution) and 4096×2160 “Cinema 4K” fit in DP 1.2 at 60 Hz.
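Here’s that arithmetic as a quick Python check (active pixels only; real display timings add blanking overhead, so actual link requirements run somewhat higher):

    DP12_GBPS = 17.28  # DisplayPort 1.2 effective data rate (4 lanes of HBR2)

    def gbps(w, h, bpp, hz=60):
        # Uncompressed bandwidth: width x height x bits per pixel x refresh rate.
        return w * h * bpp * hz / 1e9

    for w, h in [(4096, 2560), (3840, 2400), (4096, 2160)]:
        for bpp in (24, 30):
            rate = gbps(w, h, bpp)
            verdict = "fits" if rate <= DP12_GBPS else "too big"
            print(f"{w}x{h} @ {bpp}-bit, 60 Hz: {rate:.2f} Gbps ({verdict})")

Only 4096×2560 at 30 bits (18.87 Gbps) blows the budget; everything else, including 3840×2400 at 30 bits (16.59 Gbps), squeaks under.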

Because the retina MacBook Pros expose more of these modes than the Mac Pro does, despite far less capable hardware, I think Apple has lots more to show us. I just hope we don’t have to wait until they can do 27″ displays. I’m content with the HiDPI versions of my 24″ LED Cinema Displays (16:10, not 16:9) because I want to fit three on my desk.

Calculations for 4K bitrates at 24 and 30 bits per pixel and 60 Hz.

Hyper-V on VMware Fusion on OS X

The reason I’ve been quiet lately is that I’ve been busy building a Microsoft SharePoint site. Without discussing what’s right or wrong or why, let’s get to how I’m doing this using only Apple hardware, in a way that’s ready to be moved to physical Windows Servers.

Before you get started, there’s a decision you should make up front: will these virtual machines need to be visible outside the network between your Mac and them? If so, make sure you bridge the VMware machine’s network interface to your Mac’s rather than using NAT. The Hyper-V machines will be bridged to this connection either way.

Click “Customize” at the end of the VMware quick install wizard, and put the VM somewhere better than your home folder. Then:

  • Set your startup drive to NOT “Split into 2GB files”.
  • Add a very large second disk. This is where the Hyper-V machines will actually be stored.
  • In the Processor settings, check “Enable Hypervisor Applications”.
  • Also give this machine as much CPU and RAM as you can.
  • Edit the virtual machine’s .vmx file (VMware machines are bundles, like OS X applications) and add the line shown in the snippet below: hypervisor.cpuid.v0 = "FALSE"
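For reference, the relevant .vmx lines end up looking something like this (I believe the vhv.enable line is what the hypervisor-applications checkbox writes, but treat that as an assumption; the second line is the manual addition):

    # Allow nested virtualization (assumed to be written by the GUI checkbox)
    vhv.enable = "TRUE"
    # Added by hand: hides VMware's hypervisor signature from the Windows guest
    hypervisor.cpuid.v0 = "FALSE"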

Run the Windows installer, enable the Hyper-V role, and do Windows Updates. About 90 minutes and 17 reboots later (or just skip Windows Updates if you dare) you’ll be ready to start adding Hyper-V machines. Except for one small problem: that giant second HDD you attached via SCSI (to simulate a RAID controller) isn’t available yet.
Open up a command line and run diskpart (diskpart.exe or DiskPart work too), then:

rem Show the current SAN policy (optional)
san
rem Let Windows bring newly detected disks online automatically
san policy=OnlineAll
rem Identify the new disk, clear its read-only flag, and bring it online
list disk
select disk 1
attributes disk clear readonly
online disk

Your disk may not be disk 1; use whatever makes sense based on the results of list disk.

Running the drive partitioning utility will now prompt you to initialize and format the drive. Once that’s done, you can follow best practices and store Hyper-V machines on a drive that isn’t the startup disk.

iPad Mini Keyboard Update

For the last year I’ve been using the Logitech Ultrathin Keyboard mini with my iPad mini, and while I love it, I decided to get something else that’s basically the same but with illuminated keys (winter is dark).

And although I prefer the space-saving tricks Logitech uses, I acclimated to the new one in about 25 hours, including the stupid placement of the apostrophe. So for now, stupid-generic-but-lights-up wins. Sorry, Logitech.

There’s more to Jersey than the smells of the Turnpike

Why ex-Rutgers receiver Kenny Britt hates New Jersey really rubbed me the wrong way, so I compiled some photos I’ve taken on Rutgers property, right down the road from his former stadium, to illustrate the point that you can’t judge a place by what it’s like to drive past it.

The Rutgers ecological preserve is 370 acres of mostly untouched wilderness (except for the areas that were already developed when Rutgers protected the rest, and things like trail markers and birdhouses). The preserve doesn’t have a parking lot, which probably helps keep it as untouched as it is.

[Photos from the Rutgers ecological preserve]

No use for Turbo Boost, bring on the Cores

Famous 2012 Mac Pro buyer Marco Arment has put together a list of probable Mac Pro CPU options. Marco ends his piece by recommending the 6-core as the probable sweet spot of cost, cores, and turbo. He’s probably right about that. But let’s pretend cost were no object (or that Apple takes higher margins on the low end rather than the high end, or some other crazy nonsense). If we’re just talking about getting the job done with one processor, what kinds of tasks are likely to be hurt by having too many cores and a weaker Turbo Boost?

In my life: none. Even the most out-of-date renderers I still occasionally use (Final Cut Pro 7, other 32-bit renderers) comfortably max out 4 threads when rendering. Everything else (Final Cut Pro X, Premiere CC, other 64-bit renderers) uses at least 8 threads, which is all I can confirm on my dual quad-core 2.8GHz 2008 Mac Pro and a quad-core i7 retina MacBook Pro.

The most recent thing I did that took a while (20 minutes) and couldn’t have been improved by throwing more cores at it also happened to be something a higher turbo wouldn’t have made faster, since it was only using about 10% of the CPU the whole time. It was a command-line PHP script (that’s right) that took 2GB of IIS log files and dumped them into a MySQL table so I could get some analytics out of them. The bottlenecks were mainly disk I/O (even though the logs were on my local SSD) and the minimum time for a SQL INSERT to execute (even with a persistent connection).
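For the curious, the shape of that script: mine was PHP, but here’s the same idea sketched in Python with batched inserts, which would have eased that per-INSERT bottleneck. The table layout and log field positions are made up for illustration, and sqlite3 stands in for MySQL to keep the sketch self-contained:

    import glob
    import sqlite3  # stand-in here; the real script used a persistent MySQL connection

    conn = sqlite3.connect("analytics.db")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS hits "
        "(date TEXT, time TEXT, ip TEXT, uri TEXT, status INTEGER)"
    )

    BATCH = 1000  # batching INSERTs amortizes the per-statement overhead
    rows = []
    for path in glob.glob("logs/*.log"):  # hypothetical log location
        with open(path, errors="replace") as f:
            for line in f:
                if line.startswith("#"):  # skip IIS #Fields / #Date header lines
                    continue
                fields = line.split()
                if len(fields) < 5 or not fields[4].isdigit():
                    continue
                # Assumed field order: date, time, client IP, URI stem, status
                rows.append((fields[0], fields[1], fields[2], fields[3], int(fields[4])))
                if len(rows) >= BATCH:
                    conn.executemany("INSERT INTO hits VALUES (?, ?, ?, ?, ?)", rows)
                    rows.clear()

    if rows:
        conn.executemany("INSERT INTO hits VALUES (?, ?, ?, ?, ?)", rows)
    conn.commit()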

Another reason to prefer more cores, even if slower, is that there’s a lot of crappy software out there that you can’t avoid. For whatever reason, Adobe updaters sometimes block and eat up 100% of a single thread. The more cores you have, the less collateral damage to processes that deserve the CPU. (If Adobe’s never going to get on the App Store train, they could at least use standard Installer.app .pkgs… that would be nice…)

On that note, giving virtual machines less than 100% of your physical cores can help prevent them from locking up your entire system. Because I’m developing an intranet right now, I have multiple virtual machines with various versions of Windows and IE constantly running, each quarantined to one or two cores and maybe a gig of RAM. The intranet itself is hosted by yet another local virtual machine. Although this makes my retina MacBook Pro reach 160°F unless I use smcFanControl and/or a supplemental USB fan, performance never suffers except inside the virtual machines themselves, since each only has so much RAM. RAM, not CPU, is actually the ceiling on how many VMs I can run at a time.
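In .vmx terms, that per-VM quarantine is just a couple of lines per machine (settable through the GUI too; the values here are the hypothetical caps I described above):

    # Cap this VM at two virtual CPUs and one gig of RAM (memsize is in MB)
    numvcpus = "2"
    memsize = "1024"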

When I’m not rendering anything, even with all those virtual machines sitting there serving and receiving HTTP requests, my CPUs are generally at least 90% idle. Whenever I do anything CPU-intensive, however, it’s likely using as many cores as possible, so each core only gets a 133MHz turbo bump, in which case more cores are better. Since I’m only used to maxing out 8 threads, I might be able to squeak by with the quad-core model as long as I can affordably stuff enough RAM into it. Currently, Crucial assumes this 16GB×2 kit will work; it costs $439.99, making it $879.98 for the full 64GB, which IMO is a more worthwhile upgrade than going from 4 to 6 cores for the same price (though I’m sure it’ll be $2,000 at Apple RAM prices).

Interestingly, going from 4 to 6 cores also gets you a GPU bump. If you dream of driving the maximum of three 4K displays and still having enough VRAM available to do more than render the UI, then going with the 6-core just for the presumably cost-effective GPU boost is a no-brainer.

This is a long way of saying that, as expected, Apple has provided a comparably “useless” entry-level Mac Pro alongside one that’s much more cost-effective but probably still needs some BTO options. Now we wait to see how much those options actually cost piecemeal. I still want 8 cores because I like doubling things when I upgrade: my 2008 Mac Pro has two quad-core CPUs without Hyper-Threading (8 threads), so an 8-core Hyper-Threaded CPU would be 16 threads. And there’s no question I need ALL the RAM you can put in it, especially while I’m virtualizing my SharePoint development servers on it.

Whatever the cost, though, if you need the new Mac Pro to do your job and you’re your own boss, don’t forget that development hardware is tax-deductible, and maybe even eligible to be tax-free in your state.