Thursday, December 31, 2009

FIOS: Not yet

I have recently been motivated to pay less for my TV & internet service. This was really driven by the fact that my monthly bill for TV & internet from Cox is $132 and change (roughly $10 of that in taxes).

I would think that FIOS would be a good alternative. They are the upstart, working hard to gain market share and so on, right? Apparently not. I'm not sure why, but they are not willing to provide me a bundle at a lower price than Cox. In fact, when I price it out, the price is exactly the same. Here's the scoop:

FIOS:
The key problem here is that you have to pay a monthly fee for a set-top receiver for every TV you use. That's a problem when you have up to six TVs that you'd want to be able to get service on. The following assumes that we only hook up three TVs: HD on the big TV and in the master bedroom, and SD on one other. This adds $26/month to the cost of the service. Outrageous - especially since Cox doesn't require a box for any connection. And we can get HD on our TVs without any box (so it's not just SD TVs that this works for).

Sadly, this setup doesn't even include a DVR (which we currently have with Cox). Note that I haven't done a channel-by-channel review to see which service carries what. Both include the core channels that I care about: the networks, ESPN, USA and SyFy. I don't really think that there are too many others.

FIOS requires a year-long commitment and will guarantee your price for two years. So the bottom line price (before taxes) is $120 per month.

Here's a picture of how Verizon separates out the various charges:



Cox:
Cox's bill is broken out like the following:


It is worth noting that the Cox bill is before the $12/month discount that Cox extended to me for agreeing to a year-long term (the same terms that Verizon is seeking).

Conclusion:
It's always easier to do nothing (i.e., not change service), and I think that's what I'll be doing. A potentially slightly faster internet connection is not interesting enough to justify the hassle of the switchover.

The $150 incentive that Verizon is throwing out there to switch is not sufficient to cover my switching costs, which I consider to be the following:

  • Overlapping service: I won't switch off my current internet service until I'm convinced that my new service is working reliably and the way that I want.

  • Tearing up my "lawn": OK, so it may not be the greenest and densest lawn in all of Northern Virginia, but I still don't want to have to deal with the installation.

  • Uncertainty: I may have missed something in my analysis. The FIOS service could be lacking in a way that I'm not comfortable with.

  • Flexibility with TVs: with FIOS I have to pay extra for every single additional TV that I might want to hook up. That's not something I'm OK with, even though I have no plans to add more and despite the fact that my kids watch more TV programming on their computers than on TVs.

Tuesday, March 17, 2009

iPhone 3.0: Finally right?

It used to be said that you should never buy a Microsoft product until it reached version 3.0. I think this held true for Windows, but I'm not sure it did for many of their other products (Bob, anyone?). Apple finally announced the feature set of its version 3.0 iPhone software today. Instead of going through the features, I thought I'd opine on what sorts of applications paired with accessories could now be useful. This article inspired me to write. Here are my thoughts:

Workout Monitoring: The combination of a Bluetooth heart-rate monitor (one is coming, see here) and Bluetooth accelerometers (preferably 6-axis, of this size) for both wrists and ankles. All sensors should transmit to the iPhone and should be rechargeable. These sensors, along with some intuitive workout-logging software, would be killer. See my earlier ramblings on the topic.

Receipt Scanner: How cool would it be if you could clip on a stubby little scanner and scan store receipts directly to your iPhone? You would never lose a receipt and could manage/print them through a web-based companion application.

Laser Measurement: Clip a simple laser measurement tool into the bottom of the iPhone and quickly take room measurements while you sketch a map of the room on the screen. Combine with the iPhone's accelerometers and have the software draw the full map for you.

Pool/Hottub Chemistry Testing: Yes, water is not a good friend of the iPhone. But, using the accessory port, make a tool that you dip into your pool or hot tub and it would tell you all of the relevant metrics: chlorine levels, pH, temperature, etc. The application should keep track of your measurements over time (and allow you to track pool and hot tub separately).

"My Life in Words": Track every conversation you have. Record (on a VOX basis), all day long. Recognize those with whom you are having the conversations with and combine with voice recognition software to capture in easily text-readable form. Sync everyday with your computer to limit the data requirements.

Whole-house Intercom: Make each iPhone in the house capable of contacting any computer/VoIP-enabled intercom and let loose (as in, talk to anybody in the defined network).

Bluetooth/WiFi Kitchen Scale: Totally unnecessary, but it would be pretty cool to have a kitchen scale that connected to your iPhone, could walk you through a recipe, and could let you know when you've put enough of each ingredient in the bowl.

Wednesday, March 4, 2009

Some Further Thoughts on TED

So, I've learned a bit more about TED and I owe their designers an apology. They do provide easy access to both MTUs' data; I just had to go down the drop-down box a bit further to find the separate export options. It just turns out that, for me, one MTU has an average measurement of 4 kWh/hr and the other 0.4 kWh/hr, making the split between the two not that interesting to me.

If we pull out the other activities going on, we can isolate the heat pump as a 9 kWh/hr load (wow!). This is the single largest and most notable load in my house and one that needs to be addressed next year.

Also, I learned a bit more about TED in that the Footprints software actually exposes its current readings over HTTP. If you go to http://localhost:9090/DashboardData in a browser on the computer that is running Footprints, you'll get the XML below. If you give that computer a static IP address (reasonably easy to set up on a modern router/wireless access point using the computer's MAC address), you can then pull the information from any computer on your network.

Unfortunately, despite all of the wondrous information (including separate real-time readings from the two different MTUs), they left out the current time. Huh? They included lots of random daily accumulations of data, but not the current timestamp. An odd choice - I'm not quite sure why they would see fit to include the month and year, but not the full timestamp.

This seems well suited to a WAMP setup to capture the data in a much more flexible fashion (though one that's much more likely to fail at some point).

See the full XML below:

<dashboarddata>
<vrmsnowdsp>121.4</vrmsnowdsp>
<daysleftinbillingcycle>28</daysleftinbillingcycle>
<presentspendingperhour>0.00</presentspendingperhour>
<currentrate>0.0000</currentrate>
<lovrmstdy>117.3</lovrmstdy>
<stlovtimtdy>09:20</stlovtimtdy>
<hivrmstdy>124.3</hivrmstdy>
<sthivtimtdy>13:36</sthivtimtdy>
<lovrmsmtd>117.3</lovrmsmtd>
<hivrmsmtd>125.4</hivrmsmtd>
<kwpeaktdy>16.650</kwpeaktdy>
<dlrpeaktdy>0.00</dlrpeaktdy>
<kwpeakmtd>18.390</kwpeakmtd>
<dlrpeakmtd>0.00</dlrpeakmtd>
<watttdysum>3967344</watttdysum>
<kwhmtdcnt>4074.000</kwhmtdcnt>
<lovdaymtd>99</lovdaymtd>
<hivdaymtd>65</hivdaymtd>
<dlrnow>0.00</dlrnow>
<dlrtdy>0.00</dlrtdy>
<dlrmtd>0.00</dlrmtd>
<dlrproj>0.00</dlrproj>
<dlravg>0.00</dlravg>
<kwnow>2.350</kwnow>
<kwtdy>66.1</kwtdy>
<kwproj>3266</kwproj>
<kwmtd>298</kwmtd>
<kwavg>149.1</kwavg>
<co2now>3.65</co2now>
<co2tdy>102.49</co2tdy>
<co2mtd>462.11</co2mtd>
<co2proj>5062.30</co2proj>
<co2avg>231.06</co2avg>
<ledstatus>GREEN</ledstatus>
<buzzerstatus>OFF</buzzerstatus>
<pastmonthlydata>
<monthhistoricaldata>
<month>3</month>
<year>2009</year>
<dlr>0</dlr>
<kwh>26</kwh>
</monthhistoricaldata>
</pastmonthlydata>
<isdualmtu>True</isdualmtu>
<mtu1wattsnow>1.690</mtu1wattsnow>
<mtu2wattsnow>0.650</mtu2wattsnow>
<mtu1co2now>2.62</mtu1co2now>
<mtu2co2now>1.00</mtu2co2now>
<mtu1dlrnow>0.00</mtu1dlrnow>
<mtu2dlrnow>0.00</mtu2dlrnow>
<mtu1vrmsnow>121.4</mtu1vrmsnow>
<mtu2vrmsnow>121.4</mtu2vrmsnow>
<demandusage>8.524</demandusage>
<demandcharge>0.00</demandcharge>
<energycharge>0.00</energycharge>
</dashboarddata>
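
Since the endpoint is plain XML over HTTP, it's easy to log the readings yourself and add the missing timestamp along the way. Below is a minimal polling sketch in R - just an illustration of the idea, not anything TED ships - using the XML package. The tag names (kwnow, vrmsnowdsp, mtu1wattsnow, mtu2wattsnow) come straight from the XML above; "ted-pc" is a made-up hostname, so substitute the static IP of the computer running Footprints.

# Minimal sketch: poll the DashboardData endpoint once a minute and append
# a few readings, plus a local timestamp, to a CSV file.
# "ted-pc" is a placeholder hostname for the machine running Footprints.
library(XML)

url <- "http://ted-pc:9090/DashboardData"
log_file <- "ted_log.csv"

# Pull a single numeric value out of the parsed XML by tag name
grab <- function(doc, tag) as.numeric(xpathSApply(doc, paste("//", tag, sep=""), xmlValue))

repeat {
  doc <- xmlParse(url)   # fetch and parse the XML
  row <- data.frame(
    time     = format(Sys.time(), "%Y-%m-%d %H:%M:%S"),  # the timestamp TED omits
    kw_now   = grab(doc, "kwnow"),
    vrms_now = grab(doc, "vrmsnowdsp"),
    mtu1_kw  = grab(doc, "mtu1wattsnow"),
    mtu2_kw  = grab(doc, "mtu2wattsnow")
  )
  write.table(row, log_file, sep=",", append=TRUE,
              col.names=!file.exists(log_file), row.names=FALSE)
  Sys.sleep(60)          # one reading per minute is plenty for trend analysis
}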

Tuesday, March 3, 2009

TED: Voltage Sag

Not sure if it's meaningful, but there is a pretty strong voltage sag when my house draws a lot of power from the Dominion system:



The R^2 of the relationship is 58% and the T-stat for the KW variable is 218 - I'd consider this significant. This graph was generated in 'R' using the following commands:

# fit the regression, then overlay the line (abline keeps it straight even if KW is unsorted)
fit <- lm(VRMS ~ KW, data=ted)
plot(ted$KW, ted$VRMS, xlim=c(0,20), ylim=c(117,123), pch=3)
abline(fit, col="red")
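
For reference, the R^2 and t-stat quoted above can be read straight off the model summary (using the fit object from the snippet above); this is just standard lm output, nothing TED-specific:

# R-squared and the t value for the KW coefficient are both in the summary
summary(fit)

# Or pull the two numbers out directly
summary(fit)$r.squared
coef(summary(fit))["KW", "t value"]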


In any event, the voltage ranges appear to be well within the current standards (if Wikipedia is correct):
In the United States and Canada, national standards specify that the nominal voltage at the source should be 120 V and allow a range of 114 to 126 V (-5% to +5%). Historically 110, 115 and 117 volts have been used at different times and places in North America. Mains power is sometimes spoken of as "one-ten"; however, 120 is the nominal voltage.

Some Thoughts on TED

Thoughts about the Gadget:

Too Much Juice: TED delivers what it promises, but in a surprisingly non-green way. I say this because the RDU (the "Receiving Data Unit" - the little white box with the LCD display) does not cache any of the data that it collects. Therefore, if you want to be able to analyze anything, you have to keep your computer on and logging data. For as cheap as flash storage is these days, and as compact as these files are, why not throw a couple of GB in there and let the user download to their computer once a week (or live, if they choose)? TED brags that it only consumes 6 watts. Sure it does, but when you add in my computer, you get a number roughly 10x that. So, if the TED designers didn't want to pay for the flash memory, fine - include a USB port that is thumb-drive compatible and keep logging onto that device until it runs out of space.

Weak Software: The Footprints software is pretty much useless at this point. It installed just fine, connected to the RDU (the most important part), and began logging the information. Unfortunately, now I just get a blank white screen with the same symbol my browser gives me when it can't find an image. Impressive. And if I shut it down to troubleshoot, it will lose data. Great.

Stupidly Aggregating Data: The TED 1002 model that I have receives two separate signals from the MTUs in the two separate circuit-breaker panels. Why does the log include only one total consumption number? The whole point of this device is to measure usage so that it can be curbed or eliminated. TED throws away valuable segmentation on where the load is coming from when that information is available for free. An astoundingly bad design choice.

Thoughts about the Data:
My average electricity usage over the displayed time is 5.01 kWh/hr - pretty high. That would mean (if I assume a similar average profile over the course of a month) that I could expect an electric bill of $328.01. This is built up from $0.088/kWh (I believe the current Dominion average rate) * 744 hours in a 31-day month * 5.01 kWh/hr.
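
A quick check of that arithmetic in R:

# projected bill = rate ($/kWh) * hours in a 31-day month * average load (kW)
0.088 * 744 * 5.01   # about $328.01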

Last month (actually January 12 - February 9), I used a total of 7616 kWh, which I find astounding. That equates to roughly a 10.5 kWh/hr average usage rate. Based on what I've seen so far, that basically means that the heat pump (using resistive heat) upstairs ran continuously for the entire month. The heat pump must go!

Further Extensions:
It looks like there has been some hacking on the TED database. This could let me set up a process by which the tick data is uploaded in real time to a local MySQL server, with the data then available for jpgraph display and analysis in 'R'. Sounds like a good project for the kids. . .
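
If I ever get that going, pulling the tick data back into R should be trivial. Here is a rough sketch using the RMySQL package - the database, table and column names below are placeholders, since none of this exists yet:

# Sketch only: read tick data from a hypothetical local MySQL store into R.
# The database ("ted"), table ("ticks") and columns (ts, kw, vrms) are placeholders.
library(RMySQL)

con <- dbConnect(MySQL(), dbname="ted", host="localhost",
                 user="ted", password="secret")
ticks <- dbGetQuery(con, "SELECT ts, kw, vrms FROM ticks ORDER BY ts")
dbDisconnect(con)

# The same sort of usage plot as before, straight from the database
plot(as.POSIXct(ticks$ts), ticks$kw, type="l", col="red",
     xlab="Time", ylab="Usage (kW)")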

First TED Reading

Here is a quick view of my first overnight energy readings from TED:



This was generated in 'R' using the following commands:

# Read in data
ted <- read.csv('/Users/markb/Desktop/Recent_TED.csv')

# Parse TIMESTAMP into a POSIXct date-time
ted$time <- as.POSIXct(strptime(ted$TIMESTAMP, "%m/%d/%Y %I:%M:%S %p"))

# Plot the data
plot(as.POSIXct(ted$time), ted$KW, type="l", bty="L", col="red", ylab="Usage (kW)", xlab="Time (each point is one second)", ylim=c(0,20), main="Electricity Usage Patterns")
abline(h=0)

# show baseline usage
abline(h=mean(ted$KW[ted$KW < 2]), col="gray")

# add voltage information
par(new=TRUE)
plot(as.POSIXct(ted$time), ted$VRMS, axes=FALSE, bty='L', ylab='', xlab='', type="l", col="gray", ylim=c(100,125))
axis(4)
mtext("Voltage (VRMS, in gray)",4)
abline(h=120, col="gray")

# go to bed
abline(v=as.POSIXct("2009-03-02 22:00:00"), lty=2)
# wake up
abline(v=as.POSIXct("2009-03-03 06:00:00"), lty=2)
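
One obvious next step with this data frame is to collapse the one-second ticks into hourly averages to see the load profile more clearly. A quick sketch of the kind of summary I have in mind, using the same ted data frame built above:

# Collapse the per-second ticks into hourly average usage
ted$hour <- format(as.POSIXct(ted$time), "%Y-%m-%d %H:00")
hourly <- tapply(ted$KW, ted$hour, mean)

# Bar chart of the hourly averages
barplot(hourly, las=2, ylab="Average usage (kW)", main="Hourly Average Usage")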

Sunday, March 1, 2009

TED: The Energy Detective

The last two months we've gotten some pretty outrageous electricity bills. My wife's first inclination was that the furnace upstairs was not a propane-fired furnace, but rather a heat pump. It would appear that she is correct.

However, that wasn't enough for me. I wanted to see where we were using our electricity over a longer period of time. To that end, I purchased TED: The Energy Detective. It came on Friday; today I installed it.

Here's the content of the package (not showing the software CD):


The contents of the brown box are just more of the same as what is laid out on the table: two more round clips (called Current Transformers, or CTs) and the Measuring Transmitting Unit (or MTU). The MTU is in a plastic housing, which itself doesn't feel too cheap, but the connector (for the two CTs) is horrible. I had issues on both MTUs getting the CTs to connect - it took way too long and way too much fiddling to get it done. In addition, the second MTU's plastic connector was loose on one side. None of this gives one a whole lot of confidence in the build quality of TED.

Installation:
To install TED, you have to open up your circuit breaker and potentially expose yourself to all sorts of dangerous shock hazards. Anything I write about here is not instructional in nature and none of it should be relied on to do this yourself.

First, let's look at the electricity setup of my house:

There are two parallel 200 Amp circuit-breaker panels next to each other. The loads they serve are obviously different, and here is how they are split up:


The first step in dealing with the breakers is to take off the front panel (there were four screws to do this). You may want to throw the main breaker first - I chose not to because I wanted to do a before-and-after check to ensure my multimeter was working and that I wouldn't fry myself. Specifically, after removing the cover, I checked the voltage from the screw on the topmost circuit breaker to the neutral bar. It read 110 Volts. After I flipped the main circuit breaker (which took a surprising amount of effort), it measured 0 Volts. Here is what I saw once the cover was off:

IMHO, a pretty nicely organized box. At that point, there were several things to do: clip the CTs around the two main wires coming into the box, like so:

The next step is to connect the power for the MTU. This is done by connecting the black and white wires. For the white wire, you loosen one of the neutral-bar screws, push the wire tip in (it is pre-stripped for you) and tighten the screw back down. Ideally, you would install another 15 Amp circuit breaker and connect the black wire to it. In my case, I loosened the screw on an existing 15 Amp breaker and added the black tip. At that point, I turned the breaker back on to make sure the MTU was functional. The little green LED started flashing, which I assumed meant that I was in business.


I went back upstairs to check the TED Receiving Display Unit (RDU) to see if it could read the signal. The instructions told me that because I had the 1002 model (for 200 Amp service), I had to follow the instructions that came with the second MTU. Those instructions were a bit obnoxious (lots of repetitive steps telling you to go back and repeat steps 3 & 4 - really, is ink so expensive that you couldn't print out each step?). I was quickly able to verify that the RDU was reading the signal from the first MTU.

I headed back downstairs to install the second MTU on the second circuit-breaker panel. After doing that, I added the second MTU's signal to the RDU and I was in business reading both signals. It was pretty obvious that it was, in fact, reading both: the baseline power consumption on one was only 1.2 kWh/hr, and when I added the second, it jumped up to ~5 kWh/hr.

At this point, I am in data-collection mode. I would like to wait until I have enough data to start doing some interesting analysis. I bought the TED Footprints software with the model, which is absurdly Windows-only. I'll report later on how well it works in identifying usage (primarily opportunities to reduce usage). Already, we've seen the instantaneous usage move from 2.4 kWh/hr to 11 kWh/hr. Looks like we may have room to reduce some usage.

Sunday, February 22, 2009

Geotagging in iPhoto '09

So today I was able to figure out how to merge the GPS track from my Garmin Forerunner 405 watch with my photostream. It wasn't hard; it just required some poking around on the internet for the right software and some trial and error in the process. I'm sure that there is an easier way, but this is the way that I made it work. So here's what I did:

1) Take photos; record GPS track.
2) Download GPS track to Garmin Training Center
3) Export TCX file to EveryTrail and create a trip
4) Download the GPX file from EveryTrail
5) Load new GPX file into GPSPhotoLinker
6) Connect camera to Mac, but hold off on importing photos to iPhoto
7) Drop the pictures that are still on the camera into GPSPhotoLinker. (Note: it turns out my camera's clock was off by an hour; GPSPhotoLinker had an easy way of shifting everything by an hour and then matching it up with the GPS track.)
8) Manually link and save GPS coordinates of pictures.
9) Import pictures into iPhoto
10) Enjoy "Places" - it all worked.

A few notes:
- This would be much easier if Garmin allowed you to easily control the kind of file you export from their software. So far, I haven't found any options that would let me do so, though without the heartrate information, the PC version spat out a GPX file directly.
- iPhoto should be doing this directly: hello, anybody home at Apple? I thought they were all about making things easier. GPX is pretty universal.
- This is too cumbersome to worry about past photos, but I will definitely be adding this information to future streams. I will also be wearing the GPS watch on other photo expeditions (around DC or any other places that I'm taking pictures).

Tuesday, February 17, 2009

"Faces" are good - but I want people

Just got the new iLife '09 package from Apple. It's a fine package, and I do like the addition of "Places" and "Faces" in iPhoto. I was inspired to write this post by some limitations that I see in the latter, but decided that I should really take the discussion to the perhaps absurd, but complete, view of the metadata that I would like.

Let's take stock of how we are doing with metadata for photos. Here's my list of desired metadata and my view of where we currently stand (ordered from easiest to address and most useful, to hardest and most obscure):

1) When: this was the first and easiest issue for people to both understand and deal with - it's even available in the file properties as the creation date. This has, almost by default, become the most common and useful way of organizing large photo libraries. The "granddaddy" of all metadata for your pictures.

2) Picture Orientation: this one is dealt with in Exif data, but incredibly, many point-and-shoot cameras still do not provide this information. My DSLRs do just fine - it's not a big deal, but it sure is an annoyance to rotate all your pictures when you import. This also represents the category of "data lost": the information was available for "free" if the hardware had the wherewithal to capture it.

3) Why (event): iPhoto introduced this a few versions ago. It seems to be a useful way of grouping photos that span a medium stretch of time (e.g., "John & Deb's Wedding Weekend"), but a bad way of trying to implement tags. Overall, reasonably useful, but I believe implemented in a very proprietary manner on the whole.

4) Where: Geotagging - good stuff, but not quite ubiquitous. I would like to see some super-easy-to-use software to merge trail information with a photostream (I've read that it exists, but haven't been able to find it). This should be built into the likes of iPhoto.

5) Who's in the picture: "Faces" (in iPhoto) is great, but I have tons of pictures that are full of other body parts, and I would like a fast way of tagging all of my historic and new pictures with that information. There seem to be plenty of ways to use pattern recognition (not just of faces) to make this super easy. For instance, when I take several pictures of people just moments apart, it is probably a reasonable assumption that they have the same clothes on (not true for all of us, but for most of us). Why not use this information (and that of the human form) along with the genius facial-recognition software to make this a "Who", not just a "Face"?

6) How (exposure, etc.): Nicely dealt with (for the most part), with Exif data and potentially XMP (if anybody other than Adobe ever supports that). Mostly us photo geeks like this stuff - but eventually most people might care.

7) Who's taking the picture: This comes from a strong bias in the photostream at my house - I'm almost always the one taking the picture, not the one in it. It's not that important, but come on, let's collect this information!

8) Camera Orientation (i.e., elevation and compass direction): Location is good, but location plus camera elevation and camera direction completely define the source point of the picture. How cool would it be to see a virtual panorama on g-maps, built on the fly by the mob?

9) Atmospheric Conditions: I think it'd be useful to capture all of the atmospheric conditions (weather information) present when the picture was taken. This is much more of an issue for outside photography, but the first layer of information would be whether the picture was taken inside or outside. Then, why not store temperature, humidity, cloud cover, precipitation, wind speed, etc.? Most of that information could be merged in at a later time from outside sources based on location and timestamps.

10) Light augmentation: This is one for the photography geek in me. I'd like to be able to track all manner of light augmentation/modification: flashes, filters and reflectors. Useful information, though most of its usefulness would probably be in the professional/studio setting. Exif also deals with this to a limited extent, but it is not generally captured well.

11) Spoken commentary: How cool would it be if you could tag your photos with a few comments about what was going on? Entry should be facilitated by the camera, but there will need to be a universal way of linking the audio file to the picture file for this to be effective. Speech recognition would eventually catch up for regular folks, giving us a treasure trove of useful information in our photo archives.

Finally, I believe that all of this information should be stored in the photo files themselves in an open format. I'm not too happy with the fact that I'm a hostage to all of the tags that I've spent time on in iPhoto. Faces and Places just make this dependency even deeper. My only hope is that some day there is a way out (somebody figures out how to write an export routine).