Studio711

Washington Ornaments

Evergreen Lutheran High School in Tacoma has a fundraiser auction every year. Every year I think about making something and then fail to do so. This year I made it under the deadline by about two hours.

I only had five days to whip something up so I decided to completely steal an idea from Nick at 6_8woodworks, and make some ornaments out of laminated scraps. Thankfully I had enough interesting pieces of the right sizes to make a few at the same time.

Since I have that shiny new CNC sitting there, I whipped up a drawing and was able to cut out 3 identical ornaments relatively easily. The only real trick was making the cut into Puget Sound wide enough for my 1/8″ bit to get in there. As I cut each one, I rotated my stock piece to get a slightly different pattern on each one. I finished them off with a bunch of sanding, boiled linseed oil, and some twine through a hole so they can hang on a Christmas tree.

I don’t expect these to raise a huge amount of money but it will be fun to see other people put a price tag on my woodworking. I almost exclusively make things for myself or as gifts so there’s no real price tag involved.

Patent Application

Azure Data Explorer has made a dramatic impact on my career. It has inspired a whole new breed of data engineering and it feels like a wide open playground for ideas and innovation. There were so many new ideas and patterns floating around in my head that I decided to attempt the patent process (through work) for one of them. I’ve never been through it before and it was interesting to see all the different levels of scrutiny and checks that go into it before you even sit down with a lawyer to start drafting the application.

I’m thrilled to announce that I’ve completed all of that work and my patent application has been submitted! Unfortunately… I’ve been advised not to share the details of it yet. After about 18 months, the US Patent Office will publish the application. At that point it will be public information on their site but it will still take another 2-3 years from that point for them to review it and either approve it or ask for some more information.

So I guess the point of this post is to say that I’m really excited about applying for my first patent. Even if it doesn’t get approved, it’s neat to see how the process works and it has me thinking whether or not other ideas are patentable too.

Computer Upgrades

It dawned on me recently that my main home desktop is coming up on seven years old. SEVEN YEARS. I used to be happy if I got four years out of a computer, and here I am at seven years without any real reason to upgrade. I took a look at CPU benchmarks, and stuff in my price range would only be a ~30% improvement over what I have now. Increases in RAM speed and major increases in SSD technology would definitely give me a boost, but I can't say that I'd notice it much with my use case. I love getting new computer gear, but I think it's going to be a while before that happens again.

This seems like a good excuse to update my computer ownership history though. The ones in italics are still in use.

  • 1998 – Gateway Pentium 2 350 with a 10GB hard drive and a tape backup.
  • 2002 – Dell P4 2.4GHz with 512MB RAM and an 80GB hard drive. $900
  • 2006 – Dell Core 2 Duo E6600 2.4GHz with 2GB RAM and a 250GB hard drive. $1200
  • 2010 – Core i7 860 2.8GHz quad core with 8GB RAM. $1000 (Replaced motherboard and CPU in 2014 for $260)
  • 2011 – Lenovo Thinkpad Edge $700
  • 2012 – Core i7 3770 3.4GHz quad core with 16GB RAM. $1400.
  • 2013 – HP Pavilion Touchsmart 15-b154nr AMD A8-4555M quad core 1.6GHZ and 6 GB of RAM. $550
  • 2015 – Dell XPS 13. $800
  • 2016 – Intel Core i3-6100 CPU with 8GB RAM. $360

I suspect that the next thing we’ll replace is the laptop only because that gets more abuse than the desktop machines. I’ve been very happy with the XPS 13 though. It has held up much longer than our previous laptops and isn’t showing any signs of impending doom.

Snow Storm Recap

We survived Snowmaggedon 2019! February was the third coldest February on record in Seattle and all that cold weather meant that our normally wet weather ended up being snow. The snow kept coming and coming over many days and schools were closed for the majority of two straight weeks. The official total ended up being around 20″ which puts it just behind the 2008 storm and almost double the 2012 storm.

Thankfully it didn’t have a huge effect on us other than some canceled school days. We had plenty of food in the house and we were able to restock easily by walking down to Safeway. I kept waiting for the power to go out towards the end of the storm cycle when we had a really heavy snow, but it never went out for more than a few seconds. I guess I’ll have to keep waiting to use my fancy transfer switch that lets me plug the generator right into the electrical panel.

Elijah LOVED all the snow. He spent a ton of time outside with Tyla sledding down the street, building forts in the snow piles and playing with all the neighbor kids. A huge thanks goes out to Tyla for all the time she spent playing with him!

Analyzing Water Data in Azure Data Explorer

One of my favorite systems at work officially launched a couple weeks ago as Azure Data Explorer (internally called Kusto). I’ve been doing some blogging for their team on their Tech Community site. You can see all my posts on my profile page. This post will use Azure Data Explorer too but I thought it fit better on this blog.

A year or two ago, our local water company replaced all of the meters with digital, cellular meters. I immediately asked if that meant we’d get access to more data and they said it was coming in the future. The future is now! If you happen to live in Woodinville, you can get connected with these instructions.

The site is nice and lets you see charts, but by now you probably know that I love collecting data about random things, so I immediately tried to figure out how to download the raw data. The only download their site directly supports is the bi-monthly usage from the bills, but from the charts, I could see that hourly data was available somewhere. A little spelunking in the Chrome dev tools revealed the right REST endpoint to call to get a big JSON array full of the water usage for every hour in the last ~11 months.
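I haven't shared the actual endpoint or its exact response format here, but assuming it returns an array of hourly readings with a timestamp and a gallon count (the field names below are placeholders, not the water company's real schema), a minimal Python sketch for flattening that JSON into a CSV ready for ingestion might look like:

```python
import csv
import io
import json

def water_json_to_csv(raw_json: str) -> str:
    """Flatten an array of hourly readings into CSV text.

    Assumes each element looks like {"timestamp": "...", "gallons": N};
    the real field names from the utility's endpoint may differ.
    """
    rows = json.loads(raw_json)
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["Timestamp", "Gallons"])
    for row in rows:
        writer.writerow([row["timestamp"], row["gallons"]])
    return out.getvalue()

# Hypothetical sample of what one element of the response might look like
sample = '[{"timestamp": "2018-11-26T06:00:00Z", "gallons": 20}]'
print(water_json_to_csv(sample))
```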

I pulled that into Azure Data Explorer and started querying to see what I could learn. This first chart shows the median water usage by three hour chunks of the day. Tyla and I usually both shower in the morning so it makes sense that 6-9am has the heaviest usage.

WaterUsage
| summarize 
    sum(Gallons)
    by Hour=bin(hourofday(Timestamp), 3), bin(Timestamp, 1d)
| summarize percentile(sum_Gallons, 50) by Hour
| render columnchart with (title = 'Median Water Usage by 3 Hour Bin', legend = hidden)
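For anyone without a Kusto cluster handy, the same two-step aggregation (sum the gallons into day/3-hour buckets, then take the median of those daily sums per bucket) can be sketched in plain Python; the sample readings are made up:

```python
from collections import defaultdict
from datetime import datetime
from statistics import median

def median_usage_by_3h_bin(readings):
    """readings: iterable of (timestamp, gallons) hourly tuples.

    First sums gallons into (date, 3-hour-bin) buckets, then takes the
    median of those daily sums within each bin -- mirroring the two
    `summarize` steps in the Kusto query above.
    """
    daily = defaultdict(float)  # (date, bin_start_hour) -> gallons that day
    for ts, gallons in readings:
        bin_start = (ts.hour // 3) * 3
        daily[(ts.date(), bin_start)] += gallons
    by_bin = defaultdict(list)  # bin_start_hour -> list of daily sums
    for (_, bin_start), total in daily.items():
        by_bin[bin_start].append(total)
    return {h: median(v) for h, v in sorted(by_bin.items())}

readings = [
    (datetime(2019, 1, 1, 7), 30), (datetime(2019, 1, 1, 8), 10),
    (datetime(2019, 1, 2, 6), 20),
]
print(median_usage_by_3h_bin(readings))  # day sums 40 and 20 -> median 30
```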

I feel like there’s probably a better way to write the next query, but this works. It’s the cumulative usage throughout each month. The four lines at the top of the chart are the summer months when I’m using the irrigation in the yard. The lines that drop off at the end of the chart are there because I ran the x axis all the way from 1 to 31 for every month, so shorter months run out of data, but it still conveys the general idea. It’s interesting how similar all the non-watering months are.

union
(
    WaterUsage
    | summarize Gallons=sum(Gallons) by bin(Timestamp, 1d)
    | extend Month=monthofyear(Timestamp), Day = dayofmonth(Timestamp)
),
(
    // Original data had some missing rows
    datatable(Timestamp:datetime, Gallons:long, Month:long, Day:long)
    [
        datetime(2018-11-26T00:00:00.0000000Z), 0, 11, 26, 
        datetime(2018-11-27T00:00:00.0000000Z), 0, 11, 27, 
    ]
)
| order by Timestamp asc
| serialize MonthlyWater=row_cumsum(Gallons, Month != prev(Month))
| project Month, Day, MonthlyWater
| make-series sum(MonthlyWater) on Day from 1 to 32 step 1 by Month
| render linechart with (ycolumns = sum_MonthlyWater, series = Day, Month, legend=hidden, title='Cumulative Gallons By Month')
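The core trick in the query above is `row_cumsum` with a restart condition that fires whenever the month changes. A Python sketch of that running total, using made-up numbers, looks like:

```python
def cumulative_by_month(daily):
    """daily: list of (month, day, gallons) tuples, sorted by date.

    Running total of gallons that resets at each month boundary,
    mirroring row_cumsum(Gallons, Month != prev(Month)) in Kusto.
    """
    out = []
    running = 0
    prev_month = None
    for month, day, gallons in daily:
        if month != prev_month:
            running = 0  # restart the cumulative sum for a new month
        running += gallons
        out.append((month, day, running))
        prev_month = month
    return out

rows = [(11, 29, 50), (11, 30, 40), (12, 1, 30), (12, 2, 20)]
print(cumulative_by_month(rows))
# [(11, 29, 50), (11, 30, 90), (12, 1, 30), (12, 2, 50)]
```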

The data is in 10 gallon increments so it’s not super precise but it’s a LOT better than the two month resolution I had previously. I’m excited to play around with this data and see if we can start decreasing our usage.

Along these same lines, I heard that the local power company is starting to install power meters with Zigbee connectivity so there’s a chance that I’ll be able to start getting more insight into my power consumption in a similar fashion…