Wine Cooler Power Supply

I’m just going to blog about this one briefly, in case there are any other enterprising wine cooler owners out there who find that, two years into owning their cheapo wine cooler, the power supply craps out. I didn’t take pictures all the way through, so it won’t be much of a step-by-step build log.

A couple years ago I bought an “AZ-EA45EC-75” on Amazon. If you read reviews for these wine coolers, you WILL notice that there are more complaints about the cheaper ones breaking down after a year or two (often out of warranty). But I figured the price differential was worth it, especially if I got one that was cooled via Peltier modules (thermoelectric coolers), since I’d have a chance at repairing it if something went out.

Sure enough, a few months ago, the power supply went out. I can troubleshoot basic electronics, but troubleshooting a switching power supply (which this thing had) is a bit beyond me. I did try replacing some Darlington transistors that showed a lot of heat discoloration on the circuit board, thinking they might have burned out (one was not very well secured to the large heat sink). No go. Maybe there was a cascade failure or something…I think maybe the winding on one of the very specialized transformers was also out. I even succeeded in tracking down the Chinese manufacturer of the power supply (the website was printed on it), and emailed them to see if I could just buy one unit. After a couple tries, I actually got a response (in broken English) that I think said I could send it to them and they would repair it for something like $80 and send it back…with me paying shipping both ways. No thanks.

In pulling the thing apart, I learned that the cooling system consists of the switching power supply, two Peltier modules – each with an outside and an inside fan that look like typical PC power supply fans – some kind of internal temp sensor, and a little control board that includes the temp control buttons and the current temp readout. Oh, and a little LED light array inside that you can turn on and off with a button on the control panel. They all connect to the power supply. The Peltier modules and fans are just 12V-powered, but clearly there is some logic somewhere to throttle how much power goes to them, since it would work harder when trying to cool down, but be nearly silent when the internal temperature matched what you set the controller to.

The controller board had a 5-pin connector, and it was actually labeled: 12V, Gnd, LED, NTC, and PWM. I guessed (correctly) that LED controlled power to the light – I think it was just wired straight through on the power supply, since 5V regulated came out on that pin, and the LED array required 5V to light up. PWM was interesting – my guess (again, correct) was that they pulse-width modulated the power to the two sets of Peltier modules and fans. To verify this, I finally broke down (after 30+ years!) and bought an inexpensive USB-based oscilloscope. I bought a SainSmart DDS-120 for about 150 bucks and it was totally adequate for my needs…and now I finally have at least a cheap o-scope!

Anyway, using the o-scope, I verified that there was a 5V square wave coming out of the PWM pin. Actually, first I had to figure out what “NTC” was. Initially, providing 12V to the controller made the LED readout light up fine (and the LED button controlled whether 5V came out on that pin or not), but the readout would flash a few times, then show “E1”. I thought about it a while and figured maybe it needed to sense a temperature to work. Turns out, it’s common to sense temperature with an NTC (negative temperature coefficient) thermistor :-) I wired a resistor in the 10K range across the NTC pin and Gnd and – sure enough – the readout would show 59°. When I turned the temp down below 59, the “off” period of the PWM square wave got longer – when I set the temp over 59, the “off” period got shorter. So, the “off” time of the PWM theoretically drives the power to the Peltier/fan clusters…kind of an inverse relationship…not a big deal, since I could easily invert that signal if necessary.

I started casting about for an easy way to drive a fair amount of power via the 5V logic signal from the PWM pin – from googling around, I estimated that driving a Peltier module and a couple of fans at full power could eat an amp or more. I prefer to find pre-assembled boards for more complicated circuits, since assembling a lot of discrete components can be pretty error-prone, plus I’m just not that great at linear circuit design. After looking around a while, I decided on the Monster Moto Shield. Intended to drive robot motors, this sucker can sink a lot of current. As the name implies, it’s intended to be used with an Arduino, but I just hard-wired the control pins to the appropriate logic levels and tied the PWM pins to the PWM out from the fridge controller (after going through a couple of inverter components, described below).

I should say, in between all this – which took place over a week or two – I temporarily rigged up one Peltier/fan set to just run continuously off a 12V laptop power supply I had lying around. This actually worked pretty well. Since I was just running one, it seemed to keep the cooler chilled, but not too cold. I even considered making it more permanent, just buttoning it up and blowing off temp control, but that would have been too easy 🙂

Here’s a rough Fritzing diagram of the whole rig:

WineFridgeFritzing_bb

I used a simple NPN transistor to invert the signal coming from the controller board, tying the PWM inputs of the two Monster Moto channels to the “high side” (collector) of the transistor. So, when the signal coming out of the controller is low, the driver side is high, and vice versa.

Update (4/28/16): As I tried this rig out hooked up, I discovered that it had trouble getting the temp below about 58°. This seemed to be due to the fact that the PWM duty cycle never got above about 60%. I added a low-side resistor on the transistor emitter to bias the output higher, then a diode and a capacitor which effectively “store” some of the “on” energy, while the diode prevents leakage backwards. This smoothed out the output, but also biased it slightly higher, resulting in more time above the Moto shield’s PWM trigger level. I basically picked the values by experimenting – in fact, I’m not even positive of the cap value – I just found one in my tray of caps that was in between doing almost nothing and causing the PWM signal to be basically always-on.

I bought a cheap 12V switching power supply from China, capable of delivering multiple amps. Here’s a picture of the whole thing rigged together temporarily. It looks like a rat’s nest, but it will look a lot better when I consolidate all this into just the power supply, the Moto shield, and a little daughter board:

ratsnest

The Moto shield gets pretty hot when it’s really cooling, so I stuck a couple of little heat sinks I had onto the two VNH2SP30s, and even they seemed inadequate, hence the extra aluminum foil 🙂 For a more permanent solution, I’ve ordered a half dozen 20mm x 20mm heat sinks from Amazon that should arrive in a couple of days.

Total cost for all this – about $50. That doesn’t include the o-scope, but everyone deserves an o-scope! And of course, time…I spent a lot of that reverse engineering and tinkering with getting the input levels right on the Moto pins. Definitely a cheap fix, PLUS now I have something that’s much more maintainable, since I basically put the whole power circuit together myself.


Lightsuit V2.0 (part 2)

In a previous post, I provided an overview of the “next generation” of my Christmas Light Suit (and various ‘peripherals’) 🙂 Really, it’s more like “generation 5 or 6,” since the Light Suit has been through many iterations, starting with just a couple strings of lights powered by a big ole 12V UPS battery…

 

By this point, I had an Arduino-powered “Master” controller, which had 3 or 4 simple, pre-programmed sequences I could run, or all-on or all-off. My suit lights were driven directly by the Arduino through a 4-port triac board, and other pins of the Arduino were connected to an XBee transmitter set up in “virtual wiring” mode, so the XBee receivers in the Cape, LakewayLightGuy’s suit, and the two Staffs echoed the pin outputs from the Arduino, effectively synchronizing all devices. I experimented a bit with sound synchronization, but the ambient noise tended to be so high in most places we went that it wasn’t really noticeable. There’s a lot more detail starting in this old post.

A big change was moving to ESP8266-based wireless ‘peripherals’ instead of XBee (although I also kept the XBee for now, for backwards compatibility). Although this means there’s a bit of a lag between sampling the octave levels of my sound source and actually getting a signal to the controllers, there are several advantages.

Independently-Addressable RGB LEDS

As I mentioned last post, this year I cranked it all up quite a bit, and it got even more complicated, but ultimately I hope it gets a bit simpler. For example, switching from commercial Christmas light strings to strings of WS2811 LEDs meant I had to incorporate a board that supported the WS2812 protocol, but I got rid of the inverter necessary to boost 12V to 120VAC and the triac board necessary to do the switching. I also get 256 levels each of RGB intensity, addressable per bulb! By the way, WS2811 and WS2812 use the same protocol…just google for the many discussions of the difference.

The ESP8266 supports the WS2812 protocol (I originally used this library, but it’s now also available via this online NodeMCU firmware build tool), so I can send RGB bytes of data over WiFi and have multi-color, varying-brightness, synchronized devices. Now, instead of just using XBee “virtual wiring” to echo the on/off state of GPIO pins, I send packets of 3 bytes each to set the RGB values of LEDs on synchronized devices. Actually, packets are 4 bytes – I use a one-byte header to ensure receivers are in sync as they read bytes, although I think the UDP packets are received ‘atomically’, so this may not be necessary. Right now, I just send one color, but I could easily make my over-the-air protocol more sophisticated and send longer strings of ‘colors’. Receivers (‘slaves’) can either use PWM to just adjust the intensity of attached LEDs, or use the WS2812 library to send data over a GPIO pin to one or more addressable LEDs.
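To make the packet handling concrete, here’s a minimal sketch of the slave-side NodeMCU Lua (not my exact code – the 0xAA header byte, port 5000, SSID, and WS2812 data pin are all placeholders, and the ws2812 call signature has changed between firmware releases, so check the build you’ve flashed):

-- slave sketch: listen for 4-byte <header, R, G, B> UDP broadcasts and push them to a WS2812 string
wifi.setmode(wifi.STATION)
wifi.sta.config("LightSuitAP", "password")     -- join the master's access point

srv = net.createServer(net.UDP)
srv:on("receive", function(sock, data)
    if string.len(data) == 4 and string.byte(data, 1) == 0xAA then
        local r = string.byte(data, 2)
        local g = string.byte(data, 3)
        local b = string.byte(data, 4)
        -- GPIO2 is pin 4 in NodeMCU's pin map; older builds use writergb(pin, data)
        ws2812.writergb(4, string.char(r, g, b))
    end
end)
srv:listen(5000)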

Inexpensive, Prolific Peripheral Devices

Another BIG reason I wanted to get away from XBees – they’re expensive! Like, $20/ea minimum. With ESP-01s running as low as a couple USD from China, you can build synchronized slave ‘peripherals’, like LED hats and batons, for less than $10. I got a few LED novelties from Windy City Novelties, and by just adding an ESP-01, a little CR123A lithium battery, a really simple socket, a cheap switch, and wiring an output to the existing LEDs, I can have a synchronized Elf hat, for example. In this case, I just used GPIO2, in PWM mode, wired directly to the existing LEDs in the ornaments of this Elf hat.
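For a PWM-only peripheral like the hat, the receive side is even simpler – something along these lines (again just a sketch; pin 4 is GPIO2 in NodeMCU’s pin map, and averaging the three color bytes into a single brightness is my own simplification here, not necessarily what the hat runs):

-- set up PWM on GPIO2 (NodeMCU pin 4): 500 Hz, duty range 0..1023
pwm.setup(4, 500, 0)
pwm.start(4)
-- then, inside the same UDP "receive" callback shown in the earlier sketch:
--   local bright = math.floor((r + g + b) / 3)   -- collapse RGB to a 0..255 brightness
--   pwm.setduty(4, bright * 4)                   -- scale to the 0..1023 duty range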

But since the ESP8266 also supports the WS2812 protocol, by adding a few WS2811s and wiring GPIO2 as the data pin, I was able to have a multi-color “wand” that pulsed in sequence with other devices.

IMG_20151204_083121373

WS2812 strip

In the future, I’ll incorporate WS2812s into more accessories, like Santa hats. I bought a whole strip of them, so on a piece-by-piece basis, they are pretty cheap.

It’s also worth noting that my suit itself is now a “slave” – the “master” control unit is completely independent of the suit-driving unit, which is also built around an ESP8266 (in this case, an ESPToy, which has built-in 3.3V regulation and plenty of GPIO pins). This way, I don’t have to carry around as much hardware if I don’t care about sound and synchronized patterns across devices. Or I can even flash the suit unit as a “master” that just does simple (non-sound-responsive) patterns, since the difference between a UDP packet sender or receiver, Access Point or Station, is totally software-driven. Here’s a schematic of the suit driver:

SuitController

Note that this circuit includes a binary-coded rotary switch. I included this specifically so I could load more complex code which can behave differently depending on the switch setting – acting as a slave in one position, or as a master with one of a couple simple pre-programmed patterns in the others.
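As a sketch of what I mean (the GPIO choices here are illustrative, not necessarily what’s in the schematic above), the code just reads the two bits of the switch at boot and branches on the result:

-- read a 2-bit binary-coded rotary switch on two inputs at boot (pin numbers are placeholders)
gpio.mode(5, gpio.INPUT, gpio.PULLUP)
gpio.mode(6, gpio.INPUT, gpio.PULLUP)
local mode = gpio.read(5) * 2 + gpio.read(6)    -- 0..3
if mode == 0 then
    dofile("slave.lua")                  -- act as a UDP/WS2812 slave
else
    dofile("pattern" .. mode .. ".lua")  -- run one of the canned, pre-programmed patterns
end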

 

Power

Using the WS2811s dropped my power requirement down to 5V, but this introduced its own complication – I had to add regulators to drop the voltage down to 5V. Even then, I wound up using TWO regulators, both with heat sinks added to them, and they were still getting excessively hot, as were the packs of 10 AA-sized LiPos I was using for power. After the fact, I experimentally tried dropping the battery supply down to 6V and voila! Not only did nothing get hot, the whole suit ran for about an hour and a half on a single set of 10 batteries (rigged as 2 x 5 in parallel). This is much better, and I’m still not entirely sure why. The buck regulators I use are supposed to be very efficient, but I suspect that may not be the case under high current draw, and maybe as they start getting hot. In any case, for future holidays, I’ll be putting together 6V packs.
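To put rough, back-of-the-envelope numbers on it (estimates, not measurements): a buck regulator running at 85% efficiency while delivering 3A at 5V (15W) has to dump about 15 × 0.15 / 0.85 ≈ 2.6W as heat; if its efficiency sags to 70% when it’s hot and heavily loaded, that loss climbs to roughly 6.4W per regulator – which would go a long way toward explaining the temperatures I was seeing, and is consistent with the smaller 6V-to-5V step running cool.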

 

More on Sound

There are a couple more aspects of the sound circuitry, beyond what I talked about in Part 1, that I should probably mention. Here, again, is a block diagram, and also a schematic, of the “master” controller.

schematic_audio

Teensy Audio Adapter schematic

I mentioned that the Teensy was really convenient for the complex audio manipulation I wanted – specifically FFT octave analysis and some audio delay, so that the audio out would be in sync with packets received by the slaves, which lagged by about 1/4 second due to WiFi propagation delay. One snag I ran into was that the RAM on board the Teensy is only sufficient for an audio delay of about 180ms – I actually needed about 250-300ms. The ADC and DAC pins also require a bit of supporting circuitry – to level-adjust the input voltage of the Analog-In and to smooth the Digital-Out to an analog waveform, respectively. While the inventor offers an Audio Adapter that I believe addresses both the RAM issue and the input/output conversions, it would have been yet another board; plus, you still have to solder a RAM chip on to get the expansion, and it would have cost another $20 or so, all in. The ADC and DAC example circuitry is provided in the online, graphical audio patching tool when you choose those nodes. It’s pretty straightforward, so I just built it in myself. I had also corresponded briefly with the owner on his forum and he had suggested adding a RAM chip directly. I wasn’t sure where to start with that, but, upon inspecting the RAM portion of the schematic published for the adapter, I realized that it’s just an 8-pin chip (available in a DIP package) connected to four pins of the Teensy, plus VCC and Gnd. I just added a DIP socket to my perfboard layout and wired it in – it worked on the first try on my protoboard! All in all, I really like the Teensy/ESP8266 combination, especially for the kind of complex, audio-related code I needed.
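For scale (rough arithmetic, not from the datasheet): the audio library streams 16-bit samples at about 44.1 kHz, so the 23LC1024’s 1 Mbit (128 KB) of SPI SRAM buys roughly 128 × 1024 / (44,100 × 2) ≈ 1.5 seconds of buffered audio – far more delay headroom than the 250-300ms I actually needed.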

Here again is the final “master” assembly, consisting of the Teensy for audio sampling and delay, passing 3 bytes, representing 3 octave ranges, to the ESP8266, which is responsible for communicating with the “slaves”. The Teensy also provides basic I/O pin-based signals to the XBee for backwards compatibility with my original cape and staffs, and with LakewayLightGuy’s original “secondary”, triac-driven suit.

It’s worth pointing out that I did make a minor improvement to the XBee setup. I converted the cape over to an ESP8266 receiver, did some surgery on the 110V light strings inside it to make many parallel segments of fewer LEDs in series (about 15), and then used Buck/Boost regulators to boost 12V to 30V, and added a couple of Darlington transistors to drive the 30V segments. These I drive via the PWM pins from the XBee, instead of just on/off IO pins. If you look closely at the above schematic, you’ll see I’ve wired both kinds of pins to the Teensy. I also changed the bases of the transistors in the staffs to be driven by the PWM pins from the XBees. So, depending on the software I flash to the Teensy and the configuration I flash to the XBees, I can still go with the old, on/off style of synchronized lights and pre-programmed patterns, or I can drive the PWMs with 1024 levels of brightness, providing a much better “pulsing” responsiveness. That’s a great thing in general about this controller – now that it’s all assembled, it’s largely software-configurable!

I’ll conclude this long and rambling blog entry with a link to a little video of the modified cape, along with a modified staff on the left:

Capture



Using ADC with the ESP8266

The ESP8266 family of boards still amazes me with their size and cost-effectiveness, but I will confess: the dearth of official information for some of the more esoteric features can be annoying.

For my remote garage-door sensor, I wanted to determine the angle of the sensor using a 3-axis accelerometer. The docs don’t really tell you much, but they do mention that the ADC pin can be used to measure either the battery voltage or an external voltage – you have to choose which one. I didn’t see this at first (I’d googled elsewhere and found an example), but I did learn from googling that the external voltage you’re measuring has to be between 0 and 1 volt. I think you may be safe going up to 3.3V, but it will read a value of 1024 for everything over 1V.

OK, no problem – I added a couple resistors as a voltage divider (see post on garage door sensor for schematic) and I started getting reasonable values instead of nothing but 1024.
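In NodeMCU terms, the read itself is trivial – the work is in scaling the reading back up through the divider ratio. Something like this (a sketch; the resistor values are placeholders, and the millivolt math is written with integers so it works on either a float or integer Lua build):

-- read the external voltage on the ADC pin and scale it back up through the divider
local R1, R2 = 220000, 100000            -- top and bottom of the divider, in ohms (placeholders)
local raw   = adc.read(0)                -- 0..1024, where 1024 is roughly 1.0V at the pin
local mv    = raw * 1000 / 1024          -- millivolts at the ADC pin
local mv_in = mv * (R1 + R2) / R2        -- millivolts at the top of the divider
print("adc raw " .. raw .. " = about " .. mv_in .. " mV in")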

Now comes the weird part.

After putting everything together more permanently for “field testing”, I was suddenly getting nothing but 65535! So, more googling, and I came across posts like this one describing how you have to define the correct app/include/user_config.h before building your firmware (guess I should have read the docs :-)). Well, I enjoy hacking with these suckers, but my enjoyment ends at building the firmware using build chains. Fortunately, there’s an online tool for building custom firmwares. This is what I had done, since I needed MQTT support. Otherwise, I just stick with downloading the latest pre-built firmware. The posts mention modifying the byte at position 107 in the esp_init_data.bin that is generated from the build as another option – presumably this is generated separately by the build tool.

So, apparently, the firmware I’m using is set to report the battery voltage. I changed my adc.read(0) calls to readvdd33() and started getting values that varied slightly. This is weird, because I SWEAR it worked – using the same firmware – when I had it breadboarded, but it WAS a different chip, so I guess somehow that value was set differently, despite using the same ROM image. I had generated the ROM using the online tool, so I’m guessing the generated ROM does not extend into, or include, this chunk of init-data space. I got the pre-built esp_init_data_default.bin from the source repository and used a hex editor to change the byte at position 107 to 0x33 (I had to search around to figure out that value, too – leave it at zero for battery sensing). You can grab my pre-edited file from here.

Lastly, there was some ambiguity around where to flash this to. If you have a 512KB flash, you flash it to 0x7C000. I’m using an ESP-12, and that didn’t work. Turns out ESP-12s have a 4MB flash. For those, it needs to be flashed to 0x3FC000. I just used the ESP8266Flasher.exe tool to flash the file to the correct location:

Capture

After flashing the file to that location, my 0-to-1024 external values were back🙂.


Remote ESP8266-based Garage Door sensor

I’ve got a bunch of z-wave light switches and plugs in my house, and I use OpenHab as my centralized home automation system. I have a detached garage, and neighbors have experienced occasional break-ins in similar garages where people stole tools and whatnot. In fact, a couple years ago, the padlock on the sliding lock was broken (yes, sadly, I do not have a garage-door opener…really embarrassing for a home-automation guy…but I’ve got this really funky pivoting metal door that reminds me of an aircraft hangar or something. Anyway, suffice it to say, motor-driven opening would be challenging.) On closer inspection, I realized it was because someone had been hammering on it…I guess one night while we were out of town or something, because hammering a lock on that metal door would be loud.

Now, one problem is, I’m pretty forgetful, so there have certainly been times when I forgot to lock it, or even forgot to close it. Since I already had z-wave going on, I figured I’d try a z-wave tilt switch. I set up OpenHab so that when the garage door state changes, I get a text. It also reminds me after sundown if the door is still open. This kind of worked, but z-wave is not guaranteed delivery, and I think the distance to the garage from the nearest node, combined with the RF-blocking metal door, resulted in a pretty poor success rate reporting the ‘closed’ state (the tilt sensor fires once, then sleeps).

So, I decided I could probably do better using an ESP8266 sensor I built myself.

My first attempt involved setting up the ESP (an ESP-01) to leverage the deep sleep option: wake up every 60 seconds, check the state of an input connected to a mercury switch, and when the state had changed, connect to my WiFi network and send a message to my OpenHab server via MQTT. I should note that, since deep sleep loses the current run-state (it really just wakes up and starts init.lua from the beginning), I used a Real Time Clock (RTC) storage register to store the “last known state”…that’s all documented here and there on the web. This worked really well, but I only got about six weeks of battery life out of a 3.7V LiPo battery…really too short to be acceptable – I needed to figure out a better design.

I figured a better approach was one where the sensor is actually powered off most of the time, then powers on, connects to WiFi, and sends a message when the mercury tilt switch closes, then fires another message when the door closes. The problem is – if I just wire the power through the mercury tilt switch, then when the door closes, power is lost and the ESP can’t send the “closed” message. I tried a big capacitor supplying some power for a bit after the tilt switch opens, but even a big-honkin’ one didn’t provide enough juice to run for more than a fraction of a second.
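The arithmetic bears that out (illustrative numbers, not measurements): a WiFi associate-plus-publish cycle takes a couple of seconds at an average draw on the order of 100-200mA, while even a hefty 4700µF capacitor allowed to droop by a full volt only holds about 4700µF × 1V ≈ 4.7mC of usable charge – at 150mA that’s roughly 30ms, two orders of magnitude short.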

What I wound up doing was using the mercury switch to supply power when the door opens – so on power-up, it sends the “open” message – plus a parallel power path which uses a P-channel MOSFET to keep supplying power after the door closes. The gate of the MOSFET is connected to a GPIO pin, so the ESP actually powers itself off once it sends the “closed” message by setting the GPIO HIGH, thus turning off the MOSFET. Since the ESP-12 operates at 3.3V, you need a logic-level MOSFET; I used an NDP6020P. Here is an example of a circuit which I kind of used as the basis for this secondary power path:

relay-mosfet-p-channel-switch8

The at-rest current leakage on MOSFETs is supposed to be really low, so I think I’ll get really good battery life with this approach. I didn’t worry about the diode, and an ESP-12 is basically in place of the relay.

I did need one additional sensor, since I couldn’t think of a simple way to have the mercury switch act as both a power-supply switch AND a GPIO input. I had a couple of 3-axis accelerometers around, so I used one of those, with one axis tied to the ADC pin on the ESP-12 through a voltage divider that brings the 1-to-3.3V output down to the 0-to-1V range the ESP ADC pin expects.
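Roughly, the logic on the ESP looks like this (a sketch, not my verbatim code – the broker address, topic, GPIO numbers, and ADC threshold are all placeholders):

-- on power-up the mercury switch has just closed, so the door is open
gpio.mode(5, gpio.OUTPUT)                        -- pin 5 (placeholder) drives the P-MOSFET gate
gpio.write(5, gpio.LOW)                          -- hold our own power on
wifi.setmode(wifi.STATION)
wifi.sta.config("myssid", "mypassword")

local m = mqtt.Client("garagedoor", 120)

local function watch_for_close(client)
    tmr.alarm(2, 2000, 1, function()             -- poll the accelerometer axis every 2 s
        if adc.read(0) < 400 then                -- 'closed' threshold is illustrative
            tmr.stop(2)
            client:publish("garage/door", "CLOSED", 0, 0, function()
                gpio.write(5, gpio.HIGH)         -- release the MOSFET: we power off right here
            end)
        end
    end)
end

tmr.alarm(1, 1000, 1, function()                 -- wait for an IP, then talk to the broker
    if wifi.sta.getip() then
        tmr.stop(1)
        m:connect("192.168.1.10", 1883, 0, function(client)
            client:publish("garage/door", "OPEN", 0, 0)   -- we only boot when the door opens
            watch_for_close(client)
        end)
    end
end)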

Here are a Fritzing diagram and a schematic of the prototype:

GarageFritzing

GarageSchematic

I’ll deploy it shortly and provide an update on the reliability and battery life in the near future.


NodeMCU Custom Builds for ESP8266

OK, just a quick blurb in case anyone else runs into this incredibly frustrating experience:

I’m playing with some sensors where I’m interested in really low power consumption. Basically, I want to go into deep sleep, which only draws <100uA, and wake up every minute or so, check an input, and report it to an MQTT broker if the state has changed. I also want to read from a DHT11 temp/humidity sensor every 30 minutes or so and report the values, regardless of whether the GPIO changed. I’m mainly working with NodeMCU right now, rather than “native” SDK code.

SO…the ESP8266 has a “deep sleep” mode, invoked with node.dsleep(us), which really just shuts off everything but the Real Time Clock (RTC); the RTC pulses XPD_DCDC (GPIO16) low briefly when the timer expires. Tie that signal to RST, and the CPU will re-initialize when the timer expires and run your code from boot-up. NodeMCU now has nifty rtctime and rtcmem modules that let you read the Real Time Clock value (like right after you wake up) and use 128 words of space in the RTC storage to persist info between wakeups.

OK, cool, but after flashing various versions of NodeMCU to my test chips, I get nothing but the following errors, presented here for search engines to hit, whenever I try to access the rtctime or rtcmem functions:

“attempt to index global 'rtctime' (a nil value)” or “attempt to index global 'rtcmem' (a nil value)”

So, my suspicion was that the base NodeMCU firmwares are not being built with the relatively new RTC modules included. BINGO! After building a custom firmware that includes the modules, these functions are available. I will say, however, that the “rtctime” module doesn’t appear to work as advertised. The docs seem to imply you will get the time, including the time that has elapsed while in deep sleep. Currently, I’m not seeing this behavior…it seems to be reinitialized at every boot. Persisting values via “rtcmem” is working well, though, so I can just do a simple modulus on an incrementing counter I store on every one-minute “wake” to send temp data every 30 minutes or so.
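Here’s the shape of that wake-up logic, as a sketch (the rtcmem slot, pin number, and intervals are placeholders, and dht.read11 assumes the dht module was included in the custom build):

-- runs from the top of init.lua on every RTC wake-up (roughly once a minute)
local wakes = rtcmem.read32(10)                     -- slot 10 is arbitrary; garbage on first power-up
if wakes < 0 or wakes > 100000 then wakes = 0 end   -- crude guard against an uninitialized slot
rtcmem.write32(10, wakes + 1)

if wakes % 30 == 0 then
    -- every ~30th wake, read the DHT11 (data line on NodeMCU pin 2 here) and report it too
    local status, temp, humi = dht.read11(2)
    -- ...connect to WiFi and publish the GPIO state plus temp/humi over MQTT...
end

node.dsleep(60 * 1000000)                           -- back to deep sleep for 60 s; GPIO16 is tied to RST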

This being said, a quick shout out to the excellent site, http://nodemcu-build.com/ 

This site lets you build a custom NodeMCU firmware image off the current master (or dev) branch and include modules optionally, as you see fit. Check it out and donate (as I did) if you find it useful. It’s SO much easier than screwing around with build chains – just select the modules you want and click a button – you get an email when your build is ready (almost immediately for me)…and give this guy a few bucks; he deserves it for saving you the time!


LightSuit V2.0

Well, it’s well past Christmas and things have finally settled down for me a bit, so I figure it’s high time I blog in a bit of detail about my main Christmas technology focus – Version 2 of the AustinLightGuy Light Suit. I’ll post links to resources I incorporated this year at the bottom of this post.

I’ve been wanting to do a major upgrade on the original light suit for some time with these main goals in mind:

  • Upgrade the strands of commercial LED Christmas lights to strings of fully-addressable RGB lights
  • Have cooler built-in patterns and improve the responsiveness of the lights to music when going in “synchronized” mode – both for the suit and synchronized devices.
  • Have MANY, inexpensive synchronized light devices
  • Crank Up the Sound!

I’ll address the last first, since it was the easiest…

Cranking Up the Sound

Turns out most places we go to in our light suits are actually pretty noisy – there are usually lots of people around talking and maybe even other music sources…Christmas music on the PA around the Zilker Tree and Trail of Lights (GREAT job on the Trail of Lights this year, by the way, despite modest entry fees – kudos to the organizers and sponsors), music in bars, etc. It would be cool if the Suit could be synchronized to ambient music, but it’s usually kind of drowned out by crowd noise, so I figured I’d try something else – crank up the volume on the suit.

suitspeakers

The original solution, which was just a crummy little set of Bluetooth speakers that paired to my phone, was way too quiet. So this year, I just got a set of relatively high-powered PC speakers, rigged them up to where they could be snapped onto the new suit, and drove them directly from the line-level audio (which is actually a pass-thru on the new master controller that I’ll describe shortly). They run off the same 12V “power bus” that the whole suit runs on (more on that later). I wrapped them in nice, fluffy faux fur to make them Christmassy.

These pretty much rocked, but in bars, they still got drowned out a lot of the time, so I wound up going with the (single, for now) pre-programmed pattern.

2.0

Ok, so now on to the interesting technical stuff…

I wanted to be able to synchronize even more “remote devices” this year – beyond the two staffs, the light cape, and LakewayLightGuy’s suit. But I also wanted to retain backwards compatibility with those devices so I didn’t have to completely rebuild them as well. It just so happens that I had recently been turned on to the ESP8266 family of devices. These are basically amazingly cheap SOC modules that contain a processor, flash memory, and even WiFi, all in really small form factors. There are really bare-bones boards (like the ESP-01) incorporating this chipset that can be had for as low as a couple bucks from China, up to more sophisticated boards, like the NodeMCU dev board and the EspToy, which provide good on-board power regulation, USB serial-to-TTL circuitry (necessary to flash them), and LiPo battery charging. There’s a huge community of support out there for these things, two or three tool chains, lots of code libraries, and many excellent blogs, including Pete’s excellent tech.scargill.net. The incredibly low price of these things (especially vs. the pretty expensive and proprietary XBee modules I’ve been using) led me down the path of deciding I should migrate to WiFi-based synchronization using ESP8266-based boards. I played around with a myriad of them, but wound up using a NodeMCU, an EspToy, and multiple ESP-01s.

Latency Problems

To test the WiFi concept, I cobbled together a prototype setup where I took the existing Arduino suit controller and had it feed RGB values via serial port to a NodeMCU dev board (the “master”). I wrote some simple Lua code to set the device up as an Access Point, then send UDP broadcast packets out every time it received a set of RGB values over the serial port. I also created a “slave” by flashing a basic ESP-01 with a version of the firmware that supports WS2811/12 (standard in the latest build) and wrote some simple Lua code (based in part on an example here) to connect to the Access Point and listen for UDP broadcasts. When it got a UDP packet, it would pull the RGB bytes out and write them to a short string of WS2812s I was experimenting with (similar to this). The result was pretty good – although the packet rate is a little low, I think probably around 10Hz, so 10 updates a second, or maybe a tad lower. I tried using my home WiFi instead of setting up my master as an access point, but that was actually slower. I really still need to try recreating the code using the native SDK and C (or preferably, an Arduino sketch) to see if I get better packet rates.
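For reference, the master side of that prototype amounted to something like this (a simplified sketch, not my verbatim code – the SSID, port, one-byte sync header, and the assumption that exactly three raw RGB bytes arrive per serial message are all placeholders/assumptions):

-- master sketch: act as an access point and rebroadcast RGB triples arriving on the UART
wifi.setmode(wifi.SOFTAP)
wifi.ap.config({ssid = "LightSuitAP", pwd = "password"})

local udp = net.createConnection(net.UDP, 0)
udp:connect(5000, "255.255.255.255")        -- UDP broadcast to all listening slaves on port 5000

-- fire the callback every time 3 bytes have arrived from the Arduino on the serial port
uart.on("data", 3, function(rgb)
    udp:send(string.char(0xAA) .. rgb)      -- prepend a one-byte sync header
end, 0)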

The real problem is – there is some latency between when a sound occurs and when the data packet makes it across the serial port, then over WiFi, to the “slave”. The delay is in the neighborhood of half a second. As a result, the lights are synced to the music, but the response lags by about half a second. The delay is more obvious when you have it next to one of my XBee “slave” devices, since those respond with almost no lag at all 😦

My first thought was that I could introduce a hardware chip that does audio delay – I did something with a “bucket brigade” chip years ago – or maybe just buy an off-the-shelf delay circuit. I didn’t really find anything I liked much off the shelf…mostly guitar-oriented stuff that was pretty pricey (and big). Then I ran across Teensy! The Teensy (3.2) is a little dev board with your usual array of I/O pins, but it’s got a 72MHz ARM processor as well as ADC (analog-to-digital) and DAC (digital-to-analog) ports. You basically write Arduino sketches and use the “Teensyduino loader” to load your code onto the Teensy. Its creator, Paul Stoffregen, seems to be pretty into audio or something, because he’s got a stupendous audio library and he provides a really cool online tool (built on Node-RED) to wire audio flows together graphically, then export code. There are simple examples for wiring line-in and line-out to the ADC and DAC in the help that accompanies the nodes.

Here’s an example, taking an ADC input, piping it through a delay, then feeding the result to the DAC:

Then you just add little bits of code to do stuff like set the delay on channel 0:

delay1.delay(0, 300); // set channel 0 of the exported AudioEffectDelay object (named delay1 here) to a 300ms delay

Since the Teensy has plenty of horsepower, it can also be my master processor, replacing the Arduino. And the Teensy also does FFT, so by using it as my master processor, I no longer need the MSGEQ7 hardware to do octave analysis like in V1!

That’s a lot of stuff going on in my new master controller, so I also decided I would drive my suit – which used to be connected directly to the triac board in my master controller – via a new “slave”, which would just house an ESP8266 as a UDP receiver and WS2811 driver, and a 12V-to-5V regulator to drop my suit power supply down to the 5V the WS2811 strings want.

So here’s the high-level picture:

Bigpicture

And here’s the new master and new suit slave:

Complicated, I know.  In the next post, I’ll go into a little more detail on some of this, some of the cool things it let me do, and some additional complications I ran into.

 

Resources

Here are links to stuff that went into this build. Many of them link to Amazon, eBay, or AliExpress, so if they go stale, I apologize. I’ll try to make the descriptions usefully google-able.

Creative Inspire T12 2.0 amplified speakers: http://amzn.com/B0028N6YH0
Teensy 3.2 dev board (72Mhz ARM processor): http://www.pjrc.com/store/teensy32.html
23LC1024 SRAM chip: http://www.mouser.com/ProductDetail/Microchip-Technology/23LC1024-I-P
ESP8266 devices:
* NodeMCU Dev Board: http://nodemcu.com/index_en.html
* ESPToy 1.21: http://rayshobby.net/introducing-esptoy-1-21-with-lipo-charger/
* A bunch of plain old ESP-01’s: http://r.ebay.com/fG130b
XBee Modules: http://www.mouser.com/ProductDetail/Digi-International/XB24-AWI-001
Some cheap, little heat sinks: http://www.aliexpress.com/snapshot/6955208975.html
WS2811 RGB LED strings: http://www.aliexpress.com/snapshot/6890770046.html
2596 Buck Converter step-down regulators: http://amzn.com/B018JPGKQK
Buck/Boost Power Regulators (boost from 5v to 30V): http://amzn.com/B00WMB6OMI

 


Building another (better) PiCam security camera

I blogged briefly about building a security camera back in a previous post. It worked so well I decided to build another, but with a few improvements based on past experience…

I used this WiFi adapter (from Amazon): TRENDnet Wireless N 150 Mbps Mini USB 2.0 Adapter, TEW-649UB. It’s supported out of the box by the OS, PLUS you can easily pop open the housing, and there’s a standard (tiny) WiFi antenna connector on the circuit board. This let me connect a full external antenna I had from dismantling an old router or something.

I followed this original Instructable up to step 6:
Instead of Motion, I use VLC to stream the video.
I created two scripts in my /home/pi directory…
start-vlc.sh:
#! /bin/sh
sudo raspivid -o - -t 0 -n -w 800 -h 480 -fps 6 | cvlc -vvv stream:///dev/stdin --sout '#rtp{sdp=rtsp://:8160/}' :demux=h264 :h264-fps=6 2>&1 | tee /var/log/vlc.log &
 
restart-vlc.sh:
#! /bin/sh
echo "$(date) - stopping vlc" >> /var/log/vlcstart.log
sudo ps -ef | grep -e 'vlc \-I' | grep -v grep | awk '{print $2}' | xargs -i kill {} >/dev/null 2>&1 || echo "vlc-streamer already stopped"
sleep 3
sudo raspivid -o - -t 0 -n -w 800 -h 480 -fps 6 | cvlc -vvv stream:///dev/stdin --sout '#rtp{sdp=rtsp://:8160/}' :demux=h264 :h264-fps=6 > /var/log/vlc.log 2>&1 &
 
I changed the run permissions on both files:
chmod 755 start-vlc.sh
chmod 755 restart-vlc.sh
Both scripts start the VLC streamer streaming H.264 800×480 video over RTSP at about 6 frames per second. That can all be changed – for example, you could use a different encoding…there are many examples for using different options on the internets. RTP/RTSP is fast, but be aware that one downside is that it is UDP only, so it doesn’t work if you want to stream outside your firewall. I found I can easily go faster on the frames per second – just make sure to change both the “-fps n” and “:h264-fps=n” parameters to match. When I tried larger frame sizes, though (like 1024×768), I kept seeing it start to rebuffer. It would continue to do that, each time taking longer and longer until it became unworkable…not sure exactly what was going on there, but 800×480 doesn’t do it, so I went with that.
I created a script named vlc-streamer in /etc/init.d for starting and stopping the streamer as a service:
#! /bin/sh
# /etc/init.d/vlc-streamer
 
# Carry out specific functions when asked to by the system
case "$1" in
  start)
    echo "Starting vlc"
    # run application you want to start
    echo "$(date) - starting vlc" >> /var/log/vlcstart.log
    sudo -u pi /home/pi/start-vlc.sh &> /var/log/vlc.log
    ;;
  stop)
    echo "Stopping vlc"
    # kill application you want to stop
    echo "$(date) - stopping vlc" >> /var/log/vlcstart.log
    sudo ps -ef | grep -e 'vlc \-I' | grep -v grep | awk '{print $2}' | xargs kill -9
    ;;
 
  restart)
    echo "Stopping vlc"
    echo "$(date) - stopping vlc" >> /var/log/vlcstart.log
    sudo ps -ef | grep -e 'vlc \-I' | grep -v grep | awk '{print $2}' | xargs -i kill {} >/dev/null 2>&1 || echo "vlc-streamer already stopped"
    sleep 3
    echo "Starting vlc"
    echo "$(date) - starting vlc" >> /var/log/vlcstart.log
    sudo -u pi /home/pi/start-vlc.sh &> /var/log/vlc.log
    ;;
  *)
    echo "Usage: /etc/init.d/vlc-streamer {start|stop|restart}"
    exit 1
    ;;
esac
exit 0

And finally, I added the startup to /etc/rc.local so that the service will start up automatically when the system starts:

# Start camera streaming
service vlc-streamer start
I set up the camera in iSpy using a custom profile: configure it as an FFMPEG (H264) camera with an rtsp:// URL pointing to your camera’s IP address, using the UDP protocol:
Image
Another deviation from the article – on my second camera, I wanted a real, durable, weatherproof enclosure, not the fake plastic kind of thing they use in the article. I used this Evertech B00LU2NPVS enclosure, again from Amazon, which, for $45, gave me a full IP66 weatherproof enclosure, plus built-in IR LEDs and a fan. I just glued the camera to the rubber diaphragm and the circuit board to the included mounting plate with some silicone caulk.
Note that this housing is actually QUITE large compared to the crummy little consumer WiFi camera enclosures. In fact, it was really kind of unnecessarily big for the PiCam – almost 18″ long. I ended up cutting about 4 or 5 inches off the tail end with a hacksaw, which actually worked pretty well. Because of that, I had to drill a new access hole in the bottom for the power cord. I also drilled a small hole in the back to mount the WiFi antenna connector I had scavenged.

A note about power: the IR LED array and fan take 12V. The housing comes with a convenient DC “bullet” plug that connects to the IR assembly, and there is another pair of wires that comes off the IR assembly to supply your camera, but they supply the same 12V. I connected these to an adjustable buck regulator – a really convenient and efficient regulator that costs about 5 bucks on Amazon and can take a variety of input voltages and output whatever you adjust it to…in my case 5V. I then fed the 5V into the USB power connector on the Pi by cutting up a USB cable and splicing it in.

 

02/16/2016 Update:

To improve stability a bit more, I tweaked the restart scripts (already reflected in the code, above). I also changed the output redirection, so it doesn’t spit stuff out to the console and redirects it to vlc.log properly.

Next, I created a script to check that the WiFi is connected, and force an attempt to reconnect if not. Then I set that up to run every five minutes via cron.

check-wifi.sh:
#! /bin/sh
count=$(ping -c 4 192.168.2.1 | grep 'received' | awk -F',' '{ print $2 }' | awk '{ print $1 }')
if [ $count -eq 0 ]; then
# 100% failed
echo "Restarting wifi"
sudo /sbin/ifdown wlan0
sleep 10
sudo /sbin/ifup --force wlan0
else
echo "Wifi looks good!"
fi
 
/etc/crontab:
# Check WiFi connectivity every five minutes
*/5 * * * * root /home/pi/check-wifi.sh

 

Finally, I created a monitor in iSpy, so that if the connection is lost and the reconnect fails, iSpy will launch a script to automatically try to restart the service.

First, I set up my PC so I can ssh to the PiCam without entering a password. After installing Cygwin and making sure the cygwin\bin directory is in my default system PATH, I set up an encrypted keypair using this example on StackOverflow: http://stackoverflow.com/questions/16928004/how-to-enter-ssh-password-using-bash

Then I created a batch file to call the service restart command remotely and registered it as the “Reconnect Failed” Action in iSpy:

restartPiCam2CLV.bat:
rem try 5 times
set /A count = 5 
:DoWhile
 if %count% == 0 goto EndDoWhile
  set /A count = %count% -1
  ssh pi@192.168.x.yyy sudo service vlc-streamer restart
  if %errorlevel% == 0 goto EndDoWhile
  if %count% GTR 0 goto DoWhile
:EndDoWhile
exit /B
Image1

Failed connection Action

If the connection goes away and the automatic reconnect fails, iSpy will invoke the script.

Easy, Cheap Connector for ESP-01

One thing that’s been pretty annoying playing with the ESP-01 variant of the ESP8266 modules is the pinout form factor. The 2×4 pin configuration does not plug into a prototyping breadboard conveniently, and it also doesn’t lend itself very well to connectors you might use in a more permanent solution – I was ganging two pieces of Arduino headers, using several female-ended jumper wires, or gluing together multiple female ends to make a 2×8 connector…and that kind of thing.

While browsing parts at Fry’s, a really simple solution occurred to me. I got some plain old 0.1″ pin-spacing ribbon cable headers and a piece of ribbon cable. You could also use the prototyping ribbon cables that have male pins at the end for plugging into breadboards. (Sorry for the blurry pics.)

IMG_20151204_080211499

 

These work great, and cost maybe 50 cents each to make (max). Note they only had 20-pin connectors, so I only use the 8 on one side, and they only had 9-conductor cable, so I got a length of that and peeled off one conductor 🙂

IMG_20151204_080244917


Lessons Learned with ESP-201

I recently ordered several more ESP8266 modules, and of that set, I decided to get a couple of ESP-201s. These were appealing because they’re still fairly bare-bones (and therefore extremely cheap – about US$4 each), lacking an on-board regulator or serial chip, but they also bring out a lot more IO lines to pin pads, and they seemed more convenient for plugging into a prototyping breadboard.

UPDATE: As noted on another blog, “Compared to ESP-12 this module has 6 more pins broken out – D0, D1, D2, D3, CLK, CMD (GPIO6-GPIO11). But only two of them (D2, D3) can be used as regular GPIOs after slight hardware modification.” Yuck! Looking forward to the ESP32!

I encountered a couple of issues that took a fair amount of googling to figure out, so I figured I’d pass them on.

First, as others have pointed out, the pin labels are on the underside, which is really annoying. Like most ESP8266s, you have to pull CH_EN high, and to flash it, you have to pull GPIO0 low. You ALSO have to pull GPIO15 LOW, though, which is different from other ESP variants. Once I pulled GPIO15 low, GPIO2 high, and GPIO0 low, I was able to flash the module.

However…I was flashing a NodeMCU image onto them, intending to keep them consistent with the other modules I’ve been playing with for my Light Suit V2 project. But it turns out, the ESP-201 is specifically designed to work with the SDK and it does NOT work with NodeMCU flash images. You pretty much need to use the SDK or, as I do, use the Arduino plugin to handle the compiling and flashing for you.

CORRECTION: The ESP-201 WORKS FINE with NodeMCU – you just have to have the pin levels right at boot to get it into a mode where you can load .lua files and run on boot. Specifically, pull CH_EN high, as you normally do, and pull their pin “IO15” (really pin 13 on the chip, I guess?) LOW. To flash, pull GPIO0 low, as usual. This wound up working, and now I’m much happier 🙂
ESP201_Front
esp-201_pinout

The Arduino plugin makes coding with ESPs really easy, since it builds and flashes your code to the module in one easy step without requiring you to go learn complex code build chain crap. Once I switched to that environment, the usual process of pulling GPIO0 low, followed by programming, worked well. I haven’t really done much with it yet, but I presume it will work pretty much like any other variant using the Arduino toolset.


Can’t upload new LUA files once ESP8266 init.lua starts

I’ve been having this frustrating problem with my NodeMCU projects on the cheaper ESP8266 devices: once you get into a loop in your code, or worse yet, if you have an error that causes a reboot loop, there’s no way to upload a new .lua file short of re-flashing – a major pain!

Well, I finally found a nice solution on Big Dan’s blog. Thanks to Big Dan, a simple code addition for the startup of your app in init.lua gives you time to interrupt it using a simple “snippet” assigned to a button in NodeMCU.

Add some code like this to your init.lua:

function startup()
    if abort == true then
        print('startup aborted')
        return
    end
    print('in startup')
    dofile('test.lua')
end

abort = false
tmr.alarm(0, 5000, 0, startup)

Then edit the code behind a “snippet” button to execute “abort=true”. Now you just have to click that snippet button before the 5 seconds expire at startup time to abort starting your code. Many thanks, Big Dan!
