K40 gets some long overdue upgrades and testing

K40_overdue_updates_head

I have had some upgrades for the K40 laser cutter sitting on the shelf for a long time. Some have been around since I originally got the machine, but I wanted to get all the life I could from the stock parts. I finally installed them.

The upgrades.

K40_overdue_updates_lenses
  • LightObject air assist head
  • 18mm ZnSe 50.8mm FL lens
  • Fresh 20mm primary mirror
  • 5MP USB camera for LightBurn

Just a short list, but these parts make a big difference over the stock 12mm lens and head. I installed them along with the 5MP camera, which pairs with LightBurn to assist with object positioning.

If you are not running LightBurn software with your laser cutter, you should be. It is bar none the best, with a very responsive development team. I am a big supporter of open source, but I pay for the license to support those folks. It is worth it, and my contribution via renewal barely buys them beers. You cannot go wrong for the price and what you get; they read their forum and answer questions.

The great folks at LightBurn also make the 3D model for the 5MP camera holder available, and will sell you a printed one from their store if you don't have a printer. It was a bit too tall for the LightObject air assist head, so I had to go low profile with this model from PrusaPrinters.org. I'm still fiddling with the knobs on the calibration, but visual location of cuts is going to be a game changer. I can cut small objects from scraps now.

Testing the new stuff

Prior to the upgrades, my K40 had issues cutting living hinges. It may have been my settings or my focal length. However, the new air assist head and Z-table combined to make a great hinge with only minor scorching, which was my big issue before. Charcoal does not flex well. Big shout out to DenzilMakes.com for the great test file box; you should check out his very nice work. There are lots of cool designs on the site. It made the job look easy because he did all the hard work. There are even assembly instructions available via his blog.

Sneak peek for the future

I did this long overdue blog update while waiting for a fresh compile of OpenCV. I have been wanting to use Open Computer Vision to convert stereo images to point clouds for a long while, for 3D modeling of real life. As a cheapskate, I could not justify buying a 3D scanner or stereo camera. What to do? Check the parts drawer for what I have. A plain Raspberry Pi Zero and a Raspberry Pi Zero W had been gifted to me years ago by a friend who wanted more power. I also had some “Original” V1 camera modules; well, maybe they are knockoffs, but I cannot tell. The final physical piece was an OTG data transfer cable for cell phones. This will provide better resolution than my failed ESP32-CAM attempt. I really should document failures too.

I got the basic Pi on the network by connecting it to the Pi Zero W via the USB Ethernet gadget. I followed Adafruit's guide for the most part. I had to do static IP addressing on the usb0 network on both of the Pis, plus a little iptables forwarding. Note: do not forget to enable forwarding in /etc/sysctl.conf. After fighting with the camera cables a bit, I got everything working.
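For the record, the gadget-network plumbing boiled down to something like this (the 192.168.7.x addresses and the wlan0 uplink are my assumptions; adjust for your setup):

```shell
# On the Pi Zero W, which has the Wi-Fi uplink and acts as the gateway:
sudo ip addr add 192.168.7.1/24 dev usb0     # static IP on the USB gadget link
sudo sysctl -w net.ipv4.ip_forward=1         # runtime toggle; persist it in /etc/sysctl.conf
sudo iptables -t nat -A POSTROUTING -o wlan0 -j MASQUERADE
sudo iptables -A FORWARD -i usb0 -o wlan0 -j ACCEPT
sudo iptables -A FORWARD -i wlan0 -o usb0 -m state --state RELATED,ESTABLISHED -j ACCEPT

# On the plain Pi Zero at the other end of the gadget cable:
sudo ip addr add 192.168.7.2/24 dev usb0
sudo ip route add default via 192.168.7.1
```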

I then used Maxim Devaev of PiKVM's great µStreamer to make the cameras accessible on the network for stream capture. This low-latency streamer is just the ticket for the lightweight Pis, but full frame rate video may be a bit much for the available memory. Single image capture, however, is very responsive at full resolution.
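The invocation was along these lines (device path, port, and addresses are from memory, so treat them as placeholders):

```shell
# Serve the camera over HTTP; the V1 modules top out at 2592x1944 stills.
ustreamer --device=/dev/video0 --host=0.0.0.0 --port=8080 \
          --resolution=2592x1944 --quality=80

# Grab a single full-resolution frame from another machine:
curl -o left.jpg http://192.168.7.2:8080/snapshot
```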

I worked on my industrial design: a FreeCAD parametric model of a “case” for everything, and it only took three revisions to get it usable. The cameras are 66mm apart, roughly normal human eye spacing, but the design will support up to 110mm for better stereo vision. I didn't cut out the lens opening for the wider setting, but the design is parametric, so it's one edit on the spreadsheet, a re-export, and a conversion of the STL into an SVG for the laser cutter. Props to Tinkercad for the easy STL-to-SVG conversion via import and export.

I hope to update once I get some point clouds captured and converted to meshes. I plan to spawn the mesh processing onto the ARM bigLITTLE cluster, which recently got an OS upgrade. End of word vomit… I saw a great maker's site that is short, sweet, and to the point. I should really copy that style: “raw data for raw nerves.”

Ansible Tower for the ARM cluster

I had been using Ansible from the command line interface to ensure all my nodes were installed the same. I have to give credit to companies that provide a license that is good for testing and development of their centralized management tools. While the license is limited, it is great for testing on my ARM bigLITTLE cluster.

“Self-support trial license that will not expire. Does not include features in Standard and Premium Ansible Tower, such as LDAP and Active Directory support, system tracking, audit trails and surveys.”

So I spun up an x86 CentOS 7 VMware VM to have a compatible platform to run Ansible Tower; the basic install is done and tested. Now to begin configuring Projects, Inventories, and Roles so that deployment and management of my nodes can be automated.
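In the meantime, the CLI workflow that Tower will wrap looks like this (the inventory and playbook names are placeholders):

```shell
# Sanity-check that every node in the inventory answers:
ansible all -i hosts.ini -m ping

# Apply a playbook to just the ARM nodes:
ansible-playbook -i hosts.ini site.yml --limit arm_nodes
```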

Docker swarm on Odroid XU4Q

Sometimes the easiest way to learn new things is monkey see, monkey do. So I started with the Docker Swarm guide and updated it for my ARM bigLITTLE cluster.

The only major update from the MC1 guide is that the busybox image was no longer on GitHub, or would not run, so I used a different image. It works as expected: if one of the nodes goes down, the “httpd” web service spawns on a new node and starts responding to requests. I also changed the port for “Visualizer”, as I had a conflict with another project running on the cloudshell2 data node of my cluster.
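Standing the two services back up looks roughly like this (the ports and constraints are from memory, not the exact commands I ran):

```shell
# Three replicas of the web service, rescheduled automatically if a node dies:
docker service create --name httpd --replicas 3 --publish 80:80 \
  hypriot/rpi-busybox-httpd:latest

# Visualizer needs the Docker socket, so pin it to a manager node;
# published on 8081 here to dodge the cloudshell2 port conflict.
docker service create --name dsv --publish 8081:8080 \
  --constraint node.role==manager \
  --mount type=bind,src=/var/run/docker.sock,dst=/var/run/docker.sock \
  alexellis2/visualizer-arm:latest
```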

root@cloudshell2:~# docker service ls
ID            NAME   MODE        REPLICAS  IMAGE
3teh7qj0jist  dsv    replicated  1/1       alexellis2/visualizer-arm:latest
m44k0ueudh8b  httpd  replicated  3/3       hypriot/rpi-busybox-httpd:latest

My next step is to work on my own web service, as I prefer nginx. I have also built the mentioned 2+2 GlusterFS and plan to run my containers from it.

This has been a great monkey do, and I can see why Docker is gaining traction in the enterprise, so expect to see more projects based on it.

ARM bigLittle Cluster

I started building a new cluster lab and went with the ARM bigLITTLE CPU, based on the Odroid-XU4Q from odroidinc and hardkernel.com. It is much cheaper to run all the time and lets me test new things. It is currently configured with…

TODO:

  • mariadb/mysql on the cluster with a shared file system.
  • Write my own custom Dockerfiles and/or containers.
    • Nginx web server
    • App server or node.js server
    • Containerize the DB from above
  • Maybe switch to OpenStack
  • Ansible Tower testing ground.

It is currently one Cloudshell2 with 4TB of spinning media in a RAID 1, plus four Odroid-XU4Qs in a custom enclosure with an eMMC boot device for each node; each node also has an SD card in a 2×2 GlusterFS mirrored stripe across all four. Everything is connected via Gigabit Ethernet on a Cisco 3650. I also have one Raspberry Pi 3 node as an odd man out to test Ansible playbooks.
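For reference, a 2×2 GlusterFS volume like the one on the SD cards can be created along these lines (node names and brick paths are placeholders):

```shell
# Four bricks with replica 2 gives a distributed-replicated ("mirrored
# stripe") volume: two mirror pairs, files distributed across the pairs.
gluster volume create gv0 replica 2 \
  xu4-1:/srv/gluster/brick xu4-2:/srv/gluster/brick \
  xu4-3:/srv/gluster/brick xu4-4:/srv/gluster/brick
gluster volume start gv0
```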