
9/3/2024 - this site has moved -

301 moved

The new site is blog.cmh.sh
9/1/2024 - new blog engine now live -

Welp, the "new" blog is live. It's running on Python 3.12 with the latest builds of Flask and Waitress. This version of my blog is containerized and actually uses a database; a very basic concept, I know. I considered running the blog on the Elastic Container Service and also as an Elastic Beanstalk app. The problem with both of those is that I don't really need the extra capacity, and I already have reserved instances purchased for use with my existing EC2 instances. I'm not sure how well Flask works with multiple nodes; I may have to play around with that for resiliency's sake. For now we are using apache2 as a reverse HTTPS proxy with everything hosted on my project box.
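For reference, the apache2 piece is roughly the vhost below. This is a minimal sketch, and the hostname and certificate paths are illustrative rather than the real ones; the container is assumed to be listening on 8080.

```apache
<VirtualHost *:443>
    ServerName blog.example.com
    SSLEngine on
    SSLCertificateFile /etc/ssl/certs/blog.example.com.pem
    SSLCertificateKeyFile /etc/ssl/private/blog.example.com.key

    # Hand everything to the waitress container listening on 8080
    ProxyPreserveHost On
    ProxyPass        / http://127.0.0.1:8080/
    ProxyPassReverse / http://127.0.0.1:8080/
</VirtualHost>
```

This needs mod_ssl, mod_proxy, and mod_proxy_http enabled.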

Todo items: SEO for posts, RSS for syndication, a site map, and fixing S3 access when running from a Docker container. Everything else should be working. There is also a sorting issue with the blog posts that I need to work out.

8/23/2024 - the end -

Linux + Mac: find the public IP address of the local computer:

dig +short myip.opendns.com @resolver1.opendns.com


 

I got to see what it was like at the end for my grampa today. While I like the idea of "going out on your own terms", this was something else. Everything was filthy. Evidence of rodents was everywhere. The conditions that the strongest man I've ever known was living in made me sick to look at. I'm aware that he wasn't a perfect person, but as his grandchild, I only ever knew kindness from him. It's too late now, and the end comes to us all eventually. Everything we build will be reclaimed by nature. Grampa was a union worker, and his union pension would have allowed him and my grandma to live a healthy lifestyle right up to the end. There was more than enough money for someone to have been cleaning weekly, and yet that wasn't happening. There was more than enough money to have someone coming to check on him daily to see if he was OK and to clean up. I'm fond of the notion of karma. Some people have a lot of bad karma coming for them.

 

8/16/2024 - Cheese on the bottom omelet and the end of an era -

My third attempt at a "cheese crust" or "cheese on the bottom" egg-white omelet turned out alright. Today's attempt was with Mexican cheese, taco meat, ground pepper, and Slap Ya Mama. This cheese-on-the-bottom technique for caramelizing the cheese really brings great flavors to what is otherwise a pretty soulless egg-white omelet. Today is also the day my daughter starts her move to university. I'm not sure quite what the next four years will bring, but it will be an interesting new chapter of my life. I'll certainly miss my movie buddy.

8/14/2024 - bulk comparing files by sha256 -

Hashing files with openssl and comparing with beyond compare.


ls *.{py,csv} > splitlist.txt
while IFS= read -r filename
do
  splithash=$(openssl sha256 "split/$filename" | awk '{print $2}')
  sourcehash=$(openssl sha256 "$filename" | awk '{print $2}')
  #echo "Source hash $sourcehash and split hash $splithash"
  if [[ "$splithash" != "$sourcehash" ]] && [ -f "$filename" ]; then
    bcompare "split/$filename" "$filename"
  else
    echo "Skipping $filename as there was no change"
  fi
done < splitlist.txt

8/13/2024 - adding numbers with awk -

Given the results of an aws s3 ls command, let's add up the size of the files. In the default listing the size is the third column:


awk '{sum+=$3;} END{print sum;}'

Add the size column to a variable called sum and then print sum.
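To sanity-check it, here is the same one-liner fed a couple of fake listing lines; if your listing puts the size in a different field, adjust the $3:

```shell
# Sum the third (size) column of some ls-style output.
printf '2024-08-13 10:01:02 100 a.txt\n2024-08-13 10:01:03 250 b.txt\n' \
  | awk '{sum+=$3} END{print sum}'
# prints 350
```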

8/11/2024 - Dynamic code generation to save time -

This weekend, I spent some time looking for patterns in what should be random file names. I started the process by writing some Python to read a text file with the results of an s3 ls command, break the file name into pieces for each character of the file name, and insert it into an SQLite3 database. I used some new-to-me bash techniques for code generation with a command-line for loop, along with the older trick of using SQL to generate SQL. Combined with executing a command for each line of a file, I was in business.


#add column numbers
#for f = 1 to 50
#add 50 columns named f1 to f50
for f in {1..50}
do
  sqlite3 fileanalysis.db "alter table files add column f$f text;"
done



#loop for f = 1 to 48:
#loop for dynamic sql insert statements in mass
for f in {1..48}
do
  sqlite3 control.db "insert into file_name_position_charecter_counts (f,charecter) select distinct 'f$f', f$f from files order by f$f";
done

#loop through sql generating sql and run against our sqlite3 database.
while IFS= read -r sql
do
	sqlite3 control.db "$sql" >> sqllog.txt
done < control_counts.tsv



--Create update statements to do character counts
select 'update file_name_position_charecter_counts set count=(select count(*) from files where '
           || c.f || '=''' || c.charecter || ''') where id = ' || cast(c.id as text)
from file_name_position_charecter_counts c;
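The generate-then-execute pattern is easier to see with a toy example. This sketch uses a throwaway table and made-up column names, not the real analysis schema:

```shell
# SQL that generates SQL: one SELECT emits an UPDATE per row,
# then each generated statement is executed in a read loop.
db=$(mktemp)
sqlite3 "$db" "create table t (id integer, n integer); insert into t values (1,0),(2,0);"
sqlite3 "$db" "select 'update t set n = ' || (id * 10) || ' where id = ' || id || ';' from t;" > gen.sql
while IFS= read -r sql
do
  sqlite3 "$db" "$sql"
done < gen.sql
sqlite3 "$db" "select n from t order by id;"
# prints 10, then 20
```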

9/9/2024 - A week of trial by fire -

So this week has been a lot. I can't talk about most of it, but a lot of good bash scripting came out of it. Today also marks the end of the road for one of my favorite people on this planet, my Grandpa CL. May he rest in peace. It's for the best, as he wasn't doing well, and if I somehow make it to 99, I hope I can go out in my own home just like him. So, back to this week's work:
New to me this week:


#!/bin/bash
#grab the file list from all the s3 buckets:
echo "Searching"
while IFS= read -r bucket
do
	if [ ! -f "bucket_contents/$bucket.txt" ]; then
		echo "Grabbing bucket: $bucket"
		aws s3 ls "s3://$bucket" --recursive --profile events > "bucket_contents/$bucket.txt"
	fi
done < gs_buckets.txt

#!/bin/bash
#Loop through buckets and grep for matching file names.
echo "Searching"
while IFS= read -r bucket
do
	if [ -f "bucket_contents/$bucket.txt" ]; then
		echo "Searching bucket: $bucket"
		while IFS= read -r filename
		do
			gsed -E 's/^[0-9-]+\s+[0-9:]+\s+[0-9]+\s+//' "bucket_contents/$bucket.txt" | grep -e "$filename" >> filelog.txt
		done < filelist.txt
	fi
done < gs_buckets.txt

#Mac Sed != Linux Sed
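The difference that bites most often is in-place editing: BSD sed on macOS requires an explicit (possibly empty) backup suffix after -i, while GNU sed does not. Installing GNU sed via Homebrew as gsed sidesteps it. A plain substitution on a stream is identical in both:

```shell
# GNU (Linux):  sed -i 's/foo/bar/' file
# BSD (macOS):  sed -i '' 's/foo/bar/' file
# Substitution on a pipe works the same way in both seds:
echo "foo" | sed 's/foo/bar/'
# prints bar
```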

#Cat all of the files in all of the folders
find . -type f -exec cat {} + 

#Read all files from an aws load balancer, return code 200
 find . -type f -exec cat {} +  | awk '{print $10 ","  $12 "," $13}' | grep -w 200 | sort | uniq -c | sort | awk '{print $1 "," $2}' > ../test.csv

#Take the port number off an ip address
gsed -E 's/:[0-9]+,/,/'
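For example, on an ip:port,status pair pulled from the load balancer logs (sample values made up):

```shell
# Drop the :port from an "ip:port,status" pair.
echo "10.0.0.5:443,200" | sed -E 's/:[0-9]+,/,/'
# prints 10.0.0.5,200
```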

# Use AWK to count the number of lines between two numbers in a file.
#!/bin/bash
if [ $# -eq 3 ]; then
	topNo=$1
	botNo=$2
	total=$3
	date
	echo "Remaining:"
	left=$(awk -F'/' '{print $2}' movedir.txt | \
	awk -v top="$topNo" -v bottom="$botNo" '$1 > bottom && $1 <= top {count++} END {print count}')
	echo "There are $left directories left to process"
	#Fraction complete = 1 - (remaining / total)
	pctdone=$(bc -l <<< "1-($left/$total)")
	echo "$pctdone complete"

else
	echo "(top no) (bottom no) (total)"
fi
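The awk range test at the heart of the script can be checked in isolation; the numbers below are made up:

```shell
# Count values v with bottom < v <= top.
printf '5\n12\n18\n25\n' \
  | awk -v top=20 -v bottom=10 '$1 > bottom && $1 <= top {count++} END {print count}'
# prints 2
```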

8/5/2024 - One Time secret clone and dockerization -
This weekend started by working on a "one time secret" clone for personal use. ChatGPT got me 80% of the way there, and I spent the rest of the day tweaking and improving upon the code it wrote. Sunday I set about getting v2 of my blog production-ready by introducing Waitress, a recommended way to host Python Flask web applications, and then containerizing the application so that it's easy to stay on the latest fully supported version of Python, which updates frequently. Cool scripts from this past weekend:
Dockerfile:

FROM public.ecr.aws/docker/library/python:3.12

WORKDIR /tmp
# Add sample application
ADD app.py /tmp/app.py
ADD objects.py /tmp/objects.py
ADD hash.py /tmp/hash.py

COPY templates /tmp/templates
COPY static /tmp/static


COPY requirements.txt requirements.txt

RUN pip3 install -r requirements.txt

EXPOSE 8080

# Run it
CMD [ "waitress-serve", "app:app" ]
build image and run with shared folder:

#!/bin/bash

docker build --tag vacuumflask .
imageid=$(docker image ls | grep -w "vacuumflask" | awk '{print $3}')
docker run  --env-file "/home/colin/python/blog2/vacuumflask/.env" \
--volume /home/colin/python/blog2/vacuumflask/data:/tmp/data \
-p 8080:8080 \
 "$imageid"
Build beanstalk zip:

#!/bin/bash
destination="beanstalk/"
zipPrefix="vacuumflask-"
zipPostfix=$(date '+%Y%m%d')
zipFileName="$zipPrefix$zipPostfix.zip"
mkdir "$destination"
cp -a templates/. "$destination/templates"
cp -a static/. "$destination/static"
cp app.py "$destination"
cp Dockerfile "$destination"
cp hash.py "$destination"
cp objects.py "$destination"
cp requirements.txt "$destination"
cd "$destination"
zip -r "../$zipFileName" "."
cd ../
rm -r "$destination"
scp "$zipFileName" project:blog2
scp docker-build-run.sh project:blog2
ssh project
Next weekend I’ll need to figure out how to get it working with elastic-beanstalk and then work on feature parity.
7/28/2024 - Syncing data from the old blog to the new blog -

This morning I automated my data sync between the old blog and the data storage system for the new one. This will allow me to keep up on how my newer posts will look on the pages I’m building as I slowly replace the existing functionality.


#!/bin/bash
# copy the files from my project box to a local data folder
scp -r project:/var/www/blog/blogdata/ /home/colin/python/blog2/vacuumflask/data/
# read the blog.yml file and export the ids, then remove extra --- values from stream
# and store the ids in a file called blog_ids.txt
yq eval '.id' data/blogdata/blog.yml | sed '/^---$/d' > data/blogdata/blog_ids.txt
# loop through the blog ids and query the sqlite3 database and check and see if they exist
# if they do not exist run the old_blog_loader python script to insert the missing record.
while IFS= read -r id
do
  result=$(sqlite3 data/vacuumflask.db "select id from post where old_id='$id';")
  if [ -z "$result" ]; then
    python3 old_blog_loader.py data/blogdata/blog.yml data/vacuumflask.db "$id"
  fi
done < data/blogdata/blog_ids.txt
# clean up blog ids file as it is no longer needed
rm data/blogdata/blog_ids.txt
echo "Done"

7/27/2024 - flask templates and orm updates -

After getting chores done for the day, I set about working on my new blog engine. This started out with getting Flask templates working, and after some back and forth that was sorted out. It then set in that I was repeating myself a lot because I had skipped an ORM model. So I set about writing a blog class to handle loading, serialization, updates, and inserts. A round of testing later, and a bunch of bugs were squashed. A side quest today was updating all of the image paths from prior blog posts to use my CDN. I ended up using a combination of awk commands [ awk '{print $7}' images.txt > just_images.txt and awk -F '/' '{print $3}' image_names.txt > images2.txt ] to get a good list of images to push to the CDN, and then asked ChatGPT to help me write a bash loop [ while IFS= read -r file; do aws s3 cp "$file" s3://cmh.sh; done < images2.txt ] to copy all the images. I've seen one of my more Linux-savvy coworkers write these loops on the fly, and it is always impressive. I streamed the first couple hours of development and encountered a number of bugs with Raphael bot that I'll see about fixing tomorrow.

7/22/2024 - Lego F1 Shadowbox Build -

7/14/2024 - Raphael bot obs scene switching -

Since the weather was supposed to be hotter today and it was supposed to storm throughout the day, I opted to work on my tech projects. I started the day by streaming my work on Raphael bot, where I was quickly able to fix the issue where Raphael was ignoring me. Next I turned my attention to the scene switching command. After adding several variants to the list of commands, I was able to get a demo working quite well on stream and pushed the changes to my GitHub repository. With Raphael in a good place, I turned my attention to my new Flask-based blog engine. After some troubleshooting I found the problem: part of the issue with my curl test commands was that I wasn't saving the cookie that curl was getting from the login API call. With my curl issues solved, I turned to obscuring my blog user password when calling my curl commands. The last thing I worked on before turning my attention to a discord movie watch party was adding a file upload method to my blog API. With the hours of my Sunday winding down, I turned my attention to the documentation for Raphael, since I noticed that someone had started following the repository. raphael flow chart
Demo:

@makersganamake: "streaming AI bot update. Now with obs scene switching and captions."

6/30/2024 - Economics of transcription -

Today's focus was on getting closed captioning working for Raphael bot. The solution ended up being switching the type of text I was creating and not worrying about the layout and location of the text. With the text location left to the user to adjust, adding the code to send all the transcriptions to the stream was not a big deal. The rest of my time was spent cleaning up the code and documentation. I also dropped a comment on my TikTok showing off the bot that the source code was available on GitHub. The other thing I learned today was that Raphael is too expensive for most full-time Twitch streamers. The AWS transcription service costs about $0.024 per minute, which means a typical full-time Twitch streamer would spend over $40 a week just on transcription. I suppose it's a good thing that the code is open-source.
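Back-of-the-envelope, with an assumed 40-hour streaming week:

```shell
# $0.024 per minute, 60 minutes per hour, 40 hours per week.
awk 'BEGIN { print 0.024 * 60 * 40 }'
# prints 57.6
```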

aws transcription is expensive
6/29/2024 - Adventures in PHP -

This week, a lot of my spare time has gone into finishing off a problem I encountered at the tail end of last weekend: why can't my EC2 VM, with an IAM profile that should grant it access to an S3 bucket, use the AWS PHP SDK to connect to that bucket? I kept getting variations of this exception:
 

[Fri Jun 28 22:47:50.774779 2024] [php:error] [pid 38165] [client 1.2.3.4 :34135] PHP Fatal error: Uncaught Error: Unknown named parameter $instance in /home/notarealuser/vendor/aws/aws-sdk-php/src/Credentials/CredentialProvider.php:74\nStack trace:\n#0 /home/notarealuser/vendor/aws/aws-sdk-php/src/Credentials/CredentialProvider.php(74): call_user_func_array()\n#1 /home/notarealuser/vendor/aws/aws-sdk-php/src/ClientResolver.php(263): Aws\\Credentials\\CredentialProvider::defaultProvider()\n#2 /home/notarealuser/vendor/aws/aws-sdk-php/src/AwsClient.php(158): Aws\\ClientResolver->resolve()\n#3 /home/notarealuser/vendor/aws/aws-sdk-php/src/Sdk.php(270): Aws\\AwsClient->__construct()\n#4 /home/notarealuser/vendor/aws/aws-sdk-php/src/Sdk.php(245): Aws\\Sdk->createClient()\n#5 /var/www/blog/s3upload.php(35): Aws\\Sdk->__call()\n#6 {main}\n thrown in /home/notarealuser/vendor/aws/aws-sdk-php/src/Credentials/CredentialProvider.php on line 74


 

I didn't want to, but I ended up adding the AWS CLI to my test box and confirmed that I could indeed access the bucket without any hard-coded credentials on the box, using the IAM profile of the EC2 instance running this code. So I ended up calling the AWS CLI from my code directly. This isn't ideal, but I've wasted enough time this week fighting with this bug. In other news, ChatGPT is pretty fantastic at writing regular expressions and translating English into sed commands for processing text data. Because I had to use the AWS CLI, I was getting the contents of my S3 bucket back as text that wasn't in a format ideal for code to consume. Here is the prompt I used and the response.
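The cleanup in question is the same shape as the one in the bucket-search loop further up: strip the date, time, and size columns from aws s3 ls output so only the key remains. A representative example (sample line made up), using GNU sed:

```shell
# Reduce "date time size key" to just the key.
echo "2024-06-28 22:47:50   1024 images/photo.png" \
  | sed -E 's/^[0-9-]+\s+[0-9:]+\s+[0-9]+\s+//'
# prints images/photo.png
```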

chatgpt doing the hard part

I validated that the sed was correct on the website https://sed.js.org/

sed validation tool online, very cool

ChatGPT also provided a useful breakdown of the sed command that it wrote. explanation of sed command

@makersganamake: "Use ChatGPT to simplify writing sed commands in Linux. #dev #chatgpt #linux #devops #sysengineer #sysadmin"

One more tool for my toolbox for working with files in nix environments.


 

6/24/2024 - Social tooling -

This weekend, I got a hankering to improve my domains and web tooling. Last week, I purchased some new domains, and now it's time to put them to work. My first project was a URL shortener service for my own use. I built this using a small bit of PHP and an .htaccess file with an Apache rewrite rule to clean up the URL parameters. Under the hood, I decided to use an SQLite3 database as a backend. The shortener service will be shared between a couple of my sites, with the primary user being the shorter domain name. My second project was setting up AWS CloudFront for the shortener domain to host media that I'd like to share. The next step will be to either swap out my blog's media system or build one just to upload to the S3 bucket behind AWS CloudFront.
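The rewrite-rule part is roughly the sketch below; the script name and query parameter are illustrative, not the actual ones:

```apache
# .htaccess: map /abc123 to the PHP lookup script as ?code=abc123
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^([A-Za-z0-9]+)$ redirect.php?code=$1 [L,QSA]
```

The RewriteCond keeps real files (like the script itself) reachable.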

6/22/2024 - Raphael bot update -

Before I finished my Juneteenth holiday break, I posted the source code for Raphael bot on GitHub. I got the voice recognition working and responding in a timely manner. It even outputs its voice to OBS Studio. For some reason I was feeling brave and posted a TikTok with the video. The next step will be to implement the streaming commands in a manner that allows customization. I also added a readme.md, which still leaves a lot to be desired, as I haven't done any testing outside of Ubuntu 22.04. This weekend I randomly decided to make myself a URL shortener for a new domain I acquired. Finished that up today.

6/6/2024 - Twitch Raphael_bot -

I've been working on a Twitch bot for a couple of weeks now. I've got transcription mostly working. So far, I have chat messages going to Twitch chat, and the bot recognizes its name. When it hears its name, it calls ChatGPT to answer questions for the streamer.

3/9/2024 - new year new data -

2023 came and went in a flurry of Zelda: Breath of the Wild, and then I got pulled into Baldur's Gate 3 for a solid six months, and the next thing I knew it was 2024. I would have posted sooner, but there were PHP 8 compatibility issues that I had to solve, and then some file upload issues with this site. I got everything working a few weeks ago, and then I ended up finishing the cutting board for my wife without much fanfare, as I didn't record any of it for social media. Today I spent some time working on automatically resizing photos for the blog so I don't have to mess with Photoshop or GIMP to get content out quickly.


The other project that I took on today was to document the differences in my Saturday morning Tik-Tok topics from this year vs around the same time last year. Seems like Tik-tok shop content has replaced more than 1/3 of the organic content that Tik-tok served up around the same time last year.
breakdown of tiktok topics by source over 40 minutes

2/11/2023 - terraform vs kubernetes an under the weather Saturday -
The home lab:
a picture of the home lab resources0% of 120 cpu, 4% of 660GB of ram, 1% of 12TB of storage
a list of three servers, Dell r530, r630, r730xd
Saturday plans
We had a health screening this morning and I was already a bit under the weather. I have a project coming up for work where we will be using Kubernetes, and we already have many projects that use Terraform. A few weeks ago I attended a good Terraform training put on by AWS. After updating my local home lab, I'm going to try to stay motivated to dig into the two tech stacks above.
Going to try and work through this blog post on Proxmox and Kubernetes: Running a Kubernetes Cluster Using Proxmox and Rancher
Update - Saturday night - This project went OK at first; I got all 3 Linux machines created. I followed the instructions and got the cluster "provisioning" in Rancher, but it never made it out of that state. I later discovered that the "main" kube controller ran out of disk space with the 20GB of space the directions called for. For some reason the LVM had only set aside 10GB for the main drive. Try as I might, I managed to get the physical drive extended, but the LVM was being a pain in the ass. I ended up nuking the main node, and I accidentally nuked one of the workers while rage quitting. If I feel better in the morning, I'm going back to woodworking. My first day with Kubernetes didn't go well, although Rancher looks neat.
2/6/2023 - Viral voice over -
Well that was unexpected and probably unrepeatable.
7k views on this past weekend's video
2/5/2023 - now with keywords for seo -

Playing around with the admin side of this site. Added meta keywords on posts. Still thinking through how those might look on the main page.

2/5/2023 - Chunky Cutting boards weekend 2 -
Last weekend I saw a tik-tok about a high-quality, low-cost audio system for podcasting. Now, I have no interest in doing a podcast, and I suspect that the market is already saturated with CIS-white-male-in-tech opinions. Anyway, for this weekend's cutting board video I tried something new. I did a voice-over with the first bit being about woodworking and the second bit talking about disaster recovery in IT. I posted the video and went out to dinner. What was interesting is that a former co-worker, who is in a band and is an audio engineer, liked the reel on Instagram, and suddenly the video was doing better numbers than I've seen on tik-tok. Snapchat continues to be difficult to break into. I've seen oodles of trashy, bad, disgusting videos on that platform, and yet I can't get my content to get picked up. I've seen a lot of videos of people doing stuff wrong or failing at stuff, so maybe I'll try something Snapchat-specific and see if I get any traction. Since I'm not woodworking today, I figured I'd tune up some bits of this blog and do some housekeeping on the images on the site. Happy Sunday.
tik-tok analytics
instagram analytics, which went surprisingly well
This weekend's chunky cutting board no. 1, part 2.
1/29/2023 - Chunky Cutting boards weekend 1 -
cutting board

I got a late start Saturday, and as if the vibe of the morning was following me into the workshop, drama followed. This is the first time I've seen my miter saw bind up cutting anything; typically it goes right through whatever you put under its blade. The same figured grain also proved to be too much for the planer, as it too spit out the hard maple burnt to a crisp. After a lot of hand planing and a lot of standing, I resumed work on the first of five cutting boards. I decided on resawing the walnut to the thickness I wanted for the strips going through the cutting board. I managed to forget to tighten the band saw blade before I started cutting; I figured it out pretty quickly and adjusted to give myself more room for jointing. I ended up with an extra strip of walnut, 3x a bit less than a quarter of an inch. I'm thinking I may use this strip in some sort of bent lamination if I'm feeling adventurous when I get toward the end of this project. I won't really get to see how this first glue-up went until next weekend. Hopefully there is less drama then. A weekend of work in 3 minutes.

1/22/2023 - Thick cutting board planning -
boards in a car

Last weekend I was tied up with work, and then under the weather. This weekend I went to visit the fine folks at Owl Hardwoods. A few years ago I made a batch of hard maple and walnut cutting boards. This year I think I'm going to do a batch of thicker cutting boards and see if I can up the complexity a bit and raise the quality. I'm starting with a nicely figured piece of 6/4 maple with a lot of character, and a piece of 8/4 walnut. My next step was to rough out a basic plan in Fresco; it's been a minute since I've used it, so there was a bit of a learning curve to pick it back up, but shortly I was up and running again. I was hoping to start work on it today but woke up with a headache. Next weekend I'll start the cuts on this next batch of cutting boards.

idea of what the first cutting board will look like
1/8/2023 - Hammer time -

This year I'd like to get more exercise and spend more time building things in the real world. This past Sunday I spent the afternoon creating a Padauk handle for a brass chisel hammer. If I were to do it again, I'd spend less time standing and more time with a spoke shave. Video of the build process: https://www.tiktok.com/t/ZTRgSHg36/

Here is the finished product:

a sexy hammer

Time will tell how this handle holds up, I’ll have to think up a project that involves chisels to give me an excuse to try this out. 

12/28/2022 - Adobe Fresco -

adobe fresco review

While Adobe Photoshop has been the gold standard for photo editing for as long as I can remember, I understand why they bought Fresco. I learned how to use Fresco with a five-minute video, and soon I was adding absent family members into pictures. The integration with the Apple Pencil is also top notch, and it's actually a mostly intuitive experience for an OG with Paint Shop Pro X experience.

12/27/2022 - adventures in base javascript and html -
js es 2022

I've been playing around with bare-bones JavaScript and HTML this break, and I have to say it's dramatic how much it's improved since the last time I was doing full-time web development. All of the bits that used to require a bunch of extra libraries for basic web dev are now baked in. It actually feels like a decent language to work with. I wonder if JavaScript math is still clunky...

12/23/2022 - new login testing -

Just testing a new system for authentication across multiple pages.

12/23/2022 - wysiwyg test 1 -

A secondary heading

bold text

something something something

https://links.colinhayes.dev 

  1. point
  2. another point
  3. third

block quote

another quote

     
     
     
12/22/2022 - admin now with banhammer -
Admin interface now with ban hammer
12/22/2022 - blog admin interface -
Spent the day working on the admin side of this blog. I can now post without remoting into my server.
12/18/2022 - 2023 Year of the linux laptop -
So a while back I decided to switch my personal laptop from Windows 11, which was driving me crazy, to Ubuntu 22.x. Since doing so I haven't looked back, and today I set up RClone to sync my Dropbox, OneDrive, and S3 buckets. The next order of business was to get ClamAV going. I haven't fully uninstalled Windows 11; since it's a paid version, it's still sitting on a smaller hard drive on this laptop, and whenever I want to use it I have to flip some BIOS settings and re-enter my BitLocker key, which is a pain. While going full Linux at work would be difficult because of how much I use native desktop Excel, for personal use I haven't had any major issues. Because Dell supports up to Ubuntu 20, I was even able to get the fingerprint reader working on Linux. This was the only native feature that didn't work upon install.
12/17/2022 - adventures in ceph -
Last summer I started on a project to create a three-node Ceph cluster based on the Odroid M1 platform, following instructions from a few blogs. It went mostly OK, but never really worked as intended. The goal was to have 512GB of highly available, redundant storage, with three separate UPS systems supporting the nodes. At this point I'm considering just tearing all three nodes down and starting over with something else.
12/11/2022 - Hello World -
Welp.. I made my own blog engine.
Who needs a cms to sit around and get hacked.
Not this guy.
