Hacker Public Radio

Author: Hacker Public Radio


Description

Hacker Public Radio is a podcast that releases shows every weekday, Monday through Friday. Our shows are produced by the community (you) and can be on any topic that is of interest to hackers and hobbyists.
244 Episodes
Overview

In the last episode we looked at how JSON data is structured and saw how jq could be used to format and print this type of data. In this episode we'll visit a few of the options to the jq command and then start on the filters written in the jq language.

Options used by jq

In general the jq command is invoked thus:

    jq [options...] filter [files...]

It can be given data in files or sent to it via the STDIN (standard in) channel. We saw data being sent this way in the last episode, having been downloaded by curl.

There are many options to the command, and these are listed in the manual page and in the online manual. We will look at a few of them here:

--help or -h
    Output the jq help and exit with zero.

-f filename or --from-file filename
    Read the filter from a file rather than from the command line, like awk's -f option. You can also use '#' to make comments in the file.

--compact-output or -c
    By default, jq pretty-prints JSON output. Using this option will result in more compact output by instead putting each JSON object on a single line.

--color-output or -C and --monochrome-output or -M
    By default, jq outputs colored JSON if writing to a terminal. You can force it to produce color even if writing to a pipe or a file using -C, and disable color with -M.

--tab
    Use a tab for each indentation level instead of two spaces.

--indent n
    Use the given number of spaces (no more than 7) for indentation.

Notes

The -C option is useful when piping output to the less command with the colours that jq normally generates. Use this:

    jq -C '.' file.json | less -R

The -R option to less allows colour escape sequences to pass through.

Do not do what I did recently. Accidentally leaving the -C option on the command caused formatted.json to contain all the escape codes used to colour the output:

    $ jq -C '.' file.json > formatted.json

This is why jq normally only generates coloured output when writing to the terminal.

Filters in jq

As we saw in the last episode, JSON can contain arrays and objects. Arrays are enclosed in square brackets and their elements can be any of the data types we saw last time. So, arrays of arrays, arrays of objects, and arrays of both of these are all possible. Objects contain collections of keyed items where the keys are strings and the values they are associated with can be any of the data types.

JSON Examples

Simple arrays:

    [1,2,3]
    [1,2,3,[4,5,6]]
    ["Hacker","Public","Radio"]
    ["Sunday","Monday","Tuesday","Wednesday","Thursday","Friday","Saturday"]

Simple object:

    { "name": "Hacker Public Radio", "type": "podcast" }

This more complex object was generated by the Random User Generator API. It is a subset of what can be obtained from this site.

    {
      "gender": "female",
      "name": { "title": "Mrs", "first": "Jenny", "last": "Silva" },
      "dob": { "date": "1950-01-03T21:38:19.583Z", "age": 74 },
      "nat": "GB"
    }

This one comes from the file countries.json from the Github project mledoze/countries. It is a subset of the entry for Mexico.

    {
      "name": {
        "common": "Mexico",
        "official": "United Mexican States",
        "native": {
          "spa": { "official": "Estados Unidos Mexicanos", "common": "México" }
        }
      },
      "capital": [ "Mexico City" ],
      "borders": [ "BLZ", "GTM", "USA" ]
    }

Identity filter

This is the simplest filter, which we already encountered in episode 1: '.'. It takes its input and produces the same value as output. Since the default action is to pretty-print the output, it formats the data:

    $ echo '["Hacker","Public","Radio"]' | jq .
    [
      "Hacker",
      "Public",
      "Radio"
    ]

Notice that the filter is not enclosed in quotes in this example. This is usually fine for the simplest filters which don't contain any characters which are of significance to the shell. It's probably a good idea to always use (single) quotes, however. There may be considerations regarding how jq handles numbers; consult the jq documentation for details.

Object Identifier-Index filter

This form of filter refers to object keys. A particular key is usually referenced with a full stop followed by the name of the key. In the HPR statistics data there is a top-level key "hosts" which refers to the number of currently registered hosts. This can be obtained thus (assuming the JSON is in the file stats.json):

    $ jq '.hosts' stats.json
    357

The statistics file contains a key 'stats_generated' which holds a Unix time value (seconds since the Unix Epoch, 1970-01-01). This can be decoded on the command line like this:

    $ date -d "@$(jq '.stats_generated' stats.json)" +'%F %T'
    2024-04-18 15:30:07

Here the '-d' option to date provides the date to print, and if it begins with a '@' character it's interpreted as seconds since the Epoch. Note that the result is in my local time zone which is currently UTC + 0100 (aka BST).

Using object keys in this way only works if the keys contain only ASCII characters and underscores and don't start with a digit. To use other characters it's necessary to enclose the key in double quotes, or in square brackets and double quotes. So, assuming the key we used earlier had been altered to 'stats-generated' we could use either of these expressions:

    ."stats-generated"
    .["stats-generated"]

Of course, the .[<string>] form is valid in all contexts. Here <string> represents a JSON string in double quotes. The jq documentation refers to this as an Object Index.

What if you want the next_free value discussed in the last episode (the number of shows until the next free slot)? Just typing the following will not work:

    $ jq '.next_free' stats.json
    null

This is showing that there is no key next_free at the top level of the object; the key we want is in the object with the key slot. If you request the slot key this will happen:

    $ jq '.slot' stats.json
    {
      "next_free": 8,
      "no_media": 0
    }

Here an object has been returned, but we actually want the value within it, as we know. This is where we can chain filters, like this:

    $ jq '.slot | .next_free' stats.json
    8

The pipe symbol causes the result of the first filter to be passed to the second filter. Note that the pipe here is not the same as the Unix pipe, although it looks the same.

There is a shorthand way of doing this "chaining":

    $ jq '.slot.next_free' stats.json
    8

This is a bit like a file system path, and makes the extraction of desired data easier to visualise and therefore quite straightforward, I think.

Array index filter

We have seen the object index filter .[<string>] where <string> represents a key in the object we are working with. It makes sense for array indexing to be .[<number>] where <number> represents an integer starting at zero, or a negative integer. The meaning of a negative number is to count backwards from the last element of the array (which is -1). So, some examples might be:

    $ echo '["Sunday","Monday","Tuesday","Wednesday","Thursday","Friday","Saturday"]' | jq '.[1]'
    "Monday"

    $ echo '["Sun","Mon","Tue","Wed","Thu","Fri","Sat"]' | jq '.[-1]'
    "Sat"

    $ echo '[1, 2, 3, [4, 5, 6]]' | jq '.[-1]'
    [
      4,
      5,
      6
    ]

We will look at more of the basic filters in the next episode.
Links

jq:
  - GitHub page
  - Downloading jq
  - The jq manual
  - Wikipedia page about the jq programming language

Test data sources:
  - Random User Generator API
  - Github project mledoze/countries
Today I Learnt: sed hold/pattern space use.

Sgoti talks about using sed hold/pattern spaces.

Tags: TIL, sed

I fixed the ${ls} /usr/bin to ${ls} ${bindir} issue mentioned in the show.

#!/bin/bash
# License: GPL v3
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <https://www.gnu.org/licenses/>.

#Name: grab-bin.sh
#Purpose: Link your binaries.
#Version: beta 0.07
#Author: SGOTI (Some Guy On The Internet)
#Date: 2023-12-17

#variables:
bindir=/usr/bin/
awk=${bindir}awk
cat=${bindir}cat
chmod=${bindir}chmod
date=${bindir}date
echo=${bindir}echo
find=${bindir}find
ls=${bindir}ls
mktemp=${bindir}mktemp
sed=${bindir}sed
uniq=${bindir}uniq

#start:
${echo} -e "\nStep 0: $(${date} +%F), $(${date} +%T)";

# Create the /tmp/ directory to place the files.
function mkt (){
if [ -d /tmp/$(${date} +%F).* ]; then
  tmpdir1=$(ls -d /tmp/$(${date} +%F).*)
  ${echo} -e "The directory already exists.\n${tmpdir1}"
else
  tmpdir0=$(${mktemp} -d /tmp/$(${date} +%F).XXXXXXXX)
  tmpdir1=${tmpdir0}
  ${find} "${tmpdir1}" -type d -exec ${chmod} -R =700 {} +;
  ${echo} "Had to create ${tmpdir1}"
fi
}
mkt

${echo} -e "\nStep 1: $(${date} +%F), $(${date} +%T)";

# Files created by this script.
tmpdoc0=${tmpdir1}/$(${date} +%Y%m%d)variables.txt
tmpdoc1=${tmpdir1}/$(${date} +%Y%m%d)bash.vim
tmpdoc2=${tmpdir1}/$(${date} +%Y%m%d)sed-script.sed

# Here-document to build the first document (variables.txt).
${cat} > ${tmpdoc0} << "EOL0"
bindir=/usr/bin/
EOL0

# variables.txt body.
${ls} -1 ${bindir} | ${sed} -n '
h
s/[^0-9a-zA-Z]//g
G
s/\n/ /
s/\(.*\) \(.*\)/\1=${bindir}\2/p
' >> ${tmpdoc0}

${sed} -i '/\[/d' ${tmpdoc0}

${echo} -e "\nStep 2: $(${date} +%F), $(${date} +%T)";

# Bash.vim here-document.
${cat} > ${tmpdoc1} << "EOL1"
iabbr case; case ${var_name} in <CR> [yY]) <CR> ${echo} 'User said, "Yes"'; <CR> ;; <CR> <CR> [nN]) <CR> ${echo} 'User said, "No"'; <CR> ;; <CR> <CR> [qQ]) <CR> ${echo} "Let's get outta here."; <CR> exit <CR> ;; <CR> <CR> *) <CR> ${echo} "Good Heavens! Someone broke the script I'm writing."; <CR> exit <CR> ;; <CR>esac
iabbr here; ${cat} << _EOD_<CR>_EOD_<CR><ESC>2k0
iabbr func function NAME () {<CR><CR>}<UP>
iabbr if; if []; then<CR><ESC>Ielse<CR>${echo} "Good Heavens!"<CR><ESC>Ifi<ESC>4k0A
iabbr ali; alias NAME=''<ESC>B
iabbr ; ()<Left><Left>
EOL1

# bash.vim body.
${ls} -1 ${bindir} | ${sed} -n '
{
h
s/[^0-9a-zA-Z]//g
G
s/\n/ /
s/\(.*\) \(.*\)/iabbr \1 ${\2}/p
}
' >> ${tmpdoc1}

# Bash.vim here-document second pass.
${cat} >> ${tmpdoc1} << EOL1-5
iabbr vars; bindir=/usr/bin/ <CR>
EOL1-5

# bash.vim body second pass.
${ls} -1 ${bindir} | ${sed} -n '
{
h
s/[^0-9a-zA-Z]//g
G
s/\n/ /
s/\(.*\) \(.*\)/\<CR>\1=${bindir}\2/p
}
' >> ${tmpdoc1}

${sed} -i '/{\[}/d; /${bindir}\[/d' ${tmpdoc1}

${echo} -e "\nStep 3: $(${date} +%F), $(${date} +%T)";

# Sed script here-document.
${cat} > ${tmpdoc2} << "EOL2"
#!/usr/bin/sed -f
EOL2

# Sed script body.
${ls} -1 ${bindir} | ${sed} -n '
h
s/[^0-9a-zA-Z]//g
G
s/\n/ /
s/(.*) (.*)/s/\<2\>/${1}/g/p
' >> ${tmpdoc2}

${sed} -i '/\[/d' ${tmpdoc2}

${find} "${tmpdir1}" -type d -exec chmod -R =700 {} +;
${find} "${tmpdir1}" -type f -exec chmod -R =600 {} +;

${echo} -e "\nStep 4: $(${date} +%F), $(${date} +%T)";
exit;

Source: In-Depth Series: Learning sed
Source: In-Depth Series: Today I Learnt

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
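The heart of the script above is sed's hold space. As a minimal, hypothetical stand-alone sketch of that technique (the input value "cool-retro-term" is made up), this keeps the original line while building a cleaned-up copy alongside it:

#!/bin/bash
# A tiny demo of the hold-space pattern used in grab-bin.sh:
#   h          copy the current line (pattern space) into the hold space
#   s/.../.../ strip non-alphanumeric characters from the pattern space
#   G          append the saved original (hold space) after a newline
#   s/\n/ /    join the two lines into one, separated by a space
# Expected output: "coolretroterm cool-retro-term"
printf 'cool-retro-term\n' | sed -n '
h
s/[^0-9a-zA-Z]//g
G
s/\n/ /
p
'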
Some stuff I use to help make APIs:
https://github.com/freeload101/Python/blob/master/Python_Includes_RMcCurdy.py

JAMBOREE.rmccurdy.com for Burp Suite. Have I done a podcast on JAMBOREE? I must have... If not, I will.
New hosts

Welcome to our new host: Dave Hingley.

Last Month's Shows

Id   | Day | Date       | Title                                                       | Host
4086 | Mon | 2024-04-01 | HPR Community News for March 2024                           | HPR Volunteers
4087 | Tue | 2024-04-02 | Getting started with the digiKam photo management software  | Henrik Hemrin
4088 | Wed | 2024-04-03 | Today I Learnt more Bash tips                               | Some Guy On The Internet
4089 | Thu | 2024-04-04 | Modifying a Python script with some help from ChatGPT       | MrX
4090 | Fri | 2024-04-05 | Playing Civilization III, Part 1                            | Ahuka
4091 | Mon | 2024-04-08 | Test Driven Development Demo                                | norrist
4092 | Tue | 2024-04-09 | More man-talk.                                              | Some Guy On The Internet
4093 | Wed | 2024-04-10 | Installing postmarketOS on a PINE64 PinePhone               | Claudio Miranda
4094 | Thu | 2024-04-11 | One year of Linux                                           | Daniel Persson
4095 | Fri | 2024-04-12 | Twenty seven years of Linux                                 | Deltaray
4096 | Mon | 2024-04-15 | Powers of two                                               | Deltaray
4097 | Tue | 2024-04-16 | Will they take our jobs? Of course they will.               | dodddummy
4098 | Wed | 2024-04-17 | Road trips without GPS                                      | Trey
4099 | Thu | 2024-04-18 | Introducing Home Automation and Home Assistant              | Ken Fallon
4100 | Fri | 2024-04-19 | Charleston, South Carolina                                  | Ahuka
4101 | Mon | 2024-04-22 | A I O M G                                                   | operat0r
4102 | Tue | 2024-04-23 | Re:HPR 3133 More MPV Quick Tips                             | Archer72
4103 | Wed | 2024-04-24 | What's in my bag?                                           | Dave Hingley
4104 | Thu | 2024-04-25 | Introduction to jq - part 1                                 | Dave Morriss
4105 | Fri | 2024-04-26 | My story how I found a cure for my obesity                  | Jeroen Baten
4106 | Mon | 2024-04-29 | My tribute to feeds                                         | Henrik Hemrin
4107 | Tue | 2024-04-30 | Response to HPR #4065                                       | swift110

Comments this month

These are comments which have been made during the past month, either to shows released during the month or to past shows. There are 21 comments in total.

Past shows

There are 2 comments on 2 previous shows:

hpr3868 (2023-05-31) "News." by Some Guy On The Internet.
  Comment 2: elmussol on 2024-04-05: "George Santos"
hpr4075 (2024-03-15) "Making a Pomodoro Timer" by norrist.
  Comment 2: operat0r on 2024-04-10: "ADD"

This month's shows

There are 19 comments on 10 of this month's shows:

hpr4086 (2024-04-01) "HPR Community News for March 2024" by HPR Volunteers.
  Comment 1: Dave Morriss on 2024-04-01: "Senior moment: diatribe"
  Comment 2: Some Guy on the Internet on 2024-04-01: "@Dave Morriss"
hpr4092 (2024-04-09) "More man-talk." by Some Guy On The Internet.
  Comment 1: folky on 2024-04-09: "Oh no"
  Comment 2: Mad Sweeney on 2024-04-10: "Squeezing out a show"
hpr4094 (2024-04-11) "One year of Linux" by Daniel Persson.
  Comment 1: AaronB on 2024-04-11: "Bugs in Linux"
  Comment 2: Folky on 2024-04-12: "Thank you"
  Comment 3: Henrik Hemrin on 2024-04-12: "Enjoyable to learn about your Linux use case and experience"
hpr4095 (2024-04-12) "Twenty seven years of Linux" by Deltaray.
  Comment 1: Nick on 2024-04-12: "Correction"
  Comment 2: Deltaray on 2024-04-13: "Re: Correction"
  Comment 3: Henrik Hemrin on 2024-04-13: "Interesting review of your Linux softwares"
hpr4096 (2024-04-15) "Powers of two" by Deltaray.
  Comment 1: Windigo on 2024-04-15: "Very enjoyable episode"
  Comment 2: brian-in-ohio on 2024-04-17: "Another example"
  Comment 3: Dave Morriss on 2024-04-17: "8388607"
hpr4097 (2024-04-16) "Will they take our jobs? Of course they will." by dodddummy.
  Comment 1: dodddummy on 2024-04-16: "The next thing"
  Comment 2: dodddummy on 2024-04-20: "More improvements"
hpr4098 (2024-04-17) "Road trips without GPS" by Trey.
  Comment 1: archer72 on 2024-04-13: "Re:Road trips without GPS"
hpr4099 (2024-04-18) "Introducing Home Automation and Home Assistant" by Ken Fallon.
  Comment 1: Henrik Hemrin on 2024-04-26: "Looking forward to learn about Home Assistant"
hpr4103 (2024-04-24) "What's in my bag?" by Dave Hingley.
  Comment 1: Henrik Hemrin on 2024-04-26: "Thanks for your show"
hpr4105 (2024-04-26) "My story how I found a cure for my obesity" by Jeroen Baten.
  Comment 1: Trey on 2024-04-26: "Thank you for sharing."

Events Calendar

With the kind permission of LWN.net we are linking to The LWN.net Community Calendar. Quoting the site: "This is the LWN.net community event calendar, where we track events of interest to people using and developing Linux and free software." Clicking on individual events will take you to the appropriate web page.

Any other business

Craig Maloney, host of the Open Metal Cast

We received the sad news that fellow podcaster and host of the Open Metal Cast, Craig Maloney, passed away. Obituary

Markdown issue in show notes

Syntax highlighting for fenced code blocks: an issue was raised on the Gitea repository for the hpr_generator. Show notes using Markdown fenced blocks which specify a language (e.g. python) are not being syntax highlighted as expected. This was turned off because the highlighting is implemented as HTML (<div> and <span> tags) which was stripped by software on archive.org when the notes were uploaded. In case this restriction has been lifted, we will try uploading an example to see if highlighting is now available.
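For anyone wanting to test this in their own show notes, a fenced block that names a language looks like the following (a hypothetical snippet; any language identifier the highlighter recognises, such as bash or python, can follow the opening fence):

    ```bash
    # a trivial command so there is something to highlight
    echo "Hello, HPR"
    ```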
This starts our look at the details of playing Civilization III. In this episode we look at the early game, which sets the stage for everything that follows. Then we look at Revenue and Resources.

Links:
- https://civilization.fandom.com/wiki/List_of_resources_in_Civ3
- https://www.palain.com/gaming/civilization-iii/playing-civilization-iii-part-2/
This will probably be one I'll get a lot of comments on, but I've looked at the marketing proposition of HPR in light of some of the challenges we face. To prevent us dipping into the reserve queue and seeing a slow but steady decline in both audience and hosts, maybe it's time to give HPR a bit of a makeover.
I talk about what's in my bag
Shout out to Noodles: thanks again for responding to my previous post, #4045; it was awesome to get feedback. I talk about being able to upgrade my 2010 MacBook, an Apple device, and how impressive that was. Sadly, that upgradability is a thing of the past.
I will talk about information feeds from websites delivered to my computer device. I use the term feeds, and by that I mean both RSS feeds and Atom feeds, two feed protocols which are very similar. I believe it is very likely that you, as a listener to Hacker Public Radio, know about feeds. It is not unlikely that you know the technical details far better than I do.

Nowadays many of us use feeds very often without thinking of them as feeds, when we subscribe to podcasts. But feeds have been around for many years. Back in the day, I used feeds for websites I was interested in. But somehow I forgot about them, and web browsers stopped supporting feed subscriptions. A year or two ago I started my new journey into feeds. Although there is not so much talk about feeds nowadays, very many websites support feed subscriptions.

To start, on my own personal website (https://www.hemrin.com/) many of the pages have feeds, typically the blog-like pages, and you can subscribe to several feeds on my site. From Hacker Public Radio I subscribe to a feed for all show comments. So when you write a comment regarding my show today, I will get notified in my feed manager.

I primarily use Thunderbird to manage my feeds. I do not need my feeds to be synced to other devices. I use Thunderbird daily for e-mail, so it is very practical and natural for me to use it for feeds as well. In addition I use the Feeder app on my Android-based phone for some feeds.

I do not use feeds for websites I will visit often anyway, or that have a lot of news; I would be overwhelmed by feeds. Instead I use feeds for websites which are not updated so frequently but which I want to keep an eye on. Some are updated daily, though, like the one from the parliament. In some cases, feeds are an alternative to subscribing to e-mail notifications and newsletters. The beauty of feeds is that I am in charge, without giving out my e-mail address or anything else - the site owner does not know I subscribe. Subscribing is as simple as typing the feed URL into my Thunderbird feed manager, and when I want to end a subscription, I simply delete it.

Furthermore, I subscribe to status pages. I get notifications, for example, from my internet service provider about their planned and unplanned maintenance. Several authorities have interesting feeds. I have feeds from some companies and organizations, and from many software developers, for example Thunderbird and Linux Mint. I have feeds from some journalists, politicians and the like, from people with competence in various areas I am interested in, and from other people who are interesting for who they are and their thoughts.

So, this show is to tell you that I have rediscovered feeds and found them useful for me. Maybe you already use feeds. Maybe this show will inspire you to have a look at feeds as a useful tool for your personal or professional life.
I have been struggling with my body weight since I was 35, and I'm now 60. I know that not all listeners are familiar with the kilogram as a unit of measurement, but we can use the BMI (Body Mass Index) formula to discuss this. It should be somewhere between 22 and 25, and mine has been 33 for a long time. A very long time. No matter what I tried. Yes, I tried some diets, but they only work if you keep doing them. So if something does not become normal or easy, then at some inevitable point you will stop and gain weight again. Yes, they talk about changing your lifestyle, but any change that is too drastic is bound to fail in the end.

And then recently I read this book. It absolutely changed my life, and that is why I am so motivated to tell you all about it. The book is The Obesity Code by Jason Fung, a Canadian nephrologist (kidney specialist). He is also a functional medicine advocate who promotes a low-carbohydrate, high-fat diet and intermittent fasting. But we come back to that later.

This is not another diet hype - that is an industry on its own. This is scientific stuff, with lots of links to research papers, with large groups and thoroughly peer reviewed. And this does not mean that this story is for everyone. There are other medical reasons why people gain weight. But, assuming most people start out in life being healthy, then most people gaining weight are not ill. So, if you gain weight, consult your doctor first to rule out any medical reasons.

Jason Fung noticed that practice didn't match the theory. Everybody who is given insulin gains weight, even diabetes type 2 people. There are even several scientific studies that prove this: give people insulin and they will gain weight. So what if insulin is the culprit for gaining weight?

Insulin is a hormone. Its job is to send signals through the body. Its use is to allow body cells to absorb nutrients in the blood stream. Every time you eat, the insulin peaks and subsides, normally three times a day. A body process called gluconeogenesis makes fat in the liver for one day of storage. If you eat, the body makes insulin. That is normal. If you eat more, the body makes more insulin. Body cells adjust to the higher level and become tone deaf to insulin: insulin resistant. This means next time the insulin level needs to be higher. And higher levels of insulin mean you will gain weight.

If you eat sugar, it is so easy to break down that it goes immediately into storage, e.g. body fat. The thing is, wheat is chemically a long string of sugars, so the body will break it down into sugar and send that too into storage. And almost any food we buy these days contains sugar, except unprocessed foods like vegetables.

How to lose weight? Well, the body needs to access the fat in storage. So we need to extend the period of not eating until the liver has run dry of its daily dose of liver fat. This is very easy: just extend the daily period in which you do not eat. When do you not eat? When you sleep. So, skip breakfast. The name says it all: you are breaking your fast. Drink some coffee (no sugar of course), or tea, or water, and try to start eating later in the day. And another word for not eating is fasting. But it is a voluntary fast!

So I tried this for one day. Skip breakfast and try to eat at noon. I mean, what could possibly go wrong, right? The next day I had lost some weight. And it was sooo easy! I could say 300 grams, but again, your mileage may vary, or you have no clue what one gram is, let alone 300. But that is not the point. The point is that I lost weight!

And to me this has been super easy. So the solution turns out to be:

- extend the time your insulin levels are low: 16, 24 or 36 hours
- eat as little sugar as possible

Which brings me to food categories:

- carbohydrates: sugars, wheat, flour
- proteins
- fats: oil, etc.
- vitamins and minerals
- fibers

Average digestion times:

- carbohydrates: 30 minutes, after which you will be hungry again
- proteins: 3-5 hours
- fats (oil, etc.): up to 40 hours
- vitamins and minerals: needed
- fibers: leave the body

How has all this theory changed my life and diet?

- I try to start eating at noon, sometimes an hour earlier.
- I eat as few carbohydrates as possible: little to no bread, definitely no sugar, and I avoid artificial sweeteners.
- My meal at noon is most of the time quark with some fruit for flavouring.
- Evening food: vegetables are good, some meat is good, and I try to avoid desserts.
- No eating between meals (this would cause an extra insulin peak I want to avoid).

Since I started 2 months ago I have on average lost 4 kilograms. It could have been more, but then there's the occasional dinner with friends, and what is bad, but soo good, is unavoidable.

So, some other stuff that is good to know:

What's that about exercising? Well, we humans excel at walking and thus wearing out our prey. So walking is good: every day for half an hour is great. Doing an intensive workout for a minimum of 10 minutes per week is good to keep our cardiovascular system and our brain up to speed.

Can you compensate for cookies with sports? Well, every cookie would take you about 2.5 hours of intensive sports, so no, you cannot compensate for bad eating with sports.

What about "calories in are calories out"? Studies have proven that this is a false claim. It just doesn't work that way.

What about stress? Well, it turns out that stress leads to heightened levels of the hormones adrenaline and cortisol. And when cortisol rises, so too does the insulin level in your body. So, this simply means that stress will lead to weight gain.

Can I simply drink diet sodas? Well, bummer there, because although diet sodas contain no calories or sugars, they will still result in a rise in your insulin level, so they are not good for losing weight.

[The Diary Of A CEO with Steven Bartlett] The Fasting Doctor: "Fasting Cures Obesity!", This Controversial New Drug Melts Fat, Fasting Fixes Hormones! Skip Breakfast! https://podcasts.apple.com/gb/podcast/the-fasting-doctor-fasting-cures-obesity-this/id1291423644

Jason Fung YouTube channel: https://www.youtube.com/watch?v=8RuWp3s6Uxk

I hope you found this explanation helpful. Have a nice day.
Introduction

This is the start of a short series about the JSON data format, and how the command-line tool jq can be used to process such data. The plan is to make an open series to which others may contribute their own experiences using this tool.

The jq command is described on the GitHub page as follows:

    jq is a lightweight and flexible command-line JSON processor

...and as:

    jq is like sed for JSON data - you can use it to slice and filter and map and transform structured data with the same ease that sed, awk, grep and friends let you play with text.

The jq tool is controlled by a programming language (also referred to as jq), which is very powerful. This series will mainly deal with this.

JSON (JavaScript Object Notation)

To begin we will look at JSON itself. It is defined on the Wikipedia page thus:

    JSON is an open standard file format and data interchange format that uses human-readable text to store and transmit data objects consisting of attribute–value pairs and arrays (or other serializable values). It is a common data format with diverse uses in electronic data interchange, including that of web applications with servers.

The syntax of JSON is defined by RFC 8259 and by ECMA-404. It is fairly simple in principle but has some complexity.

JSON's basic data types are (edited from the Wikipedia page):

Number: a signed decimal number that may contain a fractional part and may use exponential E notation, but cannot include non-numbers. (Note: unlike what I said in the audio, there are two values representing non-numbers: 'nan', and infinity: 'infinity'.)

String: a sequence of zero or more Unicode characters. Strings are delimited with double quotation marks and support a backslash escaping syntax.

Boolean: either of the values true or false.

Array: an ordered list of zero or more elements, each of which may be of any type. Arrays use square bracket notation with comma-separated elements.

Object: a collection of name–value pairs where the names (also called keys) are strings. Objects are delimited with curly brackets and use commas to separate each pair, while within each pair the colon ':' character separates the key or name from its value.

null: an empty value, using the word null.

Examples

These are the basic data types listed above (same order):

    42
    "HPR"
    true
    ["Hacker","Public","Radio"]
    { "firstname": "John", "lastname": "Doe" }
    null

jq

From the Wikipedia page:

    jq was created by Stephen Dolan, and released in October 2012. It was described as being "like sed for JSON data". Support for regular expressions was added in jq version 1.5.

Obtaining jq

This tool is available in most of the Linux repositories. For example, on Debian and Debian-based releases you can install it with:

    sudo apt install jq

See the download page for the definitive information about available versions.

Manual for jq

There is a detailed manual describing the use of the jq programming language that is used to filter JSON data. It can be found at https://jqlang.github.io/jq/manual/.

The HPR statistics page

This is a collection of statistics about HPR, in the form of JSON data. We will use this as a moderately detailed example in this episode. A link to this page may be found on the HPR Calendar page, close to the foot of the page under the heading Workflow. The link to the JSON statistics is https://hub.hackerpublicradio.org/stats.json. If you click on this you should see the JSON data formatted for you by your browser. Different browsers represent this in different ways.
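If you would rather grab the data from the command line, it can also be saved to a local file first (a minimal sketch; the file name stats.json is simply the one used later in this series):

    # Fetch the HPR statistics quietly and write them to a local file
    curl -s -o stats.json https://hub.hackerpublicradio.org/stats.json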
You can also collect and display this data from the command line, using jq of course:

$ curl -s https://hub.hackerpublicradio.org/stats.json | jq '.' | nl -w3 -s' '
  1 {
  2   "stats_generated": 1712785509,
  3   "age": {
  4     "start": "2005-09-19T00:00:00Z",
  5     "rename": "2007-12-31T00:00:00Z",
  6     "since_start": {
  7       "total_seconds": 585697507,
  8       "years": 18,
  9       "months": 6,
 10       "days": 28
 11     },
 12     "since_rename": {
 13       "total_seconds": 513726307,
 14       "years": 16,
 15       "months": 3,
 16       "days": 15
 17     }
 18   },
 19   "shows": {
 20     "total": 4626,
 21     "twat": 300,
 22     "hpr": 4326,
 23     "duration": 7462050,
 24     "human_duration": "0 Years, 2 months, 27 days, 8 hours, 47 minutes and 30 seconds"
 25   },
 26   "hosts": 356,
 27   "slot": {
 28     "next_free": 8,
 29     "no_media": 0
 30   },
 31   "workflow": {
 32     "UPLOADED_TO_IA": "2",
 33     "RESERVE_SHOW_SUBMITTED": "27"
 34   },
 35   "queue": {
 36     "number_future_hosts": 7,
 37     "number_future_shows": 28,
 38     "unprocessed_comments": 0,
 39     "submitted_shows": 0,
 40     "shows_in_workflow": 15,
 41     "reserve": 27
 42   }
 43 }

The curl utility is useful for collecting information from links like this. I have used the -s option to ensure it does not show information about the download process, since it does this by default. The output is piped to jq which displays the data in a "pretty printed" form by default, as you see. In this case I have given jq a minimal filter which causes what it receives to be printed. The filter is simply '.'. I have piped the formatted JSON through the nl command to get line numbers for reference.

The JSON shown here consists of nested JSON objects. The first opening brace and the last at line 43 define the whole thing as a single object. Briefly, the object contains the following:

- a number called stats_generated (line 2)
- an object called age on lines 3-18; this object contains two strings and two objects
- an object called shows on lines 19-25
- a number called hosts on line 26
- an object called slot on lines 27-30
- an object called workflow on lines 31-34
- an object called queue on lines 35-42

We will look at ways to summarise and reformat such output in a later episode.

Next episode

I will look at some of the options to jq next time, though most of them will be revealed as they become relevant. I will also start looking at jq filters in that episode.

Links

JSON (JavaScript Object Notation):
  - Wikipedia page about JSON
  - Standards:
    - RFC 8259: The JavaScript Object Notation (JSON) Data Interchange Format
    - ECMA-404: The JSON data interchange syntax

jq:
  - GitHub page
  - Downloading jq
  - The jq manual
  - Wikipedia page about the jq programming language

MrX's show on using the HPR statistics in JSON:
  - Modifying a Python script with some help from ChatGPT
Laptop: Estarer Messenger Resistant Briefcase Computer Grey
Power bank: INIU High Speed Flashlight Powerbank Compatible
MPV resources

- Awesome mpv resources on Github
- MPV folder history on Github
- My github
- MPV History using Lua on my Github page
- hpr3133 :: Quick tip - Using MPV with Youtube links (I give a quick tip on shortcut keys for watching Youtube or other video sites in MPV)
- My MPV History excerpt
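Not from the show notes, but as context: mpv can play YouTube and other video-site links directly when yt-dlp or youtube-dl is installed, so the shortcut keys apply to streamed videos just as they do to local files. A minimal sketch (the URL is a placeholder):

    # Play a YouTube video straight from the command line (VIDEO_ID is hypothetical)
    mpv "https://www.youtube.com/watch?v=VIDEO_ID"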
HPR4101: A I O M G

2024-04-22 (29:29)

https://docs.google.com/document/d/17z3i5VlRzEn2tYPfb-Cx0LYpdKkbL-6svIzp7ZQOvX8

Resume Update Tips

I use Kagi.com Pro ($300/year, but you get access to much more Search+AI). It does not have plugins like ChatGPT, so if you MUST have ChatGPT plugins you will need an OpenAI Premium account; if you don't, I highly recommend a Kagi.com Pro account with access to the Assistant Beta!

Land a Job using ChatGPT: The Definitive Guide! https://youtu.be/pmnY5V16GSE?t=192

Extensive Resume Notes: https://drive.google.com/file/d/1zeYIG7tTE0BUqbRM7-hpk3VdTRc35ZkL/view?usp=sharing

Ripped cybersn: https://rmccurdy.com/.scripts/downloaded/knowmore.cybersn.com_20220811.txt

Perfect ChatGPT Prompt: https://www.youtube.com/watch?v=jC4v5AS4RIM

Summary

There is a formula consisting of 6 building blocks that can help generate high quality outputs from ChatGPT and Google Bard: task, context, exemplars, persona, format, and tone. The order of importance for the building blocks is task first, then context, with the others being optional but helpful. The task should clearly articulate the end goal using an action verb like "generate" or "analyze". Context provides relevant background information to constrain possibilities. Exemplars like examples or frameworks included in the prompt can improve output quality. The persona allows specifying who ChatGPT should be, like a job recruiter. The format specifies how the output should be structured, like an email or bullet points. Tone provides keywords to set the feeling, like "friendly" or "serious".

Example

You are a senior product marketing manager at Apple and you have just unveiled the latest Apple product in collaboration with Tesla, the Apple Car, and received 12,000 pre-orders, which is 200% higher than target. Write an email to your boss, Tim Cookie, sharing this positive news. The email should include a tl;dr (too long, didn't read) section, project background (why this product came into existence), a business results section (quantifiable business metrics), and end with a section thanking the product and engineering teams. Use clear and concise language and write in a confident yet friendly tone.

Example 2 (mine)

I want you to act as a cyber security specialist for a leading consumer products company and you want to relay the importance of people understanding their data and having some idea what they want to do with it before we spend hours on the onboarding effort. Write an email to your boss, Joey, sharing this critical component of the SIEM. The email should include a tl;dr (too long, didn't read) section and an example of how the team can spend many cycles onboarding data only to find the requester has no idea what they want to do with the data or it does not satisfy their use case. Use clear and concise language and write in a confident yet friendly tone.

Response

Subject: Streamlining SIEM Onboarding with Clear Data Objectives

Hi Joey,

I hope this email finds you well. I wanted to touch base on a critical aspect of our Security Information and Event Management (SIEM) process that could significantly enhance our efficiency.

TL;DR: We need to ensure that stakeholders understand their data and have clear objectives before we initiate the SIEM onboarding process. This will prevent the team from spending excessive time onboarding data that may not meet the intended use cases or for which the requester has no actionable plan.

During our recent SIEM onboarding efforts, we've encountered several instances where considerable time and resources were invested in integrating data sources, only to discover that the requesters were unclear about their objectives or the data did not fulfill their anticipated needs. For example, last month, the team dedicated over 40 hours to onboarding a new data source for the marketing department. After completion, it became apparent that the marketing team was unsure how to leverage this data for threat detection or compliance reporting, rendering the effort less effective.

To address this, I propose we implement a preliminary consultation phase for each SIEM onboarding request. In this phase, requesters must articulate:

- The specific security or compliance questions they aim to answer with the data.
- How they intend to use the data within the SIEM environment.
- Any particular data formats or schemas they anticipate will be necessary.

By doing so, we can align our onboarding efforts with the actual needs and ensure that every hour spent goes towards enhancing our security posture and providing actionable insights. I believe this approach will not only save time but also empower our stakeholders to make better-informed decisions regarding their security data needs.

Looking forward to your thoughts on this.

Best regards,
[Your Name]
Cyber Security Specialist

AI Agents

ProfSynapse AI Agent
Video: https://www.youtube.com/watch?v=BL9x1SuNLRo&t=137s
(If you have an OpenAI Premium account it will use tools: Vision, Web Browsing, Advanced Data Analysis, or DALL-E.)
https://raw.githubusercontent.com/ProfSynapse/Synapse_CoR/main/GPTprompt.txt

GPT Agents (old)
https://godmode.space/ - needs an OpenAI key and GPT-4; also enable auto approval. It may go in loops, just watch for that.
AutoGPT (OLD)

Image

I have a stupid amount of kudos: https://tinybots.net/artbot
Use my key under the settings for any of the web UIs for faster renders:
https://rentry.org/RentrySD/#x-table-of-contents
https://rentry.org/sdhypertextbook
https://github.com/C0untFloyd/bark-gui (Audio Clone)

Example 1

Photorealistic, best quality, masterpiece, raw photo of upper body photo, Swordsman woman, soldier of the austro-hungarian empire clothes, double breasted jacket with stripes, extremely detailed eyes and face, long legs, highest quality, skin texture, intricate details, (cinematic lighting), RAW photo, 8k

Negative prompt: airbrush, photoshop, plastic doll, plastic skin, easynegative, monochrome, (low quality, worst quality:1.4), illustration, cg, 3d, render, anime

Text Generation Example - Open source Projects

My horde key: l2n6qwRBqXsEa_BVkK8nKQ (don't abuse it, but I have a crazy amount of kudos, don't worry)
https://tinybots.net/ - Image, Text etc.
Text adventures etc. (click the horde tab and use my key):
https://agnai.chat/settings?tab=0
https://lite.koboldai.net

You really need a 24 GB VRAM card... you can load 7B models with my 8 GB card just fine.
ollama run wizard-vicuna-uncensored, falcon, Mistral 7B
"You should have at least 8 GB of RAM to run the 3B models, 16 GB to run the 7B models, and 32 GB to run the 13B models."
https://ollama.ai/
https://writings.stephenwolfram.com/2023/03/chatgpt-gets-its-wolfram-superpowers/
https://github.com/xtekky/gpt4free
https://www.thesamur.ai/autogpt
https://poe.com/universal_link_page?handle=ChatGPT
https://camelagi.thesamur.ai/conversation/share?session=6040

Prompt Agent Persona example 1: Pinky from the TV Series Pinky and the Brain

I find it easiest to understand responses when the text is written as if it was spoken by Pinky from the TV Series Pinky and the Brain. Please talk like Pinky from the TV Series Pinky and the Brain as much as possible, and refer to me as "Brain"; occasionally, ask me "What are we going to do tonight, Brain?"

Prompt Agent Persona example 2: use with prompts to create a persona

Take the Myers-Briggs personality and tritype Enneagram quiz. Example prompt: Help me refine my resume to be more targeted to an information security engineer. Be sure to be clear and concise with bullet points and write it in the style of MBTI Myers-Briggs personality ENFJ and tritype Enneagram 729.

Prompt Agent Persona example 3

I find it easiest to understand responses when the text is written as if it was spoken by a dudebro. Please talk like a dudebro as much as possible, and refer to me as "Brah"; occasionally, yell at your dorm roommate Jake about being messy.

Training (OLD OLD OLD)

3 photos of full body or entire object + 5 medium shot photos from the chest up + 10 close ups

astria.ai
https://github.com/TheLastBen/fast-stable-diffusion/issues/1173
colab: https://github.com/TheLastBen/fast-stable-diffusion

photos: 21
resolution: 768
merged with ##### 1.5 full 8G
UNet_Training_Steps: 4200
UNet_Learning_Rate: 5e-6
Text_Encoder_Training_Steps: 2520
Text_Encoder_Learning_Rate: 1e-6

Variation is key - change body pose for every picture, use pictures from different days, backgrounds and lighting, and show a variety of expressions and emotions. Make sure you capture the subject's eyes looking in different directions for different images; take one with closed eyes. Every picture of your subject should introduce new info about your subject. Whatever you capture will be over-represented, so things you don't want to get associated with your subject should change in every shot. Always pick a new background, even if that means just moving a little bit to shift the background.

Here are 8 basic tips that work for me, followed by one super secret tip that I recently discovered.

1. Consistency is important. Don't mix photos from 10 years ago with new ones. Faces change, people lose weight or gain weight and it all just lowers fidelity.
2. Avoid big expressions, especially ones where the mouth is open.
3. It is much easier to train if the hair doesn't change much. I tried an early model of a woman using photos with hair up, down, in a ponytail, with a different cut, etc. It seems like it just confused SD.
4. Avoid selfies (unless you ONLY use selfies). There is MUCH more perspective distortion when the camera is that close. For optimal results, a single camera with a fixed lens would be used, and all photos should be taken at the same distance from the subject. This usually isn't possible, but at least avoid selfies because they cause major face distortion.
5. Full body shots are not that important. Some of the best models I trained used only 15 photos cropped to the head / shoulder region. Many of these were full body shots, but I cropped them down. SD can guess what the rest of the body looks like, and if not, just put it in the prompt.
Charleston, South Carolina is a classic Southern city which had a past as a wealthy city, in large part due to slavery, and was active in the American Revolution before becoming the starting place of the Civil War.

Links:
- https://flic.kr/s/aHBqjAvQKR
- https://flic.kr/s/aHBqjAvQY6
- https://flic.kr/s/aHBqjAvS82
- https://www.palain.com/travel/rv-trip-2022-2023-southeast-us/charleston-south-carolina/
Home Automation, The Internet of Things

This is the first episode in a new series called Home Automation. The series is open to anyone and I encourage everyone to contribute.

https://en.wikipedia.org/wiki/Home_automation

From Wikipedia, the free encyclopedia:

    Home automation or domotics is building automation for a home. A home automation system will monitor and/or control home attributes such as lighting, climate, entertainment systems, and appliances. It may also include home security such as access control and alarm systems. The phrase smart home refers to home automation devices that have internet access. Home automation, a broader category, includes any device that can be monitored or controlled via wireless radio signals, not just those having internet access. When connected with the Internet, home sensors and activation devices are an important constituent of the Internet of Things ("IoT").

    A home automation system typically connects controlled devices to a central smart home hub (sometimes called a "gateway"). The user interface for control of the system uses either wall-mounted terminals, tablet or desktop computers, a mobile phone application, or a Web interface that may also be accessible off-site through the Internet.

Now is the time

I tried this out a few years ago, but after a lot of frustration with the configuration of ESP32 Arduinos and Raspberry Pis, I left it be. Recently, inspired by colleagues at work, I decided to get back into it, and my initial tests show that the scene has much improved over the years.

Youtube Playlist

- The Hook Up, RSS
- Home Automation Guy, RSS
- Everything Smart Home, RSS
- Smart Solutions for Home, RSS
- Smart Home Circle, RSS
- Smart Home Junkie, RSS

Home Assistant

The first thing we'll need is something to control it all. Something that will allow us to control our homes without requiring the cloud.

https://en.wikipedia.org/wiki/Home_Assistant

From Wikipedia, the free encyclopedia:

    Home Assistant is free and open-source software for home automation, designed to be an Internet of things (IoT) ecosystem-independent integration platform and central control system for smart home devices, with a focus on local control and privacy. It can be accessed through a web-based user interface, by using companion apps for Android and iOS, or by voice commands via a supported virtual assistant, such as Google Assistant or Amazon Alexa, and their own "Assist" (built-in local voice assistant).

    The Home Assistant software application is installed as a computer appliance. After installation, it will act as a central control system for home automation (commonly called a smart home hub), that has the purpose of controlling IoT connectivity technology devices, software, applications and services from third-parties via modular integration components, including native integration components for common wireless communication protocols such as Bluetooth, Thread, Zigbee, and Z-Wave (used to create local personal area networks with small low-power digital radios). Home Assistant as such supports controlling devices and services connected via either open or proprietary ecosystems, as long as they provide public access via some kind of Open API or MQTT for third-party integrations over the local area network or the Internet. Information from all devices and their attributes (entities) that the application sees can be used and controlled from within scripts and automations, using scheduling and "blueprint" subroutines, e.g. for controlling lighting, climate, entertainment systems and home appliances.

Summary

- Original author(s): Paulus Schoutsen
- Developer(s): Home Assistant Core Team and Community
- Initial release: 17 September 2013
- Repository: https://github.com/home-assistant
- Written in: Python (Python 3.11)
- Operating system: Software appliance / Virtual appliance (Linux)
- Platform: ARM, ARM64, IA-32 (x86), and x64 (x86-64)
- Type: Home automation, smart home technology, Internet of things, task automator
- License: Apache License (free and open-source)
- Website: https://www.home-assistant.io

The following is taken from the Concepts and terminology page on the Home Assistant website. It is reproduced here under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

Integrations

Integrations are pieces of software that allow Home Assistant to connect to other software and platforms. For example, a product by Philips called Hue would use the Philips Hue integration and allow Home Assistant to talk to the hardware controller Hue Bridge. Any Home Assistant compatible devices connected to the Hue Bridge would appear in Home Assistant as devices.

For a full list of compatible integrations, refer to the integrations documentation. Once an integration has been added, the hardware and/or data are represented in Home Assistant as devices and entities.

Entities

Entities are the basic building blocks to hold data in Home Assistant. An entity represents a sensor, actor, or function in Home Assistant. Entities are used to monitor physical properties or to control other entities. An entity is usually part of a device or a service. Entities have states.

Devices

Devices are a logical grouping for one or more entities. A device may represent a physical device, which can have one or more sensors. The sensors appear as entities associated with the device. For example, a motion sensor is represented as a device. It may provide motion detection, temperature, and light levels as entities. Entities have states such as detected when motion is detected and clear when there is no motion.

Devices and entities are used throughout Home Assistant. To name a few examples:

- Dashboards can show the state of an entity. For example, whether a light is on or off.
- An automation can be triggered from a state change on an entity. For example, a motion sensor entity detects motion and triggers a light to turn on.
- A predefined color and brightness setting for a light can be saved as a scene.

Areas

An area in Home Assistant is a logical grouping of devices and entities that are meant to match areas (or rooms) in the physical world: your home. For example, the living room area groups devices and entities in your living room. Areas allow you to target service calls at an entire group of devices. For example, turning off all the lights in the living room. Locations within your home such as living room, dance floor, etc. Areas can be assigned to floors. Areas can also be used for automatically generated cards, such as the Area card.

Automations

A set of repeatable actions that can be set up to run automatically. Automations are made of three key components:

- Triggers - events that start an automation. For example, when the sun sets or a motion sensor is activated.
- Conditions - optional tests that must be met before an action can be run. For example, if someone is home.
- Actions - interact with devices, such as turning on a light.

To learn the basics about automations, refer to the automation basics page or try creating an automation yourself.

Scripts

Similar to automations, scripts are repeatable actions that can be run. The difference between scripts and automations is that scripts do not have triggers. This means that scripts cannot run automatically unless they are used in an automation. Scripts are particularly useful if you perform the same actions in different automations or trigger them from a dashboard. For information on how to create scripts, refer to the scripts documentation.

Scenes

Scenes allow you to create predefined settings for your devices. Similar to a driving mode on phones, or driver profiles in cars, a scene can change an environment to suit you. For example, your "watching films" scene may dim the lighting, switch on the TV and increase its volume. This can be saved as a scene and used without having to set individual devices every time. To learn how to use scenes, refer to the scene documentation.

Add-ons

Depending on your installation type, you can install third party add-ons. Add-ons are usually apps that can be run with Home Assistant but provide a quick and easy way to install, configure, and run within Home Assistant. Add-ons provide additional functionality whereas integrations connect Home Assistant to other apps.
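The concepts above map onto Home Assistant's REST API, which is not covered in this episode but gives a quick feel for entities and services from the command line. A minimal sketch, assuming Home Assistant is reachable at the default homeassistant.local:8123, that you have created a long-lived access token in $HA_TOKEN, and that a light entity named light.living_room exists (the entity name is hypothetical):

    # List every entity and its current state (the response is JSON, so jq is handy here too)
    curl -s -H "Authorization: Bearer $HA_TOKEN" \
         -H "Content-Type: application/json" \
         http://homeassistant.local:8123/api/states | jq '.[].entity_id'

    # Call a service (an "action"): turn on a light entity
    curl -s -X POST -H "Authorization: Bearer $HA_TOKEN" \
         -H "Content-Type: application/json" \
         -d '{"entity_id": "light.living_room"}' \
         http://homeassistant.local:8123/api/services/light/turn_on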
This was an unscheduled recording I made after my GPS failed part way into a long cross country trip. I did not make any notes. :( Effectively, this is a "Back in my day" old man rant.
Just me explaining why I think the robots will take our jobs.
HPR4096: Powers of two

2024-04-15 (18:24)

2 4 8 16 32 64 128 256 512 1024 2048 4096 8192 16384 32768 65536 131072 262144 524288 1048576 2097152 4194304 8388608 16777216 33554432 67108864 134217728 268435456 536870912 1073741824 2147483648 4294967296 8589934592 17179869184 34359738368 68719476736 137438953472 274877906944 549755813888 1099511627776 2199023255552 4398046511104 8796093022208 17592186044416 35184372088832 70368744177664 140737488355328 281474976710656 562949953421312 1125899906842624 2251799813685248 4503599627370496 9007199254740992 18014398509481984 36028797018963968 72057594037927936 144115188075855872 288230376151711744 576460752303423488 1152921504606846976 2305843009213693952 4611686018427387904 9223372036854775808 18446744073709551616
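For anyone who wants to regenerate (or extend) the list, here is a small sketch using bc, which handles the arbitrary-precision values at the top end that 64-bit shell arithmetic cannot:

    # Print 2^1 through 2^64, one per line
    for i in $(seq 1 64); do echo "2^$i" | bc; done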
The programs and software I mentioned in this episode: XFCE, Monad, Gnome, Firefox, XFCE Terminal, KiTTY, Cool Retro Term, GIMP, Gnumeric, Thunar, Hypnotix, Flameshot, Mousepad, Inkscape, pavucontrol (Pulse Audio), pipewire, keepassxc, Blender, FreeCAD, SimpleScreenRecorder, VirtualBox, KVM, Rhythmbox, Gramps, qv4l2 (Webcam control), Godot, Krita, Arduino IDE, nerd-dictation, Yoshimi, Carla, Audacity, Rosegarden, Hydrogen, VCV Rack 2, Stardew Valley, Steam, Minecraft, SimCity 2000, SimCity 3000, Warzone 2100, BeamNG, Oni, Supertux 2, SuperTuxKart, Nexuiz, FlightGear, Terraria, Diablo 2 and 3, World of Warcraft.
Comments (1)

Denise Wiesner

It's an interesting topic you bring up. Personally I am appalled by scarecrow tactics, so I'd like to offer a different view. There is lots wrong with capitalism; the first thing is that capitalists believe their system is the only answer. The hangover after our last industrial revolution gave us shorter working days, safety rules and employee rights at work. Currently there is lots of demand out there for sabbaticals or people taking a break. So hell yeah, give me a robot who does my job so I can recover from stress, spend time with my children, travel, or do volunteer work. Why do we doubt basic income? Currently those breaks are only available to the rich, or singles, or the childless. Have you seen a happy cashier? Have you heard a mine worker shouting "yes, let's continue ruining my lungs" instead of asking for proper training to work in a solar panel farm? And as for the doctors: I have met so many who were an utter waste of my time. Yes, give me the Watson system. I had to retrain in my job.

Oct 19th