22 October, 2018

Linux Presentation Day!

Although... we don't actually host Linux Presentation Day here in Norway, I'd like to make people aware of the event however I can. So to the few who actually read this blog: you're welcome ;) :P

https://linux-presentation-day.org/

13 September, 2018

Full-range monitor speakers


Klipsch RP-160M monitor speakers with Scandinavian-produced stands 👍 (NorStone).

I have never experienced monitors that kick you in the nuts with the woofers, but these definitely do! Both on hard drop-kicks and on low-frequency oscillations (LFO).

Not to mention the mid-range, which sounds fantastic on these speakers. Klipsch's patented LTS (Linear Travel Suspension) horn-loaded tweeters also help define the high notes better.

Although they are rated at only 100W / 8Ω, they get seriously loud, even without distortion!

Thumbs up! 👍👍👍

Panorama(-ish) picture of my rig(s):

10 August, 2018

Ubiquiti UniFi

Recently I acquired a Ubiquiti UniFi AP-AC-PRO (1750 Mbps) wireless access point, on recommendation from a friend in the business, who told me these things have ALL the pro features of the more expensive APs around. And I have to admit, they... just work 👌 and very, very well I might add.


Even Linus Torvalds (yes, THAT Torvalds) uses these, because they are incredibly configurable for both simple and (very) advanced setups, and rock-solid in operation.


PoE+ (48V/24W) is required though, and the USB interface is misleading: it does not power the unit at all. Ubiquiti does not include a PoE+ injector with their APs, and does not specify this on the packaging either. But that was the only annoyance about them.


For the unit-price, you get a lot of features and functions compared to other "prosumer" choices.

And best of all: it runs Linux. Yes, this prosumer wifi-ap runs open-source software.


P.S. - 17th of August:

Acquired another AP for better coverage, as some rooms in the basement have sound isolation in the ceilings. The signal got better, and now reaches all the intended rooms.

Also enabled mesh-networking (even though it was a beta-feature), and it got even better!

Ubiquiti FTW!

20 July, 2018

ZipStick®


1987 ZipStick® joystick. This yellow-buttoned joystick uses micro-switches and has triple-fire action, and is THE best joystick I have ever used / abused. It can withstand practically ANYTHING!

Got hold of a near-mint unit (a few scratches on the housing and some dirt beneath the screws), which tested OK and working on both an Amiga A600 and an Amiga A1200. SCORE!

Amiga-gaming will be a pleasure with this accessory! ;) :D

18 May, 2018

Switch

I was reading this article a couple of weeks ago, and sure was tempted to get one...

Yes, I am a weak individual to g33k-marketing, I know this... 😅

So... I ended up shelling out the wet stinky, and it is on its way in the post 😋

Ubuntu on Nintendo Switch
Yes. Indeed. It will be used for what its main function is supposed to be... But I will also tinker and experiment with this gadget to my heart's content 😅 😎



Update Monday, May 28th:

Oh HELLZ YEAH!
Nintendo Switch Red/Blue JoyCons


Nintendo Switch + 8Bitdo NES30 Pro Bluetooth gamepad



02 May, 2018

Continuous Integration and Deployment

Continuous Integration is the practice of constantly merging development work with a Master/Trunk/Mainline branch so that you can test changes, and test that those changes work with other changes. The idea here is to test your code as often as possible so you can catch issues early on. In the continuous integration process, most of the work is done by automated tests, which requires a unit test framework. It is best practice to have a build server designed specifically for performing these tests, so your development team can continue merging requests even while tests are being performed...
Yes, automation here is key.
...Continuous Delivery is the continual delivery of code to an environment once the developer feels the code is ready to ship - this could be UAT (User Acceptance Testing), staging or production. The idea behind continuous delivery is that you’re constantly delivering code to a user base, whether it be QA or directly to customers for continual review and inspection. Although similar to continuous integration, continuous delivery differs because it can feed business logic tests where unit tests are unable to catch all business logic, particularly design issues.

...Continuous Deployment is the deployment or release of code to production as soon as it’s ready. There is no large batching in staging nor a long UAT (User Acceptance Testing) process before production. Any testing is done prior to merging to the Mainline branch and is performed on production-like environments. The production branch is always stable and ready to be deployed by an automated process. The automated process is key because it should be able to be performed by anyone in a matter of minutes (preferably by the press of a button).
And after all that, log auditing after deployment; checking key metrics to see whether they are influenced negatively or positively by the change(s).

In the ideal workflow, the entire process could be automated from start to finish:

  • Step 1: Developer checks code into the development branch.
  • Step 2: The continuous integration server picks up the change, merges it with Master/Trunk/Mainline, performs unit tests, and votes on the merge to the staging environment based on the test results.
  • Step 3: If Step 2 is successful, the developer deploys the change to the staging environment and QA tests the environment.
  • Step 4: If Step 3 passes, you vote to move to production, and the continuous integration server picks this up again and determines whether it's OK to merge into production.
  • Step 5: If Step 4 is successful, the change is deployed to the production environment.

This process varies slightly based on needs, requirements and approaches.
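The steps above can be sketched as a single script. This is a hypothetical, minimal illustration of the stage-gating idea (each stage must pass before the next runs); the stage names and commands are placeholders, not a real CI system:

```shell
#!/usr/bin/env bash
# Minimal sketch of a gated pipeline: every stage must succeed
# before the next one runs, mirroring Steps 1-5 above.
set -euo pipefail

run_stage() {
  # Run a named stage; abort the whole pipeline on the first failure.
  local name="$1"; shift
  echo "== ${name} =="
  "$@" || { echo "${name} FAILED - stopping pipeline"; exit 1; }
}

run_stage "unit-tests"   echo "running unit tests on merged Mainline"
run_stage "deploy-stage" echo "deploying build to staging"
run_stage "qa-tests"     echo "running QA tests against staging"
run_stage "deploy-prod"  echo "deploying build to production"

echo "pipeline OK"
```

A real setup would replace the echo placeholders with test runners and deployment commands, but the control flow is the same: fail fast, promote only on success.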

24 April, 2018

Need for security-professionals in Norway

Yes, it's been an often-discussed topic in Norwegian media in recent years:

"Lack of security-professionals."

Well, as commented in this (Norwegian) article, BY a security professional: there seems to be a lack of security-oriented IT professionals, but not because they aren't out there at all. They are. What is seriously lacking in this scenario is competence in the recruiting firms looking for this kind of expertise. Always has been.

Computer security is not a fixed-set field, AT ALL. Even though a lot of so-called "professionals" seem to be stuck on the idea that it is.

Serious professionals who want to work in this field, on the other hand, are (often) painfully aware of what it actually entails to do so:

  • constant refreshing on networking / computing / vulnerability security in IT
  • vulnerability monitoring of the software the company often uses
  • simple awareness of the fact that nobody is ever 100% secure

Computer security is a balancing act: does securing something vulnerable affect normal operations? Or is the fix / security measure absolutely needed for normal operations? These are everyday obstacles a security professional has to deal with on a regular basis, so they have to be quite flexible about expanding their knowledge base, and often.

These points are often completely missed by recruiters. They don't look for ability / knowledge / flexibility; they tend to look only at academic degrees (preferably multiple(!)), gender, published articles / blog posts and other unrelated (and often quite unrealistic) demands for the position(s) in question.

Then, they complain about not finding any candidates for their outrageous requirements.

Seriously, redefine your demands / requirements to a more realistic level, and maybe you'll find a competent person to do the job. But you most certainly will NOT find the dream candidate with the kind of demands currently set as standard.

17 April, 2018

when PIGS FLY!!

"After 43 years, this is the first day that we are announcing, and will be distributing, a custom Linux kernel," Microsoft President Brad Smith said
http://www.businessinsider.com/microsoft-azure-sphere-is-powered-by-linux-2018-4?r=US&IR=T&IR=T

Yeah, well, OSS / Linux won...

18 March, 2018

8Bitdo NES30 Pro


8Bitdo is a company specializing in custom retro game-controllers (gamepads). Their motto is: "Everything Old Is New Again".


Recently I picked up a pair of the NES30 Pro edition. Not too expensive either for what you get: a pro-grade game controller (ergo: it can handle some abuse) made for use with practically ANY retro (and even today's) game systems!


Compatible with: Linux (desktop+RPi3), Mac OS X, Windows, Nintendo Switch, Android and iOS!


Everything I've tested them on works as advertised, and even some systems that weren't listed! Easily paired over Bluetooth or connected via USB 2.0.

Best thing: its firmware can even be upgraded through desktop-Linux!

15 February, 2018

Meltdown/Spectre + BSD


https://malcont.net/2018/01/dont-like-meltdown-spectre-releated-bugs-handled/

"Serverless Architecture"


Serverless computing refers to the concept of building and running applications that do not require server management. It describes a finer-grained deployment model where applications, bundled as one or more functions, are uploaded to a platform and then executed, scaled, and billed in response to the exact demand needed at the moment.
http://www.zdnet.com/article/servers-we-dont-need-no-stinkin-servers/

If you are an administrator, serverless architecture may be something to look into ASAP, as well as Functions-as-a-Service (FaaS) ;)

12 February, 2018

DevOps + Development

DevOps is not easy.

As software transitions from a monolithic to a microservice architecture, organizations are adopting DevOps practices to accelerate delivery of features to customers and improve their experience.

Jumping into continuous testing without the right infrastructure, tools, and processes can be a disaster.

Continuous testing plays an important role in achieving the fastest quality to market. Continuous testing requires several levels of monitoring with automated triggers, collaboration, and actions. Here’s what is required:

  • Automatic Test Triggers to execute tests as software transitions from various stages – development / test / staging / production
  • Service Health Monitoring to automate feedback on failures
  • Test Result Monitoring to automate feedback on failures
  • Identifying Root Cause of Failure and analyzing test results

As one can imagine, this takes a hell of a toll on DevOps personnel.
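The "automated feedback on failures" point can be sketched in a few lines of shell. This is purely illustrative: the log file name and its PASS/FAIL format are made up, and the alert is just an echo where a real setup would page someone or open a ticket:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of test-result monitoring with automated feedback:
# scan a (made-up) result log and emit one alert per failed test.
set -euo pipefail

results="/tmp/test-results.log"
printf '%s\n' \
  "PASS login-service smoke-test" \
  "FAIL checkout-service timeout after 30s" \
  "PASS search-service smoke-test" > "$results"

# Trigger feedback for every failure; a real pipeline would notify
# the owning team instead of printing to stdout.
grep '^FAIL' "$results" | while read -r _ service reason; do
  echo "ALERT: ${service}: ${reason}"
done
```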

It is one of the most challenging fields today, simply because it requires a deep understanding of the principles, processes and practices that the DevOps philosophy brings to the IT world.

Because that is what it is: a philosophy.

05 January, 2018

Meltdown & Spectre --update--

Yeah, sometimes being cutting edge does not grant you any extra security... This I know.

That a hardware vulnerability has gone unchecked for a couple of decades, however, eluded even me. Even more so that it wasn't addressed / announced until very recently.

Turns out, almost every computing device I own has these bugs. And I find myself doing what very many others do with vulnerable equipment that has little to no chance of being patched; I just isolate it.

Don't get me wrong, I've taken measures and patched / disabled low-level functions as best I could. But when the issue is basically invisible (ring -3), there's limits to what I can do to fix it.

The ass-hats who made this mess have to fix it properly, or someone considerably smarter than me has to do what they can to mitigate it as far as the circumstances allow.

Which, from what I understand, isn't much, and it's massively complicated to boot. Those complications are the reason for the "considerable performance slow-down" that will result from the software fixes to the issue.


***UPDATE***

Seems these bugs / vulnerabilities have been blown totally out of proportion for the average computer-user.

Slowdowns only present themselves under huge workloads (think Big Data databases, enterprise computing, etc.), so the average Joe won't even notice any difference... I've been pretty busy patching / fixing my affected systems lately, both at work and at home, and I can't say I've noticed any significant slowdowns in any way.

Not that I've got huge workloads, or globe-spanning database queries running 24/7, but I've definitely got bigger and heavier workloads than the average user.

--- If people just patch their systems regularly, they'll be fine ---




If you want to be sure you actually have the bugs, you can run this bash script on Linux systems:
#!/usr/bin/env bash
# Print the CPU model name (first core only)
grep -m1 'model name' /proc/cpuinfo
# Print the bug flags the kernel has detected for this CPU
# (e.g. cpu_meltdown, spectre_v1, spectre_v2)
grep -m1 -i 'bugs' /proc/cpuinfo


02 November, 2017

Desktop-Linux + CLI?

"If there's one thing surrounding Linux usage that bothers me more than anything else, it's when the detractors say you cannot work with Linux without knowing the command line. This is a bit of FUD — fear, uncertainty, and doubt — that keeps new users from giving the open source platform a try. I'm here, right now, to dispel that myth."
Ubuntu Desktop
Linux Mint (MATE edition)

From personal experience, I can attest to this statement. The days when you HAD to deal with the command line even on desktop Linux distros are past. Only server distros demand it nowadays.


https://www.techrepublic.com/article/yes-you-can-use-linux-without-knowing-the-command-line/

01 November, 2017

To my doubters

Yes, seems I finally landed my dream-job =D It's not work, it's fun :)

I get to work with innovating and bleeding-edge technology every day, at my own terms, with all the tech-benefits I could wish for. Not to mention my contract-benefits, and office-benefits (sponsored gourmet coffee, UV-filtered water and Red Bull).

"Red Bull-fridge"


Working with people as smart as I do is, well, very rewarding in itself. Finally, people at my own level! People who actually appreciate open source and Linux!

To all my doubters and nay-sayers:
how's that Microsoft-programming job working out for you?

26 October, 2017

Cloud Engineer + DevOps

The Cloud

A cloud engineer is an IT professional responsible for any technological duties associated with cloud computing, including design, planning, management, maintenance and support.

The cloud engineer position can be broken into multiple roles, including cloud architect, cloud software engineer, cloud security engineer, cloud systems engineer and cloud network engineer.

Each position focuses on a specific type of cloud computing, rather than the technology as a whole. Companies that hire cloud engineers are often looking to deploy cloud services or further their cloud understanding and technology.

https://aws.amazon.com/what-is-cloud-computing/



DevOps

DevOps is the combination of cultural philosophies, practices, and tools that increases an organization’s ability to deliver applications and services at high velocity: evolving and improving products at a faster pace than organizations using traditional software development and infrastructure management processes. This speed enables organizations to better serve their customers and compete more effectively in the market.



From my employer's website [edit/red.]:
 Robots and artificial intelligence create opportunities of enormous dimensions. 

 We have developed the market-leading virtual assistant, who has taken the European market by storm. The company is experiencing rapid growth and we are looking for new colleagues to join us at our office in Stavanger, Norway. Become a part of what the World Economic Forum calls "The Fourth Industrial Revolution"! 

 As an employee in our company, you become a part of a young and dynamic environment with a high degree of freedom for self-development. Your colleagues have exceptional expertise in data and technology, and all developers get access to state of the art equipment. 

 Our technology stack today consists of Linux, PostgreSQL, Apache, Varnish, Java, Spring, Grails, Groovy, Gradle, Python, Javascript, Lua, Torch, IntelliJ IDEA, Git and Amazon Web Services (hence the aws-amazon links above). We will use any required additional technologies in the future to solve whatever new challenges may arise. 

25 October, 2017

In 2017, Linux rules computing

 The Linux Foundation reports that Linux runs 90 percent of the public cloud workload, 82 percent of the world's smartphones, 62 percent of the embedded market, oh and a mere 99 percent of the supercomputer market. All that rests on the Linux kernel.

"The Linux kernel is one of the largest and most successful open-source projects that has ever come about. The huge rate of change and number of individual contributors show that it has a vibrant and active community, constantly causing the evolution of the kernel in response to the number of different environments it is used in. This rate of change continues to increase, as does the number of developers and companies involved in the process; thus far, the development process has proved that it is able to scale up to higher speeds without trouble."

http://www.zdnet.com/article/whos-building-linux-in-2017/
 Good news :)

Over the past year, the kernel has been updated through merged changesets, new drivers, hardening and testing.

http://sdtimes.com/report-interest-linux-kernel-remains-strong/
http://www.zdnet.com/article/microsoft-says-40-percent-of-all-vms-in-azure-now-are-running-linux/

19 October, 2017

Green / Blue - Blue / Green

:P WAT? Green/Blue?

Blue/Green Deployment in Datacenter

How it Works

In the "blue-green" model, two identical application stacks are maintained (for convenience, I will refer to our application server instances and service instances as a stack; database deployments are done separately). One of the stacks is always live; let us say green is live and blue is standby.

When we want to deploy a new build to production, we deploy it on the blue stack and test it thoroughly. Once the new build seems to work fine, the blue stack goes live and green becomes standby. This lets us do a quick rollback if issues are identified during deployment or sanity checks.

What are the challenges?

The tricky bit here is switching between the blue and green stacks with almost no downtime.
The overhead is maintaining two stacks, which adds to the cost.

https://medium.com/devops-learnings/blue-green-deployments-in-data-center-e704f93f9f70
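One common way to do the blue/green switch on a single host is an atomic symlink flip. This is a hypothetical sketch with made-up paths; in a real setup the web server or load balancer would point at the "live" link. Note that `mv -T` is GNU coreutils, so this is Linux-specific:

```shell
#!/usr/bin/env bash
# Hypothetical blue/green switch via atomic symlink replacement.
set -euo pipefail

root=$(mktemp -d)
mkdir -p "$root/app-green" "$root/app-blue"
ln -s "$root/app-green" "$root/live"   # green starts out live

switch_live() {
  # Repoint "live" at the given stack in one atomic rename, so there
  # is no window where clients see a missing or half-updated link.
  ln -sfn "$1" "$root/live.tmp" && mv -T "$root/live.tmp" "$root/live"
}

# New build was deployed and tested on blue - flip it live:
switch_live "$root/app-blue"
readlink "$root/live"   # now points at app-blue
```

Rollback is the same operation in reverse: flip "live" back to the green stack.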
My new job requires a lot of automation to be in place for us in DevOps to be able to manage the huge deployment workloads we get. I am currently automating our setup to the point where we only have to initiate deployment processes, without having to monitor them as they execute :P at least compared to before I came in.

AUTOMATE ALL THE THINGS!

27 September, 2017

AMiGA fanboy; guilty.

I just got an awaited shipment from the UK:

Amiga "Rainbow Double-Check"-logo T-shirt

Workbench | 'Insert diskette' screen T-shirt

Representin' ;D

After re-playing a heap of my all-time favorite Amiga-games on my RetroPie (RPi3), I thought I'd do a little post honoring the 80s/90s gaming-computer. Mainly, the Amiga 500.

No other computer-system has ever had this big an impact on my life, my preferences and my esthetic sense.

To give a little intro about the Amiga 500: it was launched as a compact, home game-computer in the mid-1980s. The Amiga 500 had everything from motherboard/CPU to add-on-chips and keyboard integrated into one plastic casing.

People mostly hooked it up to their TV with an extra peripheral called an RF modulator, but it was also sold with a Commodore-branded CRT monitor (see picture below) that even sported integrated stereo speakers(!).

Commodore Amiga 500 - 'System 1'

At the time (late 80s) the Amiga was unmatched in animation and sound, mostly thanks to its custom chipset (separate chips co-working in a processing chain), but also thanks to a massive (for the time) homebrew scene (the demo-scene).

The Amiga 500 demonstrated graphics powers unmatched by similar 8-/16-bit systems at the time. With over 300,000 units sold by 1990, it had established itself as a massively popular home computer across Europe. Due to scepticism and poor advertising, however, it did not do as well in the US.

Amiga set itself apart from other computer systems, just like Apple. For example, the PC-dominating three-finger salute is performed differently on Amiga systems: Ctrl + Left Amiga + Right Amiga.

It also sported co-processors (additional processors other than the CPU) for various tasks, which made this computer way ahead of its time. Today's computers almost always include co-processors (like GPUs), much like the Amiga series did back in the 80s and 90s. The only difference is that the co-processors in the Amigas were DIP chips soldered directly onto the motherboard, whereas modern GPUs are self-contained expansion cards with their own processors and graphics RAM, which slot into modern motherboards' PCI Express slots and use the motherboard's bridge connections to communicate with the CPU(s) / RAM.

The 90s brought the demo-scene community that grew out of the 80s, eagerly showcasing the Amiga as a competitive computer system by making byte-sized (68000 assembly-coded), elaborately animated and scored demonstration applications, coined "demos"; hence, the demo-scene.



One of my most memorable demos was this "cracktro" by Unit A:


Another more recent (2010) demo by Razor1911:




The Amiga was also one of the last true ROM-based computers: the operating system was located on an on-board ROM chip, much like on our Android/iOS smartphones today and other micro-computers of the late 80s (Apple), since hard drives cost an arm and a leg in those days. The accompanying diskette(s) contained the Workbench graphical desktop environment by itself. Original, to say the least.

The Amiga Workbench was a multi-tasking desktop environment. Shown running a background-application (Boing) behind the Workbench-UI.

Commodore Amiga Workbench 1.3 (w/"Boing" running in back)
It also loaded and ran games straight from 3½ inch floppy disks (usually 880 KB format). The game-data was loaded from the floppy into Amiga-memory and consequently executed.

The picture below shows the screen the Amiga displays if you didn't load Workbench from floppy, indicating you had to insert a diskette for something to happen.

Commodore Amiga 'Insert diskette' load-screen
Commodore Amiga Boing Ball, animated gif representation
--------
It was a major feat when made in the 80s. One animation done with bit-planes, one sound sample played at varying speeds and sample rates. Altogether it made an animated scene with a bouncing ball in a pseudo-3D grid room.

Commodore Amiga Guru Meditation
The Amiga equivalent of a BSOD on Windows and kernel panic on Linux

The only real caveats with this series of computers were the diskette drive(s) and the power supply unit (PSU).

Commodore Amiga external Power Supply Unit



As a curiosity, here is a rather rare picture from wikipedia (https://en.wikipedia.org/wiki/Amiga_Unix):

Amiga Unix - System V Release 4 Amiga Version 2.0
Boot / Root installation-diskettes & tape



As a treat, here are the best Amiga videos I've found on YouTube:



Turrican II: The Final Fight




Amiga Story | Part 1 + Part 2






25 September, 2017

Bleeding-edge Android

Yes, I'm slow at adopting new tech nowadays. Finally got updated to Android v7.x/8.x. Got my step-brother's old Nexus 5X, and yes; I know this particular model has hardware issues.

This one however has been fixed. Faulty components have been replaced and a new warranty issued.



I didn't really worry about using an outdated phone (Nexus 5), since I had secured it as much as I would "be allowed". Yes, allowed. Our corporate overlords control more than you might think...

29 August, 2017

Happy Birthday Linux!

Happy Birthday Linux! Over a quarter of a century old (26 yrs), still going strong and conquering area after area in various technology segments 😋😊😃

Yes, I know he didn't announce it on the newsgroups on this day; he did that on August 25th.

But, as he says himself, he considers both dates as birthdays. The first as the announcement-date, and the second (today, August 29th) as the upload-date.

15 June, 2017

Back 2 School

Looks like I'll be a student again this fall 😉 "precourse in maths and physics for engineering" 😏 after that, probably a BSc/MSc in Computer Science.

Yes... I didn't finish a degree when I attended private college earlier; it didn't work out, so I started working instead 😎

Now, I'm attending the University of Stavanger (UiS).

06 June, 2017

Retro-gaming

I've been itching to write this blogpost for a while...

I'm an avid retro-gamer, as well as a contemporary gamer. My emulation-antics took off in the late 90s when I started getting nostalgic about old DOS-games from the late 80s and early 90s. For the most part, DOS-emulation was pretty accurate even in the early days.

But these days emulation is pretty much a native-speed thing. CPU-cycle imitation and other emulation techniques have pretty much reached the runtime levels of the actual systems they emulate.

I don't favor the new trend of releasing limited special editions of consoles, like the NES Classic Mini and the SNES Classic Mini, when you can build a COTS computer (or even use a retired laptop/desktop), load it up with RetroPie and ROMs, and basically have a multi-emulation box that runs EVERYTHING and can be CUSTOMIZED.

If you go for a Raspberry Pi 3, such a system could even cost you as low as $30 (apart from cables and gamepads / arcade-sticks, then it would cost you a minimum of $50).

Sure, a few people argue that the RPi3 is a fad, or that the SD-card gets worn out so it's not made to last, etc.

Well, a $30 credit-card computer isn't really that hard to replace, and SD-cards are a dime a dozen these days and are even getting cheaper, not to mention quite easy to back up (making a copy-image on your computer harddrive / usb-drive).

Sure, you cannot use exotic console hardware, like the microphone gamepad for the Japanese NES (Famicom). Nor can you play ROMs made from Super FX game paks (Nintendo add-on technology to render pseudo-3D on the SNES). Then again, would you want to? Hey, if that's your thing, have at it. I don't give a flying f**k...

Emulation is king - imho.