15 February, 2018
"Serverless Architecture"
Serverless computing refers to the concept of building and running applications that do not require server management. It describes a finer-grained deployment model where applications, bundled as one or more functions, are uploaded to a platform and then executed, scaled, and billed in response to the exact demand needed at the moment.
http://www.zdnet.com/article/servers-we-dont-need-no-stinkin-servers/
If you are an administrator, serverless architecture may be something to look into ASAP, as well as Functions-as-a-Service (FaaS) ;)
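To make the "no server management" point concrete, here is a minimal sketch of calling an already-deployed function on a FaaS platform (AWS Lambda via the AWS CLI in this case; the function name and payload are made up for illustration):
#!/usr/bin/env bash
# Hypothetical example: invoke a deployed function and read its result.
# No servers to provision, patch or scale; the platform handles all of that.
aws lambda invoke \
    --function-name hello-faas \
    --payload '{"who": "sysadmins"}' \
    response.json
cat response.json
# Note: newer AWS CLI versions may need --cli-binary-format raw-in-base64-out for raw JSON payloads.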
12 February, 2018
DevOps + Development
As software transitions from a monolithic to a microservice architecture, organizations are adopting DevOps practices to accelerate delivery of features to customers and improve their experience.
Jumping into continuous testing without the right infrastructure, tools, and processes can be a disaster. At a minimum, you need:
- Automatic Test Triggers to execute tests as software moves through the various stages – development / test / staging / production (a minimal sketch follows this list)
- Service Health Monitoring to automate feedback on failures
- Test Result Monitoring to automate feedback on failures
- Identifying Root Cause of Failure and analyzing test results
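As a rough idea of what an automatic test trigger can look like, here is a minimal sketch of a post-deploy hook. The test-runner script and the alert webhook URL are hypothetical placeholders, not specific tools:
#!/usr/bin/env bash
# Hypothetical post-deploy hook: run smoke tests against the stage we just promoted to,
# and push automated feedback to a chat/alert webhook if anything fails.
STAGE="${1:-staging}"
if ./run_smoke_tests.sh "$STAGE"; then
    echo "Smoke tests passed on $STAGE"
else
    curl -s -X POST -H 'Content-Type: application/json' \
         -d "{\"text\": \"Smoke tests FAILED on $STAGE\"}" \
         "$ALERT_WEBHOOK_URL"
    exit 1
fi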
05 January, 2018
Meltdown & Spectre --update--
Yeah, being cutting edge doesn't guarantee any extra security... this I know.
That a hardware vulnerability could go unchecked for a couple of decades, however, eluded even me. Even more so that it wasn't addressed or announced until very recently.
Turns out almost every computing device I own has these bugs. So I find myself doing what so many others do with vulnerable equipment that has little to no chance of being patched: I just isolate it.
Don't get me wrong, I've taken measures and patched / disabled low-level functions as best I could. But when the issue is basically invisible (ring -3), there are limits to what I can do to fix it.
The ass-hats who made the shit have to fix it properly, or someone considerably smarter than me has to do what they can to mitigate as the circumstances allow.
Which, from what I understand, isn't much, and it's massively complicated to boot. Those complications are the reason for the "considerable performance slow-down" that will result from the software fixes to the issue.
***UPDATE***
--- If people just patch their systems regularly, they'll be fine ---
If you want to be sure you actually have the bugs, you can run this bash-script on Linux systems:
#!/usr/bin/env bash
# Print the CPU model name (first 'model name' line in /proc/cpuinfo)
echo "cpuinfo : $(grep -m1 'model name' /proc/cpuinfo | cut -d':' -f2- | sed 's/^ *//')"
# Print the hardware bugs the kernel has detected for this CPU
# (look for cpu_meltdown, spectre_v1 and spectre_v2 in the output)
grep -i -m1 'bugs' /proc/cpuinfo
15 November, 2017
All your supercomputers are belong to us
It was a matter of when, not if...
http://www.zdnet.com/article/linux-totally-dominates-supercomputers/
Norwegian article: https://www.digi.no/artikler/na-kjorer-alle-verdens-supermaskiner-pa-topp-500-listen-samme-operativsystem-dette-er-totalt-dominas/412214
02 November, 2017
Desktop-Linux + CLI?
[Image: Ubuntu Desktop]
[Image: Linux Mint (MATE edition)]
From personal experience, I can attest to this statement. The days when you HAD to deal with the command line even on desktop Linux distros are past. Only server distros demand this nowadays.
https://www.techrepublic.com/article/yes-you-can-use-linux-without-knowing-the-command-line/
01 November, 2017
To my doubters
Yes, it seems I finally landed my dream job =D It's not work, it's fun :)
I get to work with innovative and bleeding-edge technology every day, on my own terms, with all the tech-benefits I could wish for. Not to mention my contract benefits and office benefits (sponsored gourmet coffee, UV-filtered water and Red Bull).
[Image: "Red Bull fridge"]
Working with people as smart as my colleagues is, well, very rewarding in itself. Finally, people at my own level! People who actually appreciate open source and Linux!
To all my doubters and nay-sayers:
how's that Microsoft-programming job working out for you?
26 October, 2017
Cloud Engineer + DevOps
The cloud engineer position can be broken into multiple roles, including cloud architect, cloud software engineer, cloud security engineer, cloud systems engineer and cloud network engineer.
Each position focuses on a specific type of cloud computing, rather than the technology as a whole. Companies that hire cloud engineers are often looking to deploy cloud services or further their cloud understanding and technology.
https://aws.amazon.com/what-is-cloud-computing/
DevOps is the combination of cultural philosophies, practices, and tools that increases an organization’s ability to deliver applications and services at high velocity: evolving and improving products at a faster pace than organizations using traditional software development and infrastructure management processes. This speed enables organizations to better serve their customers and compete more effectively in the market.
From my employer's website [edited]:
Robots and artificial intelligence create opportunities of enormous dimensions.
We have developed the market-leading virtual assistant, which has taken the European market by storm. The company is experiencing rapid growth and we are looking for new colleagues to join us at our office in Stavanger, Norway. Become a part of what the World Economic Forum calls "The Fourth Industrial Revolution"!
As an employee in our company, you become a part of a young and dynamic environment with a high degree of freedom for self-development. Your colleagues have exceptional expertise in data and technology, and all developers get access to state of the art equipment.
Our technology stack today consists of Linux, PostgreSQL, Apache, Varnish, Java, Spring, Grails, Groovy, Gradle, Python, Javascript, Lua, Torch, IntelliJ IDEA, Git and Amazon Web Services (hence the aws-amazon links above). We will use any required additional technologies in the future to solve whatever new challenges may arise.
25 October, 2017
In 2017, Linux rules computing
The Linux Foundation reports that Linux runs 90 percent of the public cloud workload, 82 percent of the world's smartphones, 62 percent of the embedded market, oh and a mere 99 percent of the supercomputer market. All that rests on the Linux kernel.
Good news :)
"The Linux kernel is one of the largest and most successful open-source projects that has ever come about. The huge rate of change and number of individual contributors show that it has a vibrant and active community, constantly causing the evolution of the kernel in response to number of different environments it is used in. This rate of change continues to increase, as does the number of developers and companies involved in the process; thus far, the development process has proved that it is able to scale up to higher speeds without trouble."
http://www.zdnet.com/article/whos-building-linux-in-2017/
Over the past year, the kernel has been updated through merged changesets, new drivers, hardening and testing.
http://www.zdnet.com/article/microsoft-says-40-percent-of-all-vms-in-azure-now-are-running-linux/
http://sdtimes.com/report-interest-linux-kernel-remains-strong/
19 October, 2017
Green / Blue - Blue / Green
:P WAT? Green/Blue?
My new job requires a lot of automation set in place for us in DevOps to be able to manage the huge deployment workloads we get. I am currently in the process of automating our setup to a degree where we only have to initiate deployment processes without having to monitor them as they execute :P (at least compared to before I came in).
Blue/Green Deployment in Datacenter
How it Works
In the "blue-green" model, two identical application stacks are maintained (for convenience, I will refer to our application server instances and service instances as a stack, database deployments are done separately . One of the stacks is always live, let us say green is live and blue is stand-by.
When we want to deploy a new build to production, we deploy it on the blue stack and do thorough testing. Once the new build seems to work fine, the blue stack goes live and green becomes standby. This helps us do a quick rollback if issues are identified during deployment or sanity checks.
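As a minimal sketch of how the switch itself can be done with next to no downtime, assuming a reverse proxy (nginx here) whose live upstream config is just a symlink to either a blue or a green config file (paths and file names are illustrative, not our actual setup):
#!/usr/bin/env bash
# Flip the "live" symlink between the blue and green upstream configs, then reload nginx.
LIVE_LINK=/etc/nginx/conf.d/live-upstream.conf
if [[ "$(readlink "$LIVE_LINK")" == *blue* ]]; then
    TARGET=/etc/nginx/upstreams/green.conf
else
    TARGET=/etc/nginx/upstreams/blue.conf
fi
ln -sfn "$TARGET" "$LIVE_LINK"        # atomically repoint the symlink
nginx -t && systemctl reload nginx    # validate the config, then reload without dropping connections
echo "Now live: $TARGET"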
What are the challenges?
The tricky bit here is to switch between blue and green stacks with almost no downtime.
Another overhead is maintaining two stacks, which adds to the cost.
https://medium.com/devops-learnings/blue-green-deployments-in-data-center-e704f93f9f70
AUTOMATE ALL THE THINGS!
27 September, 2017
AMiGA fanboy; guilty.
I just got an awaited shipment from the UK:
[Image: Amiga "Rainbow Double-Check" logo T-shirt]
[Image: Workbench 'Insert diskette' screen T-shirt]
Representin' ;D
After re-playing a heap of my all-time favorite Amiga-games on my RetroPie (RPi3), I thought I'd do a little post honoring the 80s/90s gaming-computer. Mainly, the Amiga 500.
No other computer-system has ever had this big an impact on my life, my preferences and my esthetic sense.
To give a little intro about the Amiga 500: it was launched as a compact, home game-computer in the mid-1980s. The Amiga 500 had everything from motherboard/CPU to add-on-chips and keyboard integrated into one plastic casing.
People mostly hooked it up to their TV with an extra peripheral called the RF modulator, but it was also sold with a Commodore-branded CRT monitor (see picture below) that even sported integrated stereo speakers(!).
[Image: Commodore Amiga 500 - 'System 1']
At the time (late 80s) the Amiga was unmatched in animation and sound, mostly due to its use of specialty chain-chips (separate chips co-working in a process-chain), but also thanks to a massive (for the time) homebrew-scene (the demo-scene).
The Amiga 500 demonstrated graphics-powers unmatched by similar 8-/16-bit systems at the time. With over 300,000 units sold by 1990, it had established itself as a massively popular home-computer across Europe. Due to scepticism and poor advertising, however, it never did as well in the US.
The Amiga set itself apart from other computer systems, just like Apple. For example, the PC-dominating three-finger salute is performed differently on Amiga systems: Ctrl + Left Amiga + Right Amiga.
It also sported co-processors (additional processors other than the CPU) for various tasks, which made this computer way ahead of its time. Today's computers, for example, almost always include co-processors (like GPUs), much like the Amiga series did back in the 80s and 90s. The only difference being that the co-processors in the Amigas were DIP chips soldered directly to the motherboard, whereas modern GPUs are self-contained expansion cards with their own processors and video RAM, slotting into a motherboard's PCI Express slots and communicating with the CPU(s) / RAM over the north/south bridge connections.
The 90s brought the demo-scene community that had grown out of the 80s, eagerly showcasing the Amiga as a competitive computer system by making byte-sized (68000 assembly-coded), elaborately animated and scored demonstration applications, coined "demos" – hence the name demo-scene.
The Amiga was also one of the last true ROM-based computers: the core of the operating system lived on an on-board ROM chip (much like today's Android/iOS smartphones, and like other micro-computers of the late 80s such as Apple's – hard drives cost an arm and a leg in those days), while the Workbench graphical desktop environment shipped separately on accompanying diskette(s). Original, to say the least.
The Amiga Workbench was a multi-tasking desktop environment, shown below running a background application (Boing) behind the Workbench UI.
[Image: Commodore Amiga Workbench 1.3 (with "Boing" running in the background)]
The picture below shows the screen the Amiga displays if you haven't loaded Workbench from floppy, indicating you have to insert a diskette for something to happen.
[Image: Commodore Amiga 'Insert diskette' load-screen]
[Image: Commodore Amiga Guru Meditation – the Amiga equivalent of a BSOD on Windows or a kernel panic on Linux]
The only real weak points in this series of computers were the diskette drive(s) and the external power supply unit (PSU).
[Image: Commodore Amiga external Power Supply Unit]
As a curiosity, here is a rather rare picture from wikipedia (https://en.wikipedia.org/wiki/Amiga_Unix):
[Image: Amiga Unix – System V Release 4, Amiga Version 2.0 Boot/Root installation diskettes & tape]
As a treat, here are the best Amiga videos I've found on YouTube:
- Firstly: my all-time favorite game – Turrican II: The Final Fight
- Second: the Amiga Story told by The Nostalgia Nerd – Amiga Story | Part 1 + Part 2
- http://www.amiga.org/
- http://www.amigaworld.net/
- http://www.aminet.net/
- http://amigakit.leamancomputing.com/catalog/
25 September, 2017
Bleeding-edge Android
This one, however, has been fixed. Faulty components have been replaced and a new warranty issued.
I didn't really worry about using an outdated phone (Nexus 5), since I had secured it as much as I would "be allowed". Yes, allowed. Our corporate overlords control more than you might think...
29 August, 2017
Happy Birthday Linux!
Happy Birthday Linux! Over a quarter of a century (26 yrs) old, still going strong and conquering area after area in various technology segments 😋😊😃
Yes, I know Linus didn't announce it on the newsgroups on this day; he did that on August 25th.
But, as he says himself, he considers both dates as birthdays: the first as the announcement date, and the second (today, August 29th) as the upload date.
15 June, 2017
Back 2 School
Now, I'm attending the University of Stavanger (UiS).
06 June, 2017
Retro-gaming
I've been itching to write this blogpost for a while...
I'm an avid retro-gamer, as well as a contemporary gamer. My emulation-antics took off in the late 90s when I started getting nostalgic about old DOS-games from the late 80s and early 90s. For the most part, DOS-emulation was pretty accurate even in the early days.
But these days emulation pretty much feels native. Cycle-accurate CPU emulation and other emulation techniques have essentially reached the runtime level of the actual systems they imitate.
I don't favor the new trend of releasing limited special editions of consoles, like the NES Classic Mini and the SNES Classic Mini, when you can build a COTS computer (or even use a retired laptop/desktop), load it up with RetroPie and ROMs, and basically have a multi-emulation box that runs EVERYTHING and can be CUSTOMIZED.
If you go for a Raspberry Pi 3, such a system could cost you as little as $30 (add cables and gamepads / arcade sticks and it would cost you a minimum of $50).
Sure, a few people argue that the RPi3 is a fad, or that the SD-card gets worn out so it's not made to last, etc.
Well, a $30 credit-card-sized computer isn't really that hard to replace, and SD cards are a dime a dozen these days and are only getting cheaper, not to mention quite easy to back up by making a copy-image on your computer's hard drive / USB drive (see the sketch below).
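Backing up an SD card really is a one-liner. A sketch, assuming the card shows up as /dev/sdX (check with lsblk first; the device name is just a placeholder):
# Create a compressed image of the whole card
sudo dd if=/dev/sdX bs=4M status=progress | gzip > retropie-backup.img.gz
# Restore it later (to a card of the same size or larger):
# gunzip -c retropie-backup.img.gz | sudo dd of=/dev/sdX bs=4M status=progress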
Sure, you cannot, for example, use exotic console hardware, like the microphone gamepad for the Japanese NES. Nor can you play ROMs made from FX gamepaks (Nintendo add-on technology to render pseudo-3D on the SNES). Then again, would you want to? Hey, if that's your thing, have at it. I don't give a flying f**k...
Emulation is king - imho.
11 May, 2017
Phantom mic-mixer
The Behringer Xenyx 302USB is a nifty little piece of audio hardware :) A 5-channel mixer with a phantom-powered (48V) XLR microphone input / jack input, a line input and a 2-track input, coupled with a USB audio interface (full-duplex input/output).
It's a compact and really portable design that weighs nothing and sports a sturdy build quality I'm not used to seeing in equipment in this price range. Although the mic input could really use a filter (it picks up every hiss and click in the room), this can easily be remedied with a standalone hum-eliminator box.
All-round it is a great choice for podcasting, amateur recording, limited input-mixing, video / audio conferences and the like.
18 April, 2017
RetroPie
Since I started dabbling in Raspberry Pi SBCs, I've been testing a few emulation distros and the like. I've been playing around with this since June 2016, so a good 10 months by now I would say :P
One particular distro caught my interest while testing; namely RetroPie.
Excerpt from retropie.org.uk:
RetroPie allows you to turn your Raspberry Pi, ODroid C1/C2, or PC into a retro-gaming machine. It builds upon Raspbian, EmulationStation, RetroArch and many other projects to enable you to play your favourite Arcade, home-console, and classic PC games with the minimum set-up.
RetroPie lets you play virtually ALL arcade / console / PC games released in the period 1980-2000. Every ROM I've got in my library (around 10,000+ accumulated since the 1990s) works out of the box.
Tested a good few arcade- and console-roms, and they work flawlessly.
The only configuration it may require is a simple dialogue-based gamepad/joystick setup, and you're ready to game as you please :)
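If you'd rather install RetroPie on top of an existing Raspbian than flash the pre-built image, the setup-script route looks roughly like this (per the RetroPie docs at the time; double-check their current instructions):
# Fetch and run the RetroPie setup script on an existing Raspbian install
sudo apt-get update && sudo apt-get install -y git
git clone --depth=1 https://github.com/RetroPie/RetroPie-Setup.git
cd RetroPie-Setup
sudo ./retropie_setup.sh   # opens the dialogue-based installer/configurator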
06 April, 2017
HiFiBerry Digi+ Pro (for RPi)
Recently got a really nifty Pi-hat (addon-card for Raspberry Pi) called "HiFiBerry" - model "Digi+ Pro".
"Pi-hats" are simple single-PCB addon-cards that slot on top of Raspberry Pi line of single-board computers, and extends the functionality of the RPi's. The HiFiBerry slots easily on top of the GPIO pin-row, with the help of plastic risers (the white bolts/nuts pictured below).
With the HiFiBerry Digi+ Pro, I get audio output in 24-bit / 96 kHz "studio quality". NiCE! :)
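Getting it working on Raspbian was basically a matter of enabling the right device-tree overlay. A sketch, with the overlay name taken from HiFiBerry's documentation (verify against their current docs for your exact model):
# Enable the HiFiBerry Digi+ Pro overlay and disable the on-board audio
echo "dtoverlay=hifiberry-digi-pro" | sudo tee -a /boot/config.txt
sudo sed -i 's/^dtparam=audio=on/#dtparam=audio=on/' /boot/config.txt
sudo reboot
# After the reboot, the card should show up in the output of:  aplay -l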
06 December, 2016
Online VS Real Life
"In reality, politics have straddled the digital and meatspace for decades. Though government officials may have just learned about "the cyber," people working in computer security have been dealing with criminal and whimsical incursions into their systems since the late 20th century. It was 1990 when the infamous Operation Sundevil swept up innocents in a massive Secret Service dragnet operation to stop carders. The Stuxnet worm, which affected physical operations of centrifuges at a uranium enrichment plant in Iran, is only the most obvious example of how digital ops can have consequences away from the keyboard."
--Annalee Newitz, arstechnica.com
http://arstechnica.com/staff/2016/12/stop-pretending-theres-a-difference-between-online-and-real-life/
American documentary, regarding "Stuxnet":
https://www.youtube.com/watch?v=yY6kZlLqDhE
21 October, 2016
Dyn (Managed DNS) DDoS...
Yes, my sites were affected by this outage. A lot of big-name sites were too (Twitter, Spotify, GitHub, SoundCloud, Reddit, etc.).
And no, I don't use DNS management software like Netflix's Denominator; I really don't have a need for 24/7 uptime on any of my sites / sub-domains. They are strictly for demonstration, entertainment and hobby purposes.
http://www.zdnet.com/article/dyn-ddos-part-2-the-hackers-strike-back/
08 July, 2016
Console-gaming
Two words: corrupt, busted.
I lost all my consoles in a lightning strike back in March, and after using most of the money meant for replacement consoles on my computer upgrade instead, I'm NEVER going back to console gaming ever again...
They're completely overpriced, the games are as well, and games/apps get patched more slowly than on any other gaming platform.
Not to mention the hardware is usually 2-3 years behind current PC hardware, and usually under-powered as fuck. Actually, current-gen consoles (PS4 / Xbone) also utilize "Radeon-based" graphics (graphics accelerators), which is severely sub-par compared to almost any other graphics solution.
Nope!, spank you very much...
FUCK. CONSOLES.
29 June, 2016
Nvidia graphics-accelerators
After being an AMD fanboy for a long time, I must admit Nvidia has the graphics market quite cornered.
My latest upgrade included an EVGA GeForce GTX 960 4GB GDDR5 SSC ACX2.0+
The one I had before could barely keep up with the game development cycle of its time, let alone the newer developments.
So, after a lot of researching and testing at friends' places, I concluded Nvidia has developed the superior gaming tool(s). PhysX has no rival technologies (in PC hardware at least). As for the newer tech, HairWorks and GameWorks, I have no special opinions. They're there. Nuff said.
I'm rather more interested in the CUDA cores and the hardware accelerators, more specifically NVENC, the hardware-accelerated video encoder.
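For example, offloading an H.264 encode to the GPU through ffmpeg's NVENC encoder (assuming an ffmpeg build compiled with NVENC support; file names are placeholders):
# Hardware-accelerated H.264 encode on the GPU via NVENC
ffmpeg -i input.mkv -c:v h264_nvenc -preset slow -b:v 8M -c:a copy output.mp4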
Not to mention all the recent "woo-haw" around VR.