Although... we don't actually host Linux Presentation Day here in Norway, I'd like to make people aware of this event however I can. So to the few who actually read this blog: you're welcome ;) :P
https://linux-presentation-day.org/
I was reading this article a couple of weeks ago, and sure was tempted to get one...
Yes, I am a weak individual when it comes to g33k-marketing, I know this... 😅
So... I ended up shelling out the wet stinky, and it is on its way in the post 😋
[Image: Ubuntu on Nintendo Switch]
Continuous Integration is the practice of constantly merging development work with a Master/Trunk/Mainline branch, so that you can test changes and verify that those changes work with other changes. The idea here is to test your code as often as possible so you can catch issues early on. In the continuous integration process, most of the work is done by automated tests, which require a unit-test framework. It is best practice to have a build server designed specifically for performing these tests, so your development team can continue merging requests even while tests are being performed... Yes, automation here is key.
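Just to illustrate the idea (a minimal sketch, not tied to any particular CI product; the repository URL and the Gradle test command below are placeholder assumptions, not a real project), a build server could run something along these lines on every merge to the mainline:
#!/usr/bin/env bash
# Hypothetical CI step: fetch the freshly merged mainline and run the unit tests.
set -euo pipefail
# Assumed repository URL and branch - placeholders only.
REPO_URL="https://example.com/our-app.git"
WORKDIR="$(mktemp -d)"
git clone --depth 1 --branch master "$REPO_URL" "$WORKDIR"
cd "$WORKDIR"
# Assumed test command; substitute whatever your unit-test framework uses.
if ./gradlew test; then
    echo "Unit tests passed - merge is good."
else
    echo "Unit tests failed - flag the merge for a fix." >&2
    exit 1
fi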
...Continuous Delivery is the continual delivery of code to an environment once the developer feels the code is ready to ship - this could be UAT (User Acceptance Testing), staging or production. The idea behind continuous delivery is that you're constantly delivering code to a user base, whether it be QA or customers directly, for continual review and inspection. Although similar to continuous integration, continuous delivery differs because it can feed in business-logic tests, catching the business-logic and design issues that unit tests alone cannot.
...Continuous Deployment is the deployment or release of code to production as soon as it's ready. There is no large batching in staging nor a long UAT (User Acceptance Testing) process before production. Any testing is done prior to merging to the mainline branch and is performed on production-like environments. The production branch is always stable and ready to be deployed by an automated process. The automated process is key because it should be able to be performed by anyone in a matter of minutes (preferably by the press of a button). And after all that comes log-auditing after deployment: checking whether key metrics are influenced negatively or positively by the change(s).
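As a rough sketch of that "press of a button" (the script names and the health-check URL here are made-up placeholders, not an actual setup), the whole thing could boil down to something like:
#!/usr/bin/env bash
# Hypothetical one-button production deploy with a post-deploy check.
set -euo pipefail
APP_VERSION="${1:?usage: deploy.sh <version>}"
# Assumed deploy hook - in reality this would call your orchestration tooling.
./scripts/push-to-production.sh "$APP_VERSION"
# Log-auditing / key-metric check after deployment (assumed health endpoint).
if curl -fsS "https://example.com/healthz" >/dev/null; then
    echo "Deployed ${APP_VERSION} and the health check passed."
else
    echo "Post-deploy check failed - rolling back." >&2
    ./scripts/rollback.sh
    exit 1
fi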
Yes, it's been an often-discussed topic in Norwegian media in recent years:
"Lack of security-professionals."
Well, as commented in this (Norwegian) article, BY a security professional: there seems to be a lack of security-oriented IT professionals, but not because they aren't out there at all. They are. What is seriously lacking in this scenario is competence in the recruiting firms looking for this kind of expertise. Always has been.
Computer security is not a fixed-set field, AT ALL, even though a lot of so-called "professionals" seem to be stuck on the idea that it is.
Serious professionals who want to work in this field, on the other hand, are (often) painfully aware of what it actually entails:
http://www.businessinsider.com/microsoft-azure-sphere-is-powered-by-linux-2018-4?r=US&IR=T&IR=T
"After 43 years, this is the first day that we are announcing, and will be distributing, a custom Linux kernel," Microsoft President Brad Smith said.
Serverless computing refers to the concept of building and running applications that do not require server management. It describes a finer-grained deployment model where applications, bundled as one or more functions, are uploaded to a platform and then executed, scaled, and billed in response to the exact demand needed at the moment.
http://www.zdnet.com/article/servers-we-dont-need-no-stinkin-servers/
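To make the "no server management" point a bit more concrete, here is a hedged sketch using the AWS CLI with Lambda; the function name, IAM role ARN and zip file are placeholder assumptions, not anything from a real deployment:
#!/usr/bin/env bash
# Hypothetical serverless deploy: upload a zipped function, then invoke it on demand.
# No servers to provision or patch - the platform scales and bills per invocation.
set -euo pipefail
# Placeholders: function name, an existing IAM role ARN and a zipped handler.
aws lambda create-function \
    --function-name hello-serverless \
    --runtime python3.9 \
    --role arn:aws:iam::123456789012:role/lambda-exec-role \
    --handler handler.main \
    --zip-file fileb://function.zip
# Invoke it; capacity is spun up only for this call.
aws lambda invoke --function-name hello-serverless response.json
cat response.json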
Yeah, being cutting edge does not always buy you any extra security... this I know.
#!/usr/bin/env bash
# Print the CPU model name (first 'model name' entry in /proc/cpuinfo, prefix stripped).
echo "cpuinfo : $(awk -F': ' '/model name/ {print $2; exit}' /proc/cpuinfo)"
# Show which hardware bugs (spectre, meltdown, etc.) the kernel reports for this CPU.
grep -i -m1 'bugs' /proc/cpuinfo
[Image: Ubuntu Desktop]
[Image: Linux Mint (MATE edition)]
Yes, it seems I finally landed my dream job =D It's not work, it's fun :)
I get to work with innovative, bleeding-edge technology every day, on my own terms, with all the tech benefits I could wish for. Not to mention my contract benefits and office benefits (sponsored gourmet coffee, UV-filtered water and Red Bull).
[Image: "Red Bull fridge"]
Robots and artificial intelligence create opportunities of enormous dimensions.
We have developed the market-leading virtual assistant, which has taken the European market by storm. The company is experiencing rapid growth and we are looking for new colleagues to join us at our office in Stavanger, Norway. Become a part of what the World Economic Forum calls "The Fourth Industrial Revolution"!
As an employee in our company, you become part of a young and dynamic environment with a high degree of freedom for self-development. Your colleagues have exceptional expertise in data and technology, and all developers get access to state-of-the-art equipment.
Our technology stack today consists of Linux, PostgreSQL, Apache, Varnish, Java, Spring, Grails, Groovy, Gradle, Python, Javascript, Lua, Torch, IntelliJ IDEA, Git and Amazon Web Services (hence the aws-amazon links above). We will use any required additional technologies in the future to solve whatever new challenges may arise.
The Linux Foundation reports that Linux runs 90 percent of the public cloud workload, 82 percent of the world's smartphones, 62 percent of the embedded market, oh and a mere 99 percent of the supercomputer market. All that rests on the Linux kernel.
Good news :)
"The Linux kernel is one of the largest and most successful open-source projects that has ever come about. The huge rate of change and number of individual contributors show that it has a vibrant and active community, constantly causing the evolution of the kernel in response to number of different environments it is used in. This rate of change continues to increase, as does the number of developers and companies involved in the process; thus far, the development process has proved that it is able to scale up to higher speeds without trouble."
http://www.zdnet.com/article/whos-building-linux-in-2017/
Over the past year, the kernel has been updated through merged changesets, new drivers, hardening and testing.
http://www.zdnet.com/article/microsoft-says-40-percent-of-all-vms-in-azure-now-are-running-linux/
http://sdtimes.com/report-interest-linux-kernel-remains-strong/
:P WAT? Green/Blue?
My new job requires a lot of automation to be put in place for us in DevOps to be able to manage the huge deployment workloads we get. I am currently in the process of automating our setup to a degree where we only have to initiate deployment processes without having to monitor them as they execute :P (at least compared to before I came in).
Blue/Green Deployment in Datacenter
How it Works
In the "blue-green" model, two identical application stacks are maintained (for convenience, I will refer to our application server instances and service instances as a stack, database deployments are done separately . One of the stacks is always live, let us say green is live and blue is stand-by.
When we want to deploy a new build to production, we deploy it on the blue stack and do thorough testing. Once the new build seems to work fine, the blue stack goes live and green becomes the standby. This lets us do a quick rollback if issues are identified during deployment or the sanity check.
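Just to sketch the switch itself (an illustration only, assuming the live stack is selected via an nginx upstream-config symlink; all paths and names are made up, and see the challenges below for why this moment has to be quick and clean):
#!/usr/bin/env bash
# Hypothetical blue/green switch: repoint the load balancer at the standby stack.
set -euo pipefail
TARGET="${1:?usage: switch-stack.sh blue|green}"
# Assumed layout: one nginx upstream config per stack, with a symlink marking the live one.
ln -sfn "/etc/nginx/upstreams/${TARGET}.conf" /etc/nginx/upstreams/live.conf
# Validate and reload gracefully - this is the near-zero-downtime moment.
nginx -t && nginx -s reload
echo "${TARGET} is now live; the other stack becomes standby, kept around for quick rollback."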
What are the challenges?
The tricky bit here is to switch between blue and green stacks with almost no downtime.
Maintaining two stacks is an overhead in itself, and hence adds to the cost.
https://medium.com/devops-learnings/blue-green-deployments-in-data-center-e704f93f9f70
I just got a long-awaited shipment from the UK:
[Image: Amiga "Rainbow Double-Check"-logo T-shirt]
[Image: Workbench 'Insert diskette' screen T-shirt]
[Image: Commodore Amiga 500 - 'System 1']
[Image: Commodore Amiga Workbench 1.3 (with "Boing" running in the background)]
[Image: Commodore Amiga 'Insert diskette' load-screen]
[Image: Commodore Amiga Guru Meditation, the Amiga equivalent of a BSOD on Windows or a kernel panic on Linux]
[Image: Commodore Amiga external Power Supply Unit]
[Image: Amiga Unix - System V Release 4 Amiga Version 2.0 Boot / Root installation diskettes & tape]
Happy Birthday Linux! Over a quarter of a century (26 years) old, still going strong and conquering area after area in various technology segments 😋😊😃
I've been itching to write this blogpost for a while...
I'm an avid retro-gamer, as well as a contemporary gamer. My emulation-antics took off in the late 90s when I started getting nostalgic about old DOS-games from the late 80s and early 90s. For the most part, DOS-emulation was pretty accurate even in the early days.
But these days emulation is pretty much on par with native. CPU-cycle imitation and other emulation techniques have largely reached the runtime levels of the actual systems they are emulating.
I don't favor the new trend of releasing limited special editions of consoles, like the NES Classic Mini and the SNES Classic Mini, when you can build a COTS computer (or even use a retired laptop/desktop), load it up with RetroPie and ROMs, and basically have a multi-emulation box that runs EVERYTHING and can be CUSTOMIZED.
If you go for a Raspberry Pi 3, such a system could even cost you as little as $30 (apart from cables and gamepads / arcade sticks; with those it would cost you a minimum of $50).
Sure, a few people argue that the RPi3 is a fad, or that the SD-card gets worn out so it's not made to last, etc.
Well, a $30 credit-card-sized computer isn't really that hard to replace, and SD cards are a dime a dozen these days and keep getting cheaper, not to mention being quite easy to back up (by making a copy-image on your computer's hard drive / USB drive).
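For reference, backing up the whole card really is a one-liner (a sketch only; /dev/sdX is a placeholder, so make very sure you point it at the right device):
#!/usr/bin/env bash
# Clone the entire RetroPie SD card to an image file on the hard drive / USB drive.
# /dev/sdX is a placeholder - double-check the device name with lsblk first!
sudo dd if=/dev/sdX of=retropie-backup.img bs=4M status=progress
# Restoring to a fresh card is the same command with if= and of= swapped.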
Sure, you cannot, for example, use exotic console hardware, like the microphone gamepad for the Japanese NES. Nor can you play ROMs made from Super FX game paks (Nintendo's add-on technology to render pseudo-3D on the SNES). Then again, would you want to? Hey, if that's your thing, have at it. I don't give a flying f**k...
Emulation is king - imho.