A cloud engineer is an IT professional responsible for any technological duties associated with cloud computing, including design, planning, management, maintenance and support.
The cloud engineer position can be broken into multiple roles, including cloud architect, cloud software engineer, cloud security engineer, cloud systems engineer and cloud network engineer. Each position focuses on a specific type of cloud computing, rather than the technology as a whole. Companies that hire cloud engineers are often looking to deploy cloud services or to deepen their understanding and use of cloud technology.
DevOps is the combination of cultural philosophies, practices, and tools that increases an organization’s ability to deliver applications and services at high velocity: evolving and improving products at a faster pace than organizations using traditional software development and infrastructure management processes. This speed enables organizations to better serve their customers and compete more effectively in the market.
Robots and artificial intelligence create opportunities of enormous dimensions. We have developed the market-leading virtual assistant, which has taken the European market by storm. The company is experiencing rapid growth and we are looking for new colleagues to join us at our office in Stavanger, Norway. Become a part of what the World Economic Forum calls "The Fourth Industrial Revolution"! As an employee in our company, you become part of a young and dynamic environment with a high degree of freedom for self-development. Your colleagues have exceptional expertise in data and technology, and all developers get access to state-of-the-art equipment. Our technology stack today consists of Linux, PostgreSQL, Apache, Varnish, Java, Spring, Grails, Groovy, Gradle, Python, JavaScript, Lua, Torch, IntelliJ IDEA, Git and Amazon Web Services (hence the aws-amazon links above). We will use any required additional technologies in the future to solve whatever new challenges may arise.
The Linux Foundation reports that Linux runs 90 percent of the public cloud workload, 82 percent of the world's smartphones, 62 percent of the embedded market, oh and a mere 99 percent of the supercomputer market. All that rests on the Linux kernel.
"The Linux kernel is one of the largest and most successful open-source projects that has ever come about. The huge rate of change and number of individual contributors show that it has a vibrant and active community, constantly causing the evolution of the kernel in response to number of different environments it is used in. This rate of change continues to increase, as does the number of developers and companies involved in the process; thus far, the development process has proved that it is able to scale up to higher speeds without trouble."
In the "blue-green" model, two identical application stacks are maintained (for convenience, I will refer to our application server instances and service instances as a stack, database deployments are done separately . One of the stacks is always live, let us say green is live and blue is stand-by.
When we want to deploy a new build to production, we deploy it on blue stack and do thorough testing. Once new build seems to work fine, the blue stack goes live and green becomes standby. This helps us do a quick rollback if issues are identified during deployment or sanity check.
What are the challenges?
The tricky bit here is switching between the blue and green stacks with almost no downtime.
The overhead is maintaining two stacks, which adds to the cost.
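To make the first point concrete, here is a minimal sketch of how the switch itself could be scripted, assuming the live stack is selected by a symlinked reverse-proxy config that gets reloaded after the swap; the paths, service name and stack names here are all hypothetical:

```python
#!/usr/bin/env python3
"""Toggle which stack (blue/green) the reverse proxy points at.

Hypothetical layout: /etc/proxy/live.conf is a symlink to either
blue.conf or green.conf; reloading the proxy picks up the change.
"""
import os
import subprocess

CONF_DIR = "/etc/proxy"                         # hypothetical config directory
LIVE_LINK = os.path.join(CONF_DIR, "live.conf")

def current_stack() -> str:
    # The symlink target tells us which stack is live right now.
    return "blue" if "blue" in os.readlink(LIVE_LINK) else "green"

def switch(to_stack: str) -> None:
    # Replace the symlink atomically, then reload the proxy so the
    # standby stack takes over with (close to) zero downtime.
    tmp_link = LIVE_LINK + ".tmp"
    os.symlink(os.path.join(CONF_DIR, f"{to_stack}.conf"), tmp_link)
    os.replace(tmp_link, LIVE_LINK)             # atomic rename over the old link
    subprocess.run(["systemctl", "reload", "proxy.service"], check=True)

if __name__ == "__main__":
    standby = "blue" if current_stack() == "green" else "green"
    print(f"Switching live traffic to the {standby} stack...")
    switch(standby)
```

A rollback is then just another call to switch(), which is the whole point of keeping the old stack around as standby.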
My new job requires a lot of automation to be in place for us in DevOps to be able to manage the huge deployment workloads we get. I'm currently in the process of automating our setup to a degree where we only have to initiate deployment processes without having to monitor them as they execute :P at least compared to how things were before I came in.
After re-playing a heap of my all-time favorite Amiga-games on my RetroPie (RPi3), I thought I'd do a little post honoring the 80s/90s gaming-computer. Mainly, the Amiga 500.
No other computer system has ever had this big an impact on my life, my preferences and my aesthetic sense.
To give a little intro about the Amiga 500: it was launched as a compact, home game-computer in the mid-1980s. The Amiga 500 had everything from motherboard/CPU to add-on-chips and keyboard integrated into one plastic casing.
People mostly hooked it up to their TV with an extra peripheral called the RF modulator, but it was also sold with a Commodore-branded CRT monitor (see picture below) that even sported integrated stereo speakers(!).
Commodore Amiga 500 - 'System 1'
At the time (late 80s) Amiga was unmatched in animation and sound, mostly due to their use of specialty chain-chips (separate chips co-working in a process-chain), but also thanks to a massive (for the time) homebrew-scene (the demo-scene).
The Amiga 500 demonstrated graphics powers unmatched by similar 8-/16-bit systems at the time. With over 300,000 units sold by 1990, it had established itself as a massively popular home computer across Europe, but due to scepticism and poor advertising it never did as well in the US.
Amiga set itself apart from other computer systems, just like Apple. For example, the PC-dominating three-finger salute is performed differently on Amiga systems: Ctrl + Left Amiga + Right Amiga.
It also sported co-processors (additional processors besides the CPU) for various tasks, which made this computer way ahead of its time. Today's computers, for example, almost always include co-processors (like GPUs), much like the Amiga series did back in the 80s and 90s. The only difference is that the co-processors in the Amigas were DIP chips soldered directly to the motherboard, whereas modern GPUs are self-contained expansion cards, sporting their own FPU chips and graphics RAM, that slot into modern motherboards' PCI Express slots and use the north/south-bridge connections to communicate with the CPU(s) / RAM.
The 90s brought the demo-scene community that had grown out of the 80s, eagerly showcasing the Amiga as a competitive computer system by making byte-sized (68000-assembly-coded), elaborately animated and scored demonstration applications, coined "demos"; hence the name, the demo-scene.
One of my most memorable demos was this "cracktro" by Unit A:
Another more recent (2010) demo by Razor1911:
The Amiga was also one of the last true ROM-based computers: the operating system lived on an on-board ROM chip, much like in our Android/iOS smartphones today and other micro-computers of the late 80s (Apple), since hard drives cost an arm and a leg in those days. The Workbench graphical desktop environment shipped separately on its own accompanying diskette(s). Original, to say the least.
The Amiga Workbench was a multi-tasking desktop environment, shown here running a background application (Boing) behind the Workbench UI.
Commodore Amiga Workbench 1.3 (w/"Boing" running in back)
It also loaded and ran games straight from 3½ inch floppy disks (usually 880 KB format). The game-data was loaded from the floppy into Amiga-memory and consequently executed.
The picture below shows the screen the Amiga displays if you haven't loaded Workbench from floppy, indicating you have to insert a diskette for something to happen.
Commodore Amiga 'Insert diskette' load-screen
Commodore Amiga Boing Ball, animated GIF representation
It was a major feat when it was made in the 80s: one animation done with bit-planes, one sound sample played at varying speeds and sample rates. Altogether it made an animated scene with a bouncing ball in a pseudo-3D grid room.
Yes, I'm slow at adopting new tech nowadays. Finally got updated to Android v7.x/8.x. Got my step-brother's old Nexus 5X, and yes, I know this particular model has hardware issues.
This one however has been fixed. Faulty components have been replaced and a new warranty issued.
I didn't really worry about using an outdated phone (Nexus 5), since I had secured it as much as I would "be allowed". Yes, allowed. Our corporate overlords control more than you might think...
Happy Birthday Linux! Over a quarter of a century (26 years) old, still going strong and conquering area after area in various technology segments 😋😊😃
Yes, I know he didn't announce it on the newsgroups on this day; he did that on August 25th.
But, as he says himself, he considers both dates as birthdays. The first as the announcement-date, and the second (today, August 29th) as the upload-date.
I've been itching to write this blogpost for a while...
I'm an avid retro-gamer, as well as a contemporary gamer. My emulation-antics took off in the late 90s when I started getting nostalgic about old DOS-games from the late 80s and early 90s. For the most part, DOS-emulation was pretty accurate even in the early days.
But these days emulation is pretty much indistinguishable from the real thing. CPU-cycle imitation and other emulation techniques have largely reached the runtime behaviour of the actual systems they emulate.
I don't favor the new trend of releasing limited special-edition consoles, like the NES Classic Mini and the SNES Classic Mini, when you can build a COTS computer (or even use a retired laptop/desktop), load it up with RetroPie and ROMs, and basically have a multi-emulation box that runs EVERYTHING and can be CUSTOMIZED.
If you go for a Raspberry Pi 3, such a system could cost you as little as $30 (not counting cables and gamepads / arcade sticks; with those it would cost you a minimum of $50).
Sure, a few people argue that the RPi3 is a fad, or that the SD-card gets worn out so it's not made to last, etc.
Well, a $30 credit-card-sized computer isn't really that hard to replace, and SD cards are a dime a dozen these days and keep getting cheaper, not to mention quite easy to back up (by making a copy-image on your computer's hard drive / USB drive).
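That copy-image really is just the whole block device written to a file; a rough sketch of the idea in a few lines of Python (the device path below is only an example; check with lsblk which device actually is your SD card, and run it with enough privileges to read the device):

```python
#!/usr/bin/env python3
"""Image an SD card to a file for safekeeping (sketch).

/dev/mmcblk0 is only an example device path; verify (e.g. with lsblk)
which block device really is your SD card before reading from it.
"""
import shutil

SD_DEVICE = "/dev/mmcblk0"            # example: built-in SD card reader
IMAGE_FILE = "retropie-backup.img"    # written to the current directory

with open(SD_DEVICE, "rb") as src, open(IMAGE_FILE, "wb") as dst:
    shutil.copyfileobj(src, dst, length=4 * 1024 * 1024)  # 4 MiB chunks

print("Wrote", IMAGE_FILE)
```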
Sure, you cannot, for example, use exotic console hardware like the microphone gamepad for the Japanese NES. Nor can you play ROMs made from FX game paks (Nintendo add-on technology to render pseudo-3D on the SNES). Then again, would you want to? Hey, if that's your thing, have at it. I don't give a flying f**k...
The Behringer Xenyx 302USB is a nifty little piece of audio hardware :) a 5-channel mixer with a phantom-powered (48V) XLR microphone input / jack input, a line input and a 2-track input, coupled with a USB audio interface (full-duplex input/output).
It has a compact and really portable design, weighs nothing and sports a sturdy build quality I'm not used to seeing in equipment in this price range. Although the mic input could really use a filter (it picks up every hiss and click in the room), this can easily be remedied with a standalone hum-eliminator box.
All round, it is a great choice for podcasting, amateur recording, limited input mixing, video/audio conferences and the like.
TIP: For those of you out there considering getting a NES Classic / SNES Classic read this blogpost first... you won't be sorry.
Since I started dabbling in Raspberry Pi SBCs, I've been testing a few emulation distros and the like. I've been playing around with this since June 2016, so a good 10 months by now I would say :P
One particular distro caught my interest while testing; namely RetroPie.
Excerpt from retropie.org.uk:
RetroPie allows you to turn your Raspberry Pi, ODroid C1/C2, or PC into a retro-gaming machine. It builds upon Raspbian, EmulationStation, RetroArch and many other projects to enable you to play your favourite Arcade, home-console, and classic PC games with the minimum set-up.
RetroPie lets you play virtually ALL arcade, console and PC games released in the period 1980-2000. Every ROM I've got in my library (accumulated 10,000+ since the 1990s) works out of the box.
Tested a good few arcade- and console-roms, and they work flawlessly.
The only configuration it may require is a simple dialogue-based gamepad/joystick setup, and then you're ready to game as you please :)
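If you shuffle a big ROM library around like I do, a small helper for sorting loose files into RetroPie's per-system folders comes in handy. This is just a sketch, assuming the default ~/RetroPie/roms layout; the extension-to-system map and the inbox folder are made-up examples:

```python
#!/usr/bin/env python3
"""Sort loose ROM files into RetroPie's per-system rom folders (sketch)."""
from pathlib import Path
import shutil

ROMS_ROOT = Path.home() / "RetroPie" / "roms"   # default RetroPie layout

# Which system folder each file extension belongs to (small example map).
SYSTEM_BY_EXT = {
    ".nes": "nes",
    ".smc": "snes",
    ".sfc": "snes",
    ".adf": "amiga",       # Amiga disk images, of course :)
    ".md":  "megadrive",
}

def sort_roms(incoming: Path) -> None:
    for rom in incoming.iterdir():
        system = SYSTEM_BY_EXT.get(rom.suffix.lower())
        if not rom.is_file() or system is None:
            continue                     # skip folders and unknown extensions
        dest = ROMS_ROOT / system
        dest.mkdir(parents=True, exist_ok=True)
        shutil.move(str(rom), str(dest / rom.name))
        print(f"{rom.name} -> {dest}")

if __name__ == "__main__":
    sort_roms(Path.home() / "rom-inbox")   # hypothetical drop folder
```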
Recently got a really nifty Pi-hat (add-on card for Raspberry Pi) called "HiFiBerry", model "Digi+ Pro".
"Pi-hats" are simple single-PCB add-on cards that slot on top of the Raspberry Pi line of single-board computers and extend their functionality. The HiFiBerry slots easily on top of the GPIO pin row, with the help of plastic risers (the white bolts/nuts pictured below).
With the HiFiBerry Digi+ Pro, I get audio output in 24-bit/96 kHz "studio quality". NiCE! :)
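Getting the board recognized on Raspbian is, as far as I recall from HiFiBerry's documentation, just a matter of enabling its device-tree overlay in /boot/config.txt and disabling the on-board audio. Here's a rough sketch of scripting that change; treat the exact overlay name as an assumption and check HiFiBerry's docs for your board:

```python
#!/usr/bin/env python3
"""Enable the HiFiBerry overlay in /boot/config.txt (sketch).

The overlay name below is what I recall the Digi+ Pro using; verify it
against HiFiBerry's documentation before running this for real.
"""
CONFIG = "/boot/config.txt"
OVERLAY_LINE = "dtoverlay=hifiberry-digi-pro"   # assumed overlay name
ONBOARD_AUDIO = "dtparam=audio=on"              # stock on-board audio line

with open(CONFIG) as f:
    lines = [line.rstrip("\n") for line in f]

# Comment out the on-board audio device and add the HiFiBerry overlay once.
lines = ["#" + l if l.strip() == ONBOARD_AUDIO else l for l in lines]
if OVERLAY_LINE not in lines:
    lines.append(OVERLAY_LINE)

with open(CONFIG, "w") as f:
    f.write("\n".join(lines) + "\n")

print("Updated", CONFIG, "- reboot for the change to take effect.")
```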
"In reality, politics have straddled the digital and meatspace for decades. Though government officials may have just learned about "the cyber," people working in computer security have been dealing with criminal and whimsical incursions into their systems since the late 20th century. It was 1990 when the infamous Operation Sundevil swept up innocents in a massive Secret Service dragnet operation to stop carders. The Stuxnet worm, which affected physical operations of centrifuges at a uranium enrichment plant in Iran, is only the most obvious example of how digital ops can have consequences away from the keyboard."
Yes, my sites were affected by this outage. A lot of big-name sites were too (Twitter, Spotify, GitHub, SoundCloud, Reddit, etc.).
And no, I don't use DNS management software like Netflix's Denominator; I really don't have a need for 24/7 uptime on any of my sites / sub-domains. They are strictly for demonstration, entertainment and hobby purposes.
I lost all my consoles in a lightning strike back in March, and after using most of the money meant for replacement consoles on my computer upgrade instead, I'm NEVER going back to console gaming ever again...
They're completely overpriced, the games are as well, and games/apps get slower patching than on any other gaming platform.
Not to mention the hardware is usually 2-3 years behind current PC hardware, and usually under-powered as fuck. Actually, current-gen consoles (PS4 / Xbone) also utilize "Radeon-based" graphics (graphics accelerators), which are severely sub-par compared to almost any other graphics solution.
After being an AMD fanboy for a long time, I must admit Nvidia has the graphics market quite cornered.
My latest upgrade included an EVGA GeForce GTX 960 4GB GDDR5 SSC ACX2.0+
The one I had before could barely keep up with the gaming-development cycle of its time, let alone the newer developments.
So, after a lot of researching and testing at friends' places, I concluded Nvidia has developed the superior gaming tool(s). PhysX has no rival technologies (in PC hardware at least); as for the newer tech, HairWorks and GameWorks, I have no special opinions. They're there. Nuff said.
I'm rather more interested in the CUDA cores and their hardware accelerators, more specifically NVENC, the hardware-accelerated video encoder.
Not to mention all the recent "woo-haw" around VR.
Because of a recent lightning-strike, my gaming-rig literally spiked... and died...
But, thanks to very nice family members I got replacement components very quickly ;) :P Since I was getting quite a few bucks back from insurance (eventually, in May '16), I decided to completely upgrade the whole rig.
After spending an afternoon researching (after months of contemplating on models and makes), I eventually landed on the following choices:
Component selection
Gigabyte Z170MX-Gaming 5 motherboard
2 x 8GB Kingston HyperX Fury DDR4 2666MHz RAM-modules
Assembling the beast was no fight, at least not much of one. The CPU cooler took some getting used to: you only have to screw together the mount(s) and press it into the motherboard (Cooler Master's *brilliant* Intel-type push-plugs -_-).
All components assembled into north+south-bridge(s)
It also helps to be prepared for a future lightning strike by using an 80 PLUS Gold-certified PSU that sports over-voltage protection, under-voltage protection, short-circuit protection and over-power protection.
Corsair RM850x PSU (Power Supply Unit)
Combined with a power strip sporting surge protection, I'd say I'm much better equipped for the ominous Scandinavian weather now than before.
The upgrade gives a combined performance boost of ~50% in heavy 3D and rendering cases :) and roughly a 45-50% reduction in heavy-load as well as idle temperatures (Intel < AMD). I'd say I'm really pleased ;)
Quite the OP setup, at least compared to my earlier rig (h3x). Geekbench 3 results confirm this:
quad-g5 Geekbench 3 multicore score: 13380 (2016)
h3x Geekbench 3 multicore score: 8305 (2013)
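Just as a back-of-the-envelope check, those two multicore scores alone work out to roughly a 61% uplift:

```python
# Relative multicore uplift from the Geekbench 3 scores above.
old_score = 8305     # h3x (2013)
new_score = 13380    # quad-g5 (2016)
uplift = (new_score - old_score) / old_score * 100
print(f"Multicore uplift: {uplift:.0f}%")   # ~61%
```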
*** FINGERS CROSSED ***
Update May 11th:
Added extra 92mm Cooler Master cpu-fan as exhaust-booster
I didn't really think much about it, but when I started using "InSync" a few years ago, all my woes using Google Drive on desktop-Linux vanished :)
It is simply the best client-side synchronization-tool for desktop use of Google's cloud-storage solution ("Drive") on Linux today. Both simple and quite configurable at the same time.
It is cross-platform compatible, i.e.: works on Windows, Mac OS X and Linux!
The only drawback is that it costs a one-off fee of $20, but after that you can use it on as many machines as you want :P
It basically works just like Dropbox (which actually has a Linux-native client! Boo, Google!), allowing the user to manage their file system and offering share options as well as other features by right-clicking on files and widgets.
Yet another period of contracting-work is now at an end. It was a good, solid 10 months :) New experiences - new technology.
Being able to join the starting effort at building something from the ground up is satisfying work :) especially as a technophile setting up core infrastructure and backend(s) for mission critical services.
As with all specialization: if the competence is hard to find, the need tends to round-robin back to the starting point again.
Good fun :P
What new adventures and/or challenges await?
Who knows... but I bet they're right around the corner ;) they always are.
Besides, now I have the spare time to pursue hobbies and interests again ^_^
My 7-8 year-old Logitech gamepads have been in heavy use over the years. Aaand they just recently crapped out.
Luckily I had put their new gamepad series on my Christmas wishlist, and as I expected, got one from a family member and one from my GF :P 2P gaming!
I also got a SteelSeries gaming mousepad, and I must admit, at first I didn't really get what the big deal was. But after playing some FPS and RPS games, I totally get it. Nothing else can compare to the sensitivity and movement-feel you get with these mousepads, nothing.
Been pretty productive (read: teh buzy) the last 6-7 months, doing some (quite heavy) service- and maintenance-work for a local IT-startup.
I've been responsible for the x86_64 Linux rack-based infrastructure and backend driving the various product frameworks, for domain and network security, and for client host security throughout the organization. Quite a lot of networking, plus some aggregated WiFi APs etc.
The Linux rack servers have also been host-hardened and properly secured (SSH/TLS/SSL) against the threats facing online services in 2015.
Without going into too much detail, it all revolves around embedded GPS tracking, web apps for viewing and controlling said tracking system, and the accompanying maintenance / service / troubleshooting involved in those systems and their server backends.
The servers run both the MariaDB and the PostgreSQL database systems for serving application / geo / PostGIS data in various parts of the application flow. I became responsible for servicing, maintaining and facilitating import/export of SQL data for backups, re-location, and the like. The PostgreSQL database is even served from an iSCSI pool over a dedicated jumbo-frame Ethernet connection to a NAS rack, for extra speed when handling huge datasets.
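For the routine export/backup part, here's a minimal sketch of the kind of job that can be automated, assuming pg_dump is on the path and credentials come from ~/.pgpass or the PG* environment variables; the target directory and database name below are made up:

```python
#!/usr/bin/env python3
"""Nightly PostgreSQL dump (sketch).

Uses pg_dump's custom format so the dump can be restored selectively
with pg_restore. The paths and database name below are made up.
"""
import datetime
import pathlib
import subprocess

BACKUP_DIR = pathlib.Path("/srv/backups/postgres")   # hypothetical target dir
DATABASE = "geodata"                                 # hypothetical database

def dump_database() -> pathlib.Path:
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    stamp = datetime.date.today().isoformat()
    outfile = BACKUP_DIR / f"{DATABASE}-{stamp}.dump"
    subprocess.run(
        ["pg_dump", "--format=custom", "--file", str(outfile), DATABASE],
        check=True,
    )
    return outfile

if __name__ == "__main__":
    print("Wrote", dump_database())
```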
I've also been playing around with the web-framework, doing some development for myself to learn the technology, and also fixing various components in the active applications.
I've known for some time now that AMD puts seriously little effort into their native OpenGL implementation on Linux, especially when it's used for gaming. Not to mention their proprietary (binary) graphics driver, "AMD Catalyst", which is a story all on its own...
They only JUST RECENTLY achieved OpenGL 4.5 compatibility, a spec that is by now over a year old! In technology terms, that's just sad.
They released (in 2013, for the wider public) an in-house-developed perf tool (GPUPerfServer2) for optimizing Linux games using OpenGL. However, they did *NOT* release a Linux client for this client/server framework; only a Linux-based server for running local OpenGL games, to which one could then connect via the OS X / Windows client (figures...).
A couple of days ago, I stumbled upon this picture in a Google+ post with the following title "Feral Interactive Buys AMD Hardware To Optimize Linux Games":