Another Word For It: Patrick Durusau on Topic Maps and Semantic Diversity

November 4, 2015

It’s Official! Hell Has Frozen Over!

Filed under: .Net,Microsoft,OpenShift,Red Hat — Patrick Durusau @ 1:23 pm

Microsoft and Red Hat to deliver new standard for enterprise cloud experiences

From the news release:

Microsoft Corp. (Nasdaq “MSFT”) and Red Hat Inc. (NYSE: RHT) on Wednesday announced a partnership that will help customers embrace hybrid cloud computing by providing greater choice and flexibility deploying Red Hat solutions on Microsoft Azure. As a key component of today’s announcement, Microsoft is offering Red Hat Enterprise Linux as the preferred choice for enterprise Linux workloads on Microsoft Azure. In addition, Microsoft and Red Hat are also working together to address common enterprise, ISV and developer needs for building, deploying and managing applications on Red Hat software across private and public clouds.

I can’t report on the webcast because it requires Flash 10 and I don’t have that on a VM at the moment. Good cyber hygiene counsels against running even “patched” Adobe Flash.

The news release has the key points anyway:


Red Hat solutions available natively to Microsoft Azure customers. In the coming weeks, Microsoft Azure will become a Red Hat Certified Cloud and Service Provider, enabling customers to run their Red Hat Enterprise Linux applications and workloads on Microsoft Azure. Red Hat Cloud Access subscribers will be able to bring their own virtual machine images to run in Microsoft Azure. Microsoft Azure customers can also take advantage of the full value of Red Hat’s application platform, including Red Hat JBoss Enterprise Application Platform, Red Hat JBoss Web Server, Red Hat Gluster Storage and OpenShift, Red Hat’s platform-as-a-service offering. In the coming months, Microsoft and Red Hat plan to provide Red Hat On-Demand — “pay-as-you-go” Red Hat Enterprise Linux images available in the Azure Marketplace, supported by Red Hat.

Integrated enterprise-grade support spanning hybrid environments. Customers will be offered cross-platform, cross-company support spanning the Microsoft and Red Hat offerings in an integrated way, unlike any previous partnership in the public cloud. By co-locating support teams on the same premises, the experience will be simple and seamless, at cloud speed.

Unified workload management across hybrid cloud deployments. Red Hat CloudForms will interoperate with Microsoft Azure and Microsoft System Center Virtual Machine Manager, offering Red Hat CloudForms customers the ability to manage Red Hat Enterprise Linux on both Hyper-V and Microsoft Azure. Support for managing Azure workloads from Red Hat CloudForms is expected to be added in the next few months, extending the existing System Center capabilities for managing Red Hat Enterprise Linux.

Collaboration on .NET for a new generation of application development capabilities. Expanding on the preview of .NET on Linux announced by Microsoft in April, developers will have access to .NET technologies across Red Hat offerings, including Red Hat OpenShift and Red Hat Enterprise Linux, jointly backed by Microsoft and Red Hat. Red Hat Enterprise Linux will be the primary development and reference operating system for .NET Core on Linux.

More details at: The Official Microsoft Blog and the Red Hat Blog.
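For readers who want to try the headline feature, here is a hypothetical sketch of launching a Red Hat Enterprise Linux VM on Azure with the current `az` CLI (which postdates this announcement; the resource group, VM name, and image URN below are my assumptions, not part of the news release):

```shell
# Hypothetical sketch: a RHEL VM on Azure via the modern `az` CLI.
# Resource names and the image URN are placeholders/assumptions.
az group create --name rhel-demo --location eastus

az vm create \
  --resource-group rhel-demo \
  --name rhel-vm \
  --image RedHat:RHEL:9-lts-gen2:latest \
  --admin-username azureuser \
  --generate-ssh-keys
```

The Red Hat Cloud Access subscribers mentioned above would instead bring their own RHEL images rather than use a Marketplace URN.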

I first saw this in The Power of Open Source… Microsoft .NET and OpenShift by Chris Morgan.

A small pebble in an ocean of influences and motivations, but treating Microsoft fairly during the ISO process for ISO 29500 (I am the editor of the competing ISO 26300) wasn’t a bad idea.

May 8, 2014

Functional Programming in the Cloud:…

Filed under: Cloud Computing,Functional Programming,Haskell,OpenShift,Red Hat — Patrick Durusau @ 1:07 pm

Functional Programming in the Cloud: How to Run Haskell on OpenShift by Katie Miller.

From the post:

One of the benefits of Platform as a Service (PaaS) is that it makes it trivial to try out alternative technology stacks. The OpenShift PaaS is a polyglot platform at the heart of a thriving open-source community, the contributions of which make it easy to experiment with applications written in a host of different programming languages. This includes the purely functional Haskell language.

Although it is not one of the Red Hat-supported languages for OpenShift, Haskell web applications run on the platform with the aid of the community-created Haskell cartridge project. This is great news for functional programming (FP) enthusiasts such as myself and those who want to learn more about the paradigm; Haskell is a popular choice for learning FP principles. In this blog post, I will discuss how to create a Haskell web application on OpenShift.

Prerequisites

If you do not have an OpenShift account yet, sign up for OpenShift Online for free. You’ll receive three gears (containers) in which to run applications. At the time of writing, each of these free gears comes with 512MB of RAM and 1GB of disk space.

To help you communicate with the OpenShift platform, you should install the RHC client tools on your machine. There are instructions on how to do that for a variety of operating systems at the OpenShift Dev Center. Once the RHC tools are installed, run the command rhc setup on the command line to configure RHC ready for use.
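To give a flavor of the workflow Katie describes, a hedged sketch using the (now-retired) OpenShift Online v2 `rhc` tools. The cartridge manifest URL below is a placeholder for the community Haskell cartridge's manifest, not the real address:

```shell
# Illustrative only: OpenShift v2 rhc workflow with a downloadable cartridge.
# The manifest URL is a placeholder; see the Haskell cartridge project for the real one.
rhc setup                 # one-time: log in and upload an SSH key

rhc app create haskellapp \
  http://example.com/haskell-cartridge/manifest.yml   # placeholder manifest URL

cd haskellapp             # rhc clones the app's git repo for you
git push                  # deploy: OpenShift rebuilds and restarts the app
```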

Katie’s post is a great way to get started with OpenShift!

However, it also reminds me of why I dislike Daylight Savings Time. It is getting dark later in the Eastern United States but there are still only twenty-four (24) hours in a day! An extra eight (8) hours a day and the stamina to stay awake for them would be better. 😉

Unlikely to happen so enjoy Katie’s post during the usual twenty-four (24) hour day.

February 13, 2014

12 Steps For Teaching…

Filed under: OpenShift,Red Hat,Teaching — Patrick Durusau @ 7:54 pm

12 Steps For Teaching Your Next Programming Class on OpenShift by Katie Miller.

From the post:

The OpenShift Platform as a Service (PaaS) is a valuable resource for running tutorials on web programming, especially if you have a limited budget.

OpenShift abstracts away configuration headaches to help students create shareable applications quickly and easily, for free, using extensible open-source code – as I explained in a previous post.

In this blog post, I will draw on my personal workshop experiences to outline 12 steps for teaching your next programming class with OpenShift Online.

See Katie’s post for the details but as a sneak preview, the twelve steps are:

  1. Try Out OpenShift
  2. Choose Topic Areas
  3. Select Cartridges to Support Your Teaching Goals
  4. Develop a Work Flow
  5. Create and Publish Sample Code Base
  6. Write Workshop Instructions
  7. Determine Account Creation Strategy
  8. Prepare Environments
  9. Trial Workshop
  10. Recruit Helpers
  11. Run Workshop
  12. Share Results and Seek Feedback

An excellent resource for teaching the techie side of semantic integration.

February 22, 2013

Hadoop Adds Red Hat [More Hadoop Silos Coming]

Filed under: Hadoop,MapReduce,Red Hat,Semantic Diversity,Semantic Inconsistency — Patrick Durusau @ 1:27 pm

Red Hat Unveils Big Data and Open Hybrid Cloud Direction

From the post:

Red Hat, Inc. (NYSE: RHT), the world’s leading provider of open source solutions, today announced its big data direction and solutions to satisfy enterprise requirements for highly reliable, scalable, and manageable solutions to effectively run their big data analytics workloads. In addition, Red Hat announced that the company will contribute its Red Hat Storage Hadoop plug-in to the Apache™ Hadoop® open community to transform Red Hat Storage into a fully-supported, Hadoop-compatible file system for big data environments, and that Red Hat is building a robust network of ecosystem and enterprise integration partners to deliver comprehensive big data solutions to enterprise customers. This is another example of Red Hat’s strategic commitment to big data customers and its continuing efforts to provide them with enterprise solutions through community-driven innovation.

The more Hadoop grows, the more Hadoop silos will as well.

You will need Hadoop and semantic skills to wire Hadoop silos together.

Re-wire with topic maps to avoid re-wiring the same Hadoop silos over and over again.

I first saw this at Red Hat reveal big data plans, open sources HDFS replacement by Elliot Bentley.

June 24, 2012

Rise above the Cloud hype with OpenShift

Filed under: Cloud Computing,Red Hat — Patrick Durusau @ 1:29 pm

Rise above the Cloud hype with OpenShift by Eric D. Schabell.

From the post:

Are you tired of requesting a new development machine for your application? Are you sick of having to setup a new test environment for your application? Do you just want to focus on developing your application in peace without ‘dorking with the stack’ all of the time? We hear you. We have been there too. Have no fear, OpenShift is here!

In this article we will walk you through the simple steps it takes to set up not one, not two, not three, but up to five new machines in the Cloud with OpenShift. You will have your applications deployed for development, testing or to present them to the world at large in minutes. No more messing around.

We start with an overview of what OpenShift is, where it comes from and how you can get the client tooling set up on your workstation. You will then be taken on a tour of the client tooling as it applies to the entry level of OpenShift, called Express. In minutes you will be off and back to focusing on your application development, deploying to test it in OpenShift Express. When finished you will just discard your test machine and move on. When you have mastered this, it will be time to ramp up into the next level with OpenShift Flex. This opens up your options a bit so you can do more with complex applications and deployments that might need a bit more fire power. After this you will be fully capable of ascending into the OpenShift Cloud when you choose, where you need it and at a moment’s notice. This is how development is supposed to be, development without stack distractions.

Specific to the Red Hat Cloud but that doesn’t trouble me if it doesn’t trouble you.

What is important is that like many cloud providers, the goal is to make software development in the cloud as free from “extra” concerns as possible.

Think of users who rely upon network based applications for word processing, spreadsheets, etc. Fewer of them would do so if every use of the application required steps that expose the network-based nature of the application. Users just want the application to work. (full stop)

A bit more of the curtain can be drawn back for developers but even there, the goal isn’t to master the intricacies of cloud computing but to produce robust applications that so happen to run on the cloud.

This is one small step towards a computing fabric where developers write and deploy software. (full stop) The details of where it is executed, where data is actually stored, being known only by computing fabric specialists. The application serves it users, produces the expected answers, delivers specified performance, what more do you need to know?

I would like to see topic maps playing a role in developing the transparency for the interconnected systems that grow into that fabric.

(I first saw this at DZone’s replication of the Java Code Geeks reposting at: http://www.dzone.com/links/r/rise_above_the_cloud_hype_with_openshift.html)

April 12, 2012

Red Hat and 10gen: Deeper collaboration around MongoDB

Filed under: MongoDB,Red Hat — Patrick Durusau @ 8:49 am

Red Hat and 10gen: Deeper collaboration around MongoDB

From the post:

Today [April 9, 2012], Red Hat and 10gen jointly announced a deeper collaboration around MongoDB. By combining Red Hat’s traditional strengths in operating systems and middleware with 10gen’s expertise in database technology, we’re developing a robust open source platform on which to develop and deploy your next generation of applications either in your own data centers or in the cloud.

Over the next several months, we’ll be working closely with Red Hat to optimize and integrate MongoDB with a number of Red Hat products. You can look at this effort resulting in a set of reference designs, solutions, packages and documentation for deploying high-performance, scalable and secure applications with MongoDB and Red Hat software. Our first collaboration is around a blueprint for deploying MongoDB on Red Hat Enterprise Linux, which we will release shortly. We’ll follow that up with a number of additional projects around RHEL, JBoss, Red Hat Enterprise Virtualization (RHEV), Cloud Forms, Red Hat Storage (GlusterFS), and of course continue the work we have started with OpenShift. We hope to get much involvement from the Red Hat and MongoDB communities, and any enhancements to MongoDB resulting from this work will, of course, be open sourced.
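As a rough illustration of what such a "blueprint for deploying MongoDB on Red Hat Enterprise Linux" involves, here is a minimal sketch using the vendor yum repository as documented today (the RHEL and MongoDB version numbers and repo paths are my assumptions, not from the announcement):

```shell
# Minimal sketch, assuming RHEL 9 and MongoDB 7.0; adjust versions to taste.
sudo tee /etc/yum.repos.d/mongodb-org.repo <<'EOF'
[mongodb-org]
name=MongoDB Repository
baseurl=https://repo.mongodb.org/yum/redhat/9/mongodb-org/7.0/x86_64/
gpgcheck=1
enabled=1
gpgkey=https://pgp.mongodb.com/server-7.0.asc
EOF

sudo yum install -y mongodb-org          # server, shell and tools
sudo systemctl enable --now mongod       # start now and enable at boot
mongosh --eval 'db.runCommand({ ping: 1 })'   # smoke test the running server
```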

Have you noticed that open source projects are trending towards bundling themselves with each other?

A healthy recognition that users want solutions over sporting with versions and configuration files.
