Collette Stumpf is a software designer at Surge.
Successful software projects please customers, streamline processes, or otherwise add value to your business. But how do you ensure that your software project will result in the improvements you are expecting? Will users experience better performance? Will productivity improve across all tasks as you hoped? Will users be happy with your changes and return to your product again and again as you envisioned?
AI’s rapid evolution is producing an explosion in new types of hardware accelerators for machine learning and deep learning.
Some people refer to this as a “Cambrian explosion,” which is an apt metaphor for the current period of fervent innovation. It refers to the period about 500 million years ago when essentially every biological “body plan” among multicellular animals appeared for the first time. From that point onward, these creatures—ourselves included—fanned out to occupy, exploit, and thoroughly transform every ecological niche on the planet.
A distributed file system, a MapReduce programming framework, and an extended family of tools for processing huge data sets on large clusters of commodity hardware, Hadoop has been synonymous with “big data” for more than a decade. But no technology can hold the spotlight forever.
While Hadoop remains an essential part of big data platforms, the major Hadoop vendors—namely Cloudera, Hortonworks, and MapR—have changed their platforms dramatically. Once-peripheral projects like Apache Spark and Apache Kafka have become the new stars, and the focus has turned to other ways to drill into data and extract insight.
Let’s take a brief tour of the three leading big data platforms, what each adds to the mix of Hadoop technologies to set it apart, and how they are evolving to embrace a new era of containers, Kubernetes, machine learning, and deep learning.
Anaconda, the Python language distribution and work environment for scientific computing, data science, statistical analysis, and machine learning, is now available in version 5.2, with additions to both its enterprise and open-source community editions.

Where to download Anaconda 5.2
The community edition of Anaconda Distribution is available for free download directly from Anaconda’s website. The for-pay enterprise edition, with professional support, requires contacting the Anaconda (formerly Continuum Analytics) sales team.
Version 2.14 of GitHub Enterprise, the behind-the-firewall version of GitHub’s code-sharing platform tuned for businesses, improves member visibility configuration and adds anonymous Git read access.
Users can configure visibility for new members of an organization, across private or public instances. Administrators also can prevent users from changing their visibility from the default configuration. Default settings can be enforced through a command-line utility.
GitHub Enterprise Version 2.14 also adds the ability for administrators to enable anonymous Git read access to public repositories when an instance is in a private mode. Anonymous read access can let users bypass authentication requirements for custom tools on an instance.
The adoption of Kubernetes and containers in Azure’s service platform changes how you build, deploy, and manage cloud-native applications, treating containers and services as the targets of your builds, rather than the code that makes up those services.
Kubernetes itself automates much of what had been infrastructure tasks, orchestrating and managing containers. Azure’s AKS tools simplify configuring Kubernetes, but you need to deploy straight into an AKS instance—a hurdle for anyone developing new apps or handling a migration of an existing service. Although AKS itself isn’t expensive, setting up and tearing down orchestration models takes time—time that can better be spent writing and debugging code.
How much does your public cloud cost month to month? If you don’t know, you’re hardly alone. Most people in IT don’t have a good understanding of what a public cloud service costs per month. Most wait to find out what the bill says rather than proactively monitor cloud consumption, much less have cloud cost governance in place.
Even if your financial budgeting model can handle uncertain costs, not knowing what you’re spending has a downside. When you moved to the public cloud, your company put a value driver in place when defining the business cases—and part of that was based on ongoing costs per month.
If those costs are higher than originally estimated, the value metrics won’t support your goals. Although you can make a case for the cloud’s value around agility and compressing time to market, that will fall on deaf ears among your business leaders if you’re 20 to 30 percent over budget for ongoing cloud costs.
Deep learning is an important part of the business of Google, Amazon, Microsoft, and Facebook, as well as countless smaller companies. It has been responsible for many of the recent advances in areas such as automatic language translation, image classification, and conversational interfaces.
We haven’t gotten to the point where there is a single dominant deep learning framework. TensorFlow (from Google) is very good, but it has been hard to learn and use. TensorFlow’s dataflow graphs have also been difficult to debug, which is why the TensorFlow project has been working on eager execution and the TensorFlow debugger. TensorFlow used to lack a decent high-level API for creating models; now it has three of them, including a bespoke version of Keras.
ASP.Net Web API is a lightweight framework that can be used for building RESTful HTTP services. When working with controller methods in Web API, you will often need to pass parameters to those methods. A “parameter” here simply refers to the argument to a method, while “parameter binding” refers to the process of setting values to the parameters of the Web API methods.
Note that there are two ways in which Web API can bind parameters: model binding and formatters. Model binding reads from the URI (the query string and route data), while formatters read from the request body. You can also use type converters to enable Web API to treat a class as a simple type and then bind the parameter from the URI; to do this, you create a custom TypeConverter. Alternatively, you can create a custom model binder by implementing the IModelBinder interface in your class and then implementing its BindModel method. For more on type converters and model binders, see the Microsoft documentation.
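To make those defaults concrete, here is a minimal sketch of a Web API 2 controller. The Product type, the routes, and the controller itself are illustrative assumptions rather than code from this article, and the attribute routes assume attribute routing (config.MapHttpAttributeRoutes()) is enabled, as it is in the standard Web API 2 template.

```csharp
using System.Web.Http;

// Illustrative complex type; any POCO with public settable properties binds the same way.
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }
}

[RoutePrefix("api/products")]
public class ProductsController : ApiController
{
    // Simple types (int, string, Guid, and so on) are bound from the URI by
    // model binding: route data first, then the query string.
    [HttpGet, Route("{id:int}")]
    public IHttpActionResult GetProduct(int id) => Ok(id);

    // Complex types are read from the request body by a media-type formatter,
    // selected from the Content-Type header (JSON and XML out of the box).
    [HttpPost, Route("")]
    public IHttpActionResult CreateProduct(Product product) => Ok(product);

    // [FromUri] overrides the default and binds a complex type from the query
    // string instead, e.g. GET api/products/search?Name=widget&Price=9.99
    [HttpGet, Route("search")]
    public IHttpActionResult SearchProducts([FromUri] Product criteria) => Ok(criteria);
}
```

When these defaults aren’t enough, for example when you want to bind a complex type from a single query-string token, that is where a custom TypeConverter or an IModelBinder implementation comes in.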
According to a recent report from IDC, “worldwide revenues for big data and business analytics will grow from nearly $122 billion in 2015 to more than $187 billion in 2019, an increase of more than 50 percent over the five-year forecast period.”
Anyone in enterprise IT already knows that big data is a big deal. If you can manage and analyze massive amounts of data—I’m talking petabytes—you’ll have access to all sorts of information that will help you run your business better.
Right? Sadly, for most enterprises, no.
One key devops best practice is implementing a continuous integration/continuous delivery (CI/CD) pipeline that automates the process of building software, packaging applications, deploying them to target environments, and instrumenting service calls to enable the application. This automation requires scripting individual procedures and orchestrating the steps from code checkin to running application. Once matured, devops teams use the automation to drive process change and strive to do smaller, more frequent deployments that deliver new functionality to users and improve quality.
Sebastian Stadil is the CEO and founder of Scalr.
Enterprises are moving to multicloud in droves. Why? The key drivers most often cited by cloud adopters are speed, agility, platform flexibility, and reduced costs—or at least more predictable costs. It’s ironic then that more than half of these companies say that runaway cloud costs are their biggest postmigration pain point.
The power of Docker images is that they’re lightweight and portable—they can be moved freely between systems. You can easily create a set of standard images, store them in a repository on your network, and share them throughout your organization. Or you could turn to Docker Inc., which has created various mechanisms for sharing Docker container images in public and private.
The most prominent among these is Docker Hub, the company’s public exchange for container images. Many open source projects provide official versions of their Docker images there, making it a convenient starting point for creating new containers by building on existing ones, or just obtaining stock versions of containers to spin up a project quickly. And you get one private Docker Hub repository of your own for free.
I hear it every day now: “We’re moving beyond cloud computing to edge computing.” Pretty hypey, and not at all logical.
Edge computing is a handy trick. It’s the ability to place processing and data retention at a system that’s closer to the target system it’s collecting data for as well as to provide autonomous processing.
The architectural advantages are plenty, including not having to transmit all the data to the back-end systems—typical in the cloud—for processing. This reduces latency and can provide better security and reliability as well.
The Xamarin acquisition was one of Microsoft’s smartest deals. It quickly gave Microsoft access to tools that let developers use familiar languages and technologies to build cross-platform applications. Now built into every version of Visual Studio, and providing the basis for its MacOS Visual Studio release, Xamarin has become a key element of Microsoft’s development tools.
Until recently—even with Xamarin—building cross-platform applications wasn’t easy. For all that the core development tools handle working with iOS and Android from .Net, using them to build apps meant writing significant amounts of device-specific code to handle both native UX and deep platform integration. Although you could share your core code across device-specific projects, building and testing the full application required domain knowledge and specialized skills. The result was code that, although a little cheaper than using native tools for each platform, really wasn’t as cheap to build as it could have been.
If you have experience building ASP.Net applications, you are undoubtedly familiar with role-based authorization. In ASP.Net Core – Microsoft’s lean and modular framework that can be used to build modern-day web applications on Windows, Linux, or MacOS – we have an additional option.
Policy-based authorization is a new feature introduced in ASP.Net Core that allows you to implement a loosely coupled security model. In this article I will explain what policy-based authorization is all about and how we can implement it in ASP.Net Core.
Assuming that you have .Net Core installed in your system, follow the steps below to create a new ASP.Net Core project in Visual Studio 2017.
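Project setup aside, the heart of policy-based authorization is a named policy registered in ConfigureServices and referenced from the [Authorize] attribute. The sketch below shows that shape; the policy name, the claim type, and the controller are illustrative assumptions rather than code from this article, and the rest of the Startup class (authentication setup, the Configure method) is omitted.

```csharp
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddMvc();

        // Register a named policy. This one simply requires an authenticated
        // user who carries an "EmployeeNumber" claim; policies can also combine
        // roles, claims, and custom requirements.
        services.AddAuthorization(options =>
        {
            options.AddPolicy("EmployeesOnly", policy =>
                policy.RequireAuthenticatedUser()
                      .RequireClaim("EmployeeNumber"));
        });
    }

    // Configure(...) and the authentication middleware are omitted for brevity.
}

// Apply the policy declaratively; the controller never hard-codes role checks.
[Authorize(Policy = "EmployeesOnly")]
public class PayrollController : Controller
{
    public IActionResult Index() => View();
}
```

The payoff is the loose coupling described above: the controller only names the policy, so what that policy requires can change in one place without touching any of the controllers that use it.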
It’s the kind of meta notion that makes undergraduate philosophers say, “Whoa!” Software today is so complicated that we need to write software to help us understand and construct the software we need to write. Code begets code begets more code…
The version control system Git is everyone’s favorite tool for curating software, but even this neat open source software isn’t enough. Most programmers and the teams to which they belong are now wedded to online versions of Git that add many extra layers of analysis and presentation to make it possible to wade through the vast swamp that is our code.
There are three big contenders now for the best place to stash your regular expressions, anonymous functions, and intense recursive tree-walking flashes of genius: GitHub, Bitbucket, and GitLab. All of them are competing to be the best place for you to store your source.