Tuesday, August 2, 2016

Agile Digital Transformation


The need for speed in application development is driving business today more than ever. As user expectations rise, agile has become the engine powering app development. Whether creating mobile apps or high-performance websites, businesses are being forced to rethink how their entire software development process operates. The methodologies and tools that help teams develop and deliver quality software quickly are more important than ever before. A great article by Christina Mulligan of SD Times, “Driving a Digital Transformation,” discusses the forces driving agile business application development today.

Friday, July 1, 2016

Magic Quadrant for Business Intelligence and Analytics Platforms

According to a recently published Gartner report, by 2018:
  • Most business users and analysts in organizations will have access to self-service tools to prepare data for analysis as part of the shift to deploying modern BI platforms.
  • Most stand-alone self-service data preparation offerings will either have expanded into end-to-end analytical platforms or been integrated as features of existing analytics platforms.
  • Smart, governed, Hadoop-based, search-based and visual-based data discovery will converge in a single form of next-generation data discovery that will include self-service data preparation and natural-language generation.
During the past several years, the balance of power for business intelligence (BI) and analytics platform buying decisions has gradually shifted from IT to the business as the long-standing BI requirement for centrally provisioned, highly governed and scalable system-of-record reporting has been counterbalanced by the need for analytical agility and business user autonomy.

The evolution and sophistication of the self-service data preparation and data discovery capabilities available in the market has shifted the focus of buyers in the BI and analytics platform market — toward easy-to-use tools that support a full range of analytic workflow capabilities and do not require significant involvement from IT to predefine data models upfront as a prerequisite to analysis.
The shift in the BI and analytics market and the corresponding opportunity that it has created for new and innovative approaches to BI has drawn considerable attention from a diverse range of vendors. The list spans from large technology players — both those new to the space as well as longtime players trying to reinvent themselves to regain relevance — to startups backed by enormous amounts of venture capital from private equity firms. A crowded market with many new entrants, rapid evolution and constant innovation creates a difficult environment for vendors to differentiate their offerings from the competition. However, these market conditions also create an opportunity for buyers to be at the leading edge of new technology innovations in BI and analytics and to invest in new products that are better suited for Mode 2 of a bimodal delivery model than their predecessors.
Gartner's position is that organizations should initiate new BI and analytics projects using a modern platform that supports a Mode 2 delivery model, in order to take advantage of market innovation and to foster collaboration between IT and the business through an agile and iterative approach to solution development. The vendors featured in this year's Magic Quadrant present modern approaches to promoting production-ready content from Mode 2 to Mode 1, offering far greater agility than traditional top-down, IT-led initiatives and resulting in governed analytic content that is more widely adopted by business users who are active participants in the development process. As the ability to promote user-generated content to enterprise-ready governed content improves, it is likely that many organizations will, over time, reduce the size of their enterprise system-of-record reporting platforms in favor of those that offer greater agility and deeper analytical insight.
Source: Gartner, 2016

Friday, June 17, 2016

Data Science Methodology: Best Practices for Successful Implementations

Marian University will be offering a degree program in Business Analytics starting Fall 2016. Many of our new courses draw heavily from Data Science. In the domain of Data Science, solving problems and answering questions through data analysis is standard practice. Often, data scientists construct a model to predict outcomes or discover underlying patterns, with the goal of gaining insights. Organizations can then use these insights to take actions that ideally improve future outcomes.

The flow of the methodology illustrates the iterative nature of the problem-solving process. As data scientists learn more about the data and the modeling, they frequently return to a previous stage to make adjustments, iterate quickly, and provide continuous value to the organization. Models are not created once, deployed, and left in place as-is; instead, they are continually improved and adapted to evolving conditions, as the sketch below illustrates.
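As a rough illustration of that build-evaluate-revisit loop, here is a minimal sketch in Python using scikit-learn on synthetic data; the library, model choice, and acceptance threshold are my own assumptions, not part of the methodology paper.

# Minimal sketch of the iterative model-build/evaluate cycle (illustrative only).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

# Stand-in for an organization's data
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Build and evaluate a first model
model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)
score = accuracy_score(y_test, model.predict(X_test))
print(f"Holdout accuracy: {score:.3f}")

# Hypothetical acceptance threshold: if the model falls short, the methodology
# says to return to earlier stages (data preparation, feature engineering,
# model selection) rather than deploy and leave the model in place as-is.
TARGET_ACCURACY = 0.80
if score < TARGET_ACCURACY:
    print("Revisit earlier stages before deployment.")

In production the same loop continues after deployment: monitor the model, then retrain or rebuild it as the data and business conditions evolve.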

You can download the White Paper here: https://form.jotformeu.com/61126366826357

Sources: Data Science Central, IBM



Friday, May 13, 2016

Hyperconnectivity: Everyone has access to everything

According to “Digital Disruption: How digital technology is transforming our world” (SAP, 2016), hyperconnectivity describes the rapid growth of interconnectedness between people, objects, and technology. People have access to more data and information than ever before, and they have the ability to develop and maintain more social and business connections across the globe than at any other time in history. Hyperconnectivity has come about thanks to three phases of technological advancement:

1. The Internet enables all computers to connect on a single platform. Without the Internet, we wouldn’t be digitally connected.
2. Mobile Technology continues to grow exponentially, which accelerates hyperconnectivity worldwide.
3. The Internet of Things is the next wave of advancement, and will take interconnectedness to an unprecedented level. Within the next three years, the data created by IoT devices will reach 403 trillion gigabytes annually.

With mobile technology at an all-time high and the Internet of Things on the upsurge, the economic impact of hyperconnectivity on gross domestic product (GDP) is rising. A recent study using data from the World Economic Forum and other sources concluded that in the G20 countries, the GDP related to the Internet or digital activity is worth approximately $4.2 trillion U.S., roughly 3.5 times more than what the oil and gas industry generates. Growth is at about 8% in developed countries; growth in developing countries is escalating even faster, at roughly 18%.

For business, hyperconnectivity means quicker, easier global connections and innovation throughout the entire business ecosystem in today’s networked economy. In a global survey of 561 executives by The Economist Intelligence Unit, the data revealed:

• 69% of business leaders believe their companies are adapting well to hyperconnectivity
• 47% say business processes have accelerated as a result
• 45% say collaboration, both within and between divisions, has improved because of hyperconnectivity
• Only 33% say it presents more threats than opportunities

In addition, a Capgemini study (2015) finds that 64% of executives say Big Data is changing traditional business boundaries and enabling nontraditional providers to move into their industry. Over 50% expect to face increased competition from start-ups enabled by data, and 24% of companies report disruption from new competitors moving into their industry.

More than a technological trend, hyperconnectivity is a cultural condition to which businesses must adapt. Today’s businesses operate in a digital world – one that’s faster and more volatile than ever before. Companies that are willing to radically transform and disrupt themselves, using in-memory computing and digital accelerators like social, mobile, analytics, and cloud computing, can create faster, smarter, more agile, and better-connected enterprises. Companies that are quick to adopt digital technology and become disruptors in their industry are reaping rewards such as:

• Quicker time to market
• Increased customer satisfaction
• Competitive advantage

Tuesday, January 19, 2016

Seagate Launches 10TB Helium-Filled Hard Drive

According to an article in eWeek, Seagate recently unveiled its highest-capacity enterprise hard drive ever, a 10TB helium-filled design that competes directly with similar drives from HGST and Samsung. Because it is filled with helium instead of air, the 3.5-inch Seagate Enterprise Capacity HDD has less drag on its internal components, enabling them to run cooler and draw less power than standard HDDs.

Samsung currently owns the capacity record with a 16TB drive it introduced last fall, but it lists for $7,000 and probably isn't going to be high on many wish lists until the price comes down by at least 50 percent. HGST may be the furthest ahead of the group, since its 10TB helium drive came out in September 2014. Seagate is aiming the new drive at cloud-based data storage needs.
The 10TB HDD uses the standard 3.5-inch disk design and incorporates seven platters and 14 heads. Seagate said the drive features the industry's lowest power/TB ratio and weight specifications for a 10TB HDD. This breaks down to 25 percent more density to help businesses increase the number of petabytes per rack, the company said.

If anyone remembers the PCs of the 1980s, it was common to have 10-40MB of hard drive storage available. My dad had an IBM PC/AT with a 20MB hard drive, which was "handed down" to me when I was in grade school. The drive currently sits disassembled in a box in my office. I checked the date on it: it was manufactured in June 1985! For comparison, this new 10TB drive represents a 1,000,000-fold increase in capacity over a 10MB drive of that era (a quick check follows below).
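As a quick back-of-the-envelope check of that comparison (decimal units assumed, sketched in Python):

# 10TB expressed in MB, versus a typical early-1980s 10MB PC hard drive
old_drive_mb = 10
new_drive_mb = 10 * 1000 * 1000  # 1TB = 1,000,000MB
print(new_drive_mb / old_drive_mb)  # 1000000.0, i.e. a million-fold increase

Measured against my dad's 20MB PC/AT drive the factor is "only" 500,000, but the order of magnitude is the same.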

Monday, December 28, 2015

Managing Technical Debt

Everyone knows they must manage their financial debt to be successful in business. Managing “technical debt” is another matter. Far from an imaginary concept, “tech debt” is real, and almost every project incurs it over time, typically during the development and production phases. Unrealistic expectations, "scope creep," stakeholder demands, or poor project management can push software development teams to sacrifice quality or take shortcuts. While some tech debt can speed the development process, unless it is “paid back” with a revision, the longer the debt is allowed to accrue, the more difficult and expensive it becomes to fix. Just like financial debt, companies must either address it or ignore it at their own peril; eventually the “loan comes due.” Steven Rabin, writing in the December issue of SD Times, presents an excellent case for managing your technical debt. Highly recommended reading.

Tuesday, October 13, 2015

Will Nuclear Fusion Change Our World?

What if our world was powered by a cheap, safe, clean, virtually limitless, sustainable fuel source such as water? If energy is cheap and available to all nations, that could reduce global political tensions. And if it comes from a clean-burning fuel source, that reduces air pollution. Billionaires such as Amazon founder Jeff Bezos, PayPal co-founder Peter Thiel and Microsoft co-founder Paul Allen have been big supporters of research into fusion as an energy source.

"What we're really doing here is trying to build a star on Earth," said Laban Coblentz at the International Thermonuclear Experimental Reactor (ITER), a massive fusion reactor being built by 35 countries in southern France. Fusion is what keeps stars, including our own sun, burning bright. Nuclear Fusion is also the principle behind how a hydrogen bomb works.

As a source of energy, fusion works like this:

You take two gases called deuterium and tritium and you heat them under pressure to at least 100 million degrees Celsius. That's 180 million degrees Fahrenheit. These substances get so hot that they change from gas to plasma. Then they fuse together, releasing a burst of additional heat. That burst is called a fusion reaction. The heat boils water into steam, which drives a turbine and generates electricity that powers your neighborhood. To be commercially viable, you have to create more energy than you originally used to heat the fuel, but so far we haven't been able to figure that part out. Yet.
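For reference, the underlying reaction is standard fusion physics rather than anything specific to the CNN article: deuterium and tritium fuse into helium plus a neutron, releasing about 17.6 MeV per reaction, and "creating more energy than you put in" means an energy gain factor Q greater than one.

\[
{}^{2}_{1}\mathrm{D} + {}^{3}_{1}\mathrm{T} \;\longrightarrow\; {}^{4}_{2}\mathrm{He}\,(3.5\ \mathrm{MeV}) + \mathrm{n}\,(14.1\ \mathrm{MeV}),
\qquad
Q = \frac{P_{\mathrm{fusion}}}{P_{\mathrm{heating}}} > 1
\]

ITER's design goal is a Q of about 10, roughly 500 MW of fusion power from 50 MW of heating power.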

You can read more about nuclear fusion in this article by Thom Patterson of CNN.