“SaaS over PaaS over IaaS over my own data centre” – some thoughts


This headline is a quote from a friend of mine, a C-level executive at one of the biggest corporations in Finland. You might wonder why I would bring up the topic, as a representative of a company that doesn’t particularly preach cloud* and doesn’t provide a full set of services for managing key SaaS components like ERPs or CRMs.

Naturally, I want to talk about the role of custom-built software running on PaaS. I agree that when SaaS does the trick, you should choose it over a custom PaaS application.

Let’s start with the easy part: Why is PaaS superior to IaaS?

There are a number of reasons:

  • Capacity is dirt cheap
  • Labour is expensive
  • Capacity is getting cheaper rapidly
  • Labour costs are increasing

You should focus on saving labour, and this is fundamentally what using PaaS means. Your software development projects will ultimately end up costing less. Finding the cheapest possible capacity should save you a couple of hundred euros/pounds/dollars per month. On the other hand, finding the most efficient development platform saves you months in development costs and crucially, in time to market.

You might find this irrelevant if you want to use SaaS for everything – but unfortunately, you can’t. There might be a SaaS product for each of your needs, but how are you going to win overall in the marketplace?

Every company today runs its business processes digitally. When you use SaaS, you’re essentially using the processes the vendor has defined, which in many cases are world-class. Using these processes ensures you’re competitive, but by definition they are not unique, and you can’t build competitive advantage with them. When you look at your strategy and understand how you are going to be different from your competitors, you’ll understand when and where you’re going to need PaaS.

For me, this makes PaaS somewhat superior to SaaS. The things we develop run at the core of our customers’ strategy. We make a real impact. Ultimately, a partner such as Nordcloud will help you win.

There’s enough market demand among the believers to spend time on educating the non-believers (!)

    Day 2 at Re:Invent – Builders & Musicians Come Together


    When Werner Vogels makes bold statements, expectations are set high. So when Vogels tweeted 15 minutes before the start of re:Invent’s day 2 keynote, we had to wonder what was coming.

    And how right we were. The close to 3 hours spent in the Venetian hotel in Las Vegas was an experience in itself.

    Andy Jassy opened the keynote with a long list of customers and partners, alongside the latest business figures. AWS are currently running at an $18 billion run rate, with an incredible 42% YoY growth. With millions of active customers – defined as accounts that have used AWS in the last 30 days – the platform is by far the most used on the planet.

    As per Gartner’s 2016 Worldwide Market Segment Share analysis, the company (successfully led by Jassy) achieved a 44.1% market share in 2016, up from 39% in 2015 – more than everyone else combined. This was easy to see as AWS unveiled an entire catalogue of new services throughout the keynote. The general stance Jassy took this year was that AWS are trying to serve their customers exactly what they ask for in terms of new products. The mission of AWS is nothing short of fixing the IT industry in favour of end-users and customers.

    First on stage was a live ‘house’ band, performing a segment of ‘Everything Is Everything’ by Lauryn Hill, with its chorus line ‘after winter must come spring’. Presumably, AWS was referring to the world of IT still being stuck in a kind of eternal ‘winter’. The message we also heard here was that AWS will not stop building out their portfolio, and that they want to offer all the tools their ‘builders’ and customers need.

    AWS used Jassy’s keynote for some big announcements (of course, set to music), with themes across the following areas:

    • Compute
    • Database
    • Data Analytics
    • Machine Learning
    • IoT

    The Compute Revolution Goes On

    Starting in the compute services area, an overview of the vast number of compute instance types and families was shown, with special emphasis given to the Elastic GPU options. A few announcements had also been made on the Tuesday night, including Bare Metal Instances, Streamlined Access to Spot Capacity and Hibernation, making it easier for you to get up to 90% savings on normal pricing. There were also M5 instances, which offer better price-performance than their predecessors, and H1 instances, offering fast and dense storage for Big Data applications.

    However, with the rise of Kubernetes in the industry, it was the release of the Elastic Container Service for Kubernetes (EKS) that was the most eagerly anticipated. Not only have AWS recognised that their customers want Kubernetes on AWS, but they also realise that there’s a lot of manual labour involved in maintaining and managing the servers that ECS & EKS workloads run on.

    To solve this particular problem, AWS announced AWS Fargate, a fully managed service for both ECS & EKS, meaning no more server management and therefore a better ROI for running containers on the platform. It is available for ECS now and will be available for EKS in early 2018.
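
    To make the announcement concrete, here is a minimal sketch (using boto3, with a hypothetical cluster, task definition, subnet and security group) of what launching a container on Fargate looks like – note that there are no EC2 instances for you to provision or patch:

    ```python
    # Minimal sketch: run a container task on Fargate with boto3.
    # All resource names and IDs below are hypothetical placeholders.
    import boto3

    ecs = boto3.client("ecs", region_name="us-east-1")

    response = ecs.run_task(
        cluster="demo-cluster",            # assumed: an existing ECS cluster
        taskDefinition="demo-task:1",      # assumed: a registered task definition
        launchType="FARGATE",              # AWS manages the underlying servers
        count=1,
        networkConfiguration={
            "awsvpcConfiguration": {
                "subnets": ["subnet-0123456789abcdef0"],
                "securityGroups": ["sg-0123456789abcdef0"],
                "assignPublicIp": "ENABLED",
            }
        },
    )
    print(response["tasks"][0]["lastStatus"])
    ```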

    Having started with servers and containers, Jassy then moved on to the next logical evolution of infrastructure services: Serverless. With a 300% usage growth, it’s fair to say that if you’re not running something on Lambda yet, you will be soon. Jassy reiterated that AWS are building services that integrate with the rest of the AWS platform to ensure that builders don’t have to compromise. They want to make progress and get things done fast. Ultimately, this is what AWS compute will mean to the world: faster results. Look out for a dedicated EKS blog post coming soon!

    Database Freedom

    The next section of the keynote must have had some of AWS’s lawyers on the edge of their seats – and also the founder of a certain database vendor… AWS seem to have a clear goal of putting an end to the historically painful ‘lock-in’ some customers experience, referring frequently to ‘database freedom’. There are a lot of cool things happening with databases at the moment, and many of the great services and solutions shown at re:Invent are built using AWS database services. Out of all of these, Aurora is by far the fastest growing – in fact, it is the fastest growing service in the entire history of AWS.

    People love Aurora because it can scale out to millions of reads per second. It can also auto-scale new read replicas and offers seamless recovery from read replica failures. People want recovery to be even faster, which is why AWS launched a new Aurora feature, Aurora Multi-Master. This allows for zero application downtime due to any write node failure (previously, AWS suggested recovery took around 30 seconds), and zero downtime due to an availability zone failure. During 2018 AWS will also introduce the ability to have multi-region masters, allowing customers to easily scale their applications across regions while keeping a single, consistent data source.

    Last, and certainly not least, was the announcement of Aurora Serverless, an on-demand, auto-scaling, serverless version of Aurora. Users pay by the second – an unbelievably powerful feature for many use cases.
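
    As a rough illustration of the pay-per-use model, here is a sketch of creating an Aurora Serverless cluster with boto3 – the identifier, credentials and capacity limits are made up, and the parameters reflect the service as it later became generally available, so treat it as indicative rather than definitive:

    ```python
    # Sketch: provision an Aurora Serverless cluster that scales on demand
    # and pauses when idle. All names and values are illustrative only.
    import boto3

    rds = boto3.client("rds", region_name="eu-west-1")

    rds.create_db_cluster(
        DBClusterIdentifier="demo-serverless-cluster",
        Engine="aurora",                    # Aurora MySQL-compatible edition
        EngineMode="serverless",
        MasterUsername="admin",
        MasterUserPassword="change-me",     # use AWS Secrets Manager in practice
        ScalingConfiguration={
            "MinCapacity": 2,               # Aurora capacity units
            "MaxCapacity": 16,
            "AutoPause": True,              # stop compute when there are no connections
            "SecondsUntilAutoPause": 300,
        },
    )
    ```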

    Finally, Jassy turned his focus to the DynamoDB service, which scaled to ~12.9 million requests per second at its peak during the last Amazon Prime Day. Just let that sink in for a moment! DynamoDB is used by a huge number of major global companies, powering mission-critical workloads of all kinds. The reason for this, from our perspective, is that it’s very easy to access and use as a service. What was announced today was a new feature, DynamoDB Global Tables, which enables users to build high-performance, globally distributed applications.
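
    As a hedged sketch of what this looks like in practice, the original Global Tables API links identically named regional tables (each with DynamoDB Streams enabled) into one replicated table – the table name and regions below are hypothetical:

    ```python
    # Sketch: join existing regional tables into a DynamoDB Global Table.
    # Assumes an "orders" table already exists in both regions with
    # DynamoDB Streams (NEW_AND_OLD_IMAGES) enabled; names are illustrative.
    import boto3

    dynamodb = boto3.client("dynamodb", region_name="eu-west-1")

    dynamodb.create_global_table(
        GlobalTableName="orders",
        ReplicationGroup=[
            {"RegionName": "eu-west-1"},
            {"RegionName": "us-east-1"},
        ],
    )
    ```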

    The final database feature released for DynamoDB was managed backup & restore, offering on-demand backups and point-in-time recovery (within the past 35 days). Backups of hundreds of TB can be taken with no interruption, whether for data archival or regulatory requirements.

    Jassy wrapped up the database section of his keynote by announcing Amazon Neptune, a fully managed graph database which will make it easy to build and run applications that work with highly connected data sets.

    Analytics

    Next, Jassy turned to analytics, commenting that people want to use S3 as their data lake. Athena allows for easy querying of structured data within S3; however, most analytics jobs involve processing only a subset of the data stored within S3 objects, and Athena requires the whole object to be processed. To ease the pain, AWS released S3 Select – allowing applications (including Athena) to retrieve a subset of data from an S3 object using simple SQL expressions. AWS claim drastic performance increases – possibly up to 400%.
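
    To illustrate the idea, here is a minimal boto3 sketch that asks S3 to evaluate a SQL expression server-side and stream back only the matching rows – the bucket, key and column names are hypothetical:

    ```python
    # Sketch: retrieve a filtered subset of a CSV object with S3 Select.
    # Bucket, key and column names are illustrative placeholders.
    import boto3

    s3 = boto3.client("s3")

    response = s3.select_object_content(
        Bucket="analytics-demo-bucket",
        Key="logs/2017/11/events.csv",
        ExpressionType="SQL",
        Expression="SELECT s.user_id, s.status FROM S3Object s WHERE s.status = '500'",
        InputSerialization={"CSV": {"FileHeaderInfo": "USE"}},
        OutputSerialization={"CSV": {}},
    )

    # The result arrives as an event stream; print the matching records.
    for event in response["Payload"]:
        if "Records" in event:
            print(event["Records"]["Payload"].decode("utf-8"))
    ```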

    Many of our customers are required by regulation to store logs for up to 7 years, and as such ship them to Glacier to reduce the cost of storage. This becomes problematic if you need to query that data, though. “How great would it be if this could become part of your data lake?” Jassy asked, before announcing Glacier Select. Glacier Select allows queries to be run directly on data stored in Glacier, extending your data lake into Glacier while reducing your storage costs.

    Machine Learning

    The house band introduced Machine Learning with ‘Let It Rain’ by Eric Clapton. Dr Matt Wood made an appearance and highlighted how important machine learning is to Amazon itself. The company uses a lot of it, from personal recommendations on Amazon.com to fulfilment automation and inventory management in its warehouses.

    Jassy highlighted that AWS only invests in building technology that its customers need (and remember, Amazon.com is a customer!), not because it is cool or funky. Jassy described three tiers of machine learning: Frameworks and Interfaces, Platform Services and Application Services.

    At the Frameworks and Interfaces tier, emphasis was placed on the broad range of frameworks that can be used on AWS, recognising that one shoe does not fit every foot and that the best results come from using the correct tool for the job. Moving to the Platform Services tier, Jassy highlighted that most companies do not have expert machine learning practitioners (yet) – it is, after all, a complex beast. To make this easy for developers, Amazon SageMaker was announced – a fully managed service that enables data scientists and developers to quickly and easily build, train, and deploy machine learning models at any scale.
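
    To give a feel for the build-train-deploy flow, here is a hedged sketch using the SageMaker Python SDK and the built-in XGBoost algorithm – the S3 paths, IAM role and hyperparameters are made up, and parameter names follow a recent (v2) version of the SDK rather than the one available at launch:

    ```python
    # Sketch of the SageMaker build-train-deploy loop. The role ARN, S3 paths
    # and hyperparameters are illustrative assumptions.
    import sagemaker
    from sagemaker.estimator import Estimator
    from sagemaker.image_uris import retrieve

    session = sagemaker.Session()
    role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # assumed role

    # Built-in XGBoost algorithm container for the current region.
    container = retrieve("xgboost", session.boto_region_name, version="1.5-1")

    estimator = Estimator(
        image_uri=container,
        role=role,
        instance_count=1,
        instance_type="ml.m5.xlarge",
        output_path="s3://demo-bucket/models/",
        sagemaker_session=session,
    )
    estimator.set_hyperparameters(objective="reg:squarederror", num_round=100)

    # Train on data staged in S3, then deploy behind a real-time endpoint.
    estimator.fit({"train": "s3://demo-bucket/training-data/"})
    predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.large")
    ```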

    Also at the platform tier, AWS launched DeepLens, a deep-learning-enabled wireless video camera designed to help developers grow their machine learning skills. It integrates directly with SageMaker, giving developers an end-to-end solution to learn, develop and test machine learning applications. DeepLens will ship in early 2018, available on Amazon.com for $249.

    The machine learning announcements did not stop there! As Jassy moved into the Application Services tier, AWS launched a set of new services, including Amazon Transcribe, Amazon Translate, Amazon Comprehend and Amazon Rekognition Video.

    IoT

    Finally, Jassy turned to IoT – identifying five ‘frontiers’, each with its own release, either available now or in early 2018:

    1. Getting into the game – AWS IoT 1-Click (in preview) will make it easy for simple devices to trigger AWS Lambda functions that execute a specific action (see the sketch after this list).
    2. Device Management – AWS IoT Device Management will provide fleet management of connected devices, including onboarding, organisation, monitoring and remote management throughout a device’s lifetime.
    3. IoT Security – AWS IoT Device Defender (early 2018) will provide security management for your fleet of IoT devices, including auditing to ensure your fleet meets best practice.
    4. IoT Analytics – AWS IoT Analytics will make it easy to cleanse, process, enrich, store and analyse IoT data at scale.
    5. Smaller Devices – Amazon FreeRTOS, an operating system for microcontrollers.
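
    As referenced in point 1, here is a minimal sketch of the kind of Lambda function a one-click device could trigger – the event shape and the SNS topic are illustrative assumptions rather than the exact service payload:

    ```python
    # Sketch: a Lambda handler that a simple IoT button could invoke,
    # publishing a notification to SNS. The topic ARN and event fields are
    # hypothetical, not the exact IoT 1-Click payload.
    import json
    import boto3

    sns = boto3.client("sns")
    TOPIC_ARN = "arn:aws:sns:eu-west-1:123456789012:device-alerts"  # assumed topic

    def handler(event, context):
        device_id = event.get("deviceId", "unknown-device")
        sns.publish(
            TopicArn=TOPIC_ARN,
            Subject="Button pressed",
            Message=json.dumps({"device": device_id, "action": "request-assistance"}),
        )
        return {"status": "notified", "device": device_id}
    ```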

    Over the coming days and weeks, the Nordcloud team will be diving deeper into these new announcements (including our first thoughts after getting our hands on the new releases). We’ll also publish our views on how they can benefit you.

    It should be noted that, compared to previous years, AWS are announcing more outside the keynotes, in sessions and on their Twitch channel, so there are many new releases which are not gaining the attention they might deserve. Examples include T2 Unlimited, Inter-Region VPC Peering and Launch Templates for EC2 – as always, the best place to keep up to date is the AWS ‘What’s New’ page.

    If you would like to discuss how any of today’s announcements could benefit your business, please get in touch.


      What is data-driven service design?


      For a few years now, I’ve been engaged in a personal passion project of explicating what the increasing abundance of data can do for design. My most recent definition of data-driven design is that it means the digitalisation and automation of design research. In the future, data-driven design may also reach into decision making and generative design. But we’re not there yet.

      While I’ve written over the years about the concept and tools of data-driven design, my musings on the topic have been somewhat limited in scope. As I operate in the context of a digital design and development company, design has referred to interaction design: user interface design decisions and how best to implement certain features.

      I have paid little attention to several other design domains. In this article, I will venture a bit beyond my home turf. I’ll change the question and think about what we should build, instead of how we create it. This question takes a step up to a higher abstraction level, one commonly associated with service design. In the following, I’ll consider what the big data world could offer for service design.

      Why → What → How
      (business design → service design → interface design)

      What good can more data do for service design?

      Service design is a bit of a niche area of its own, originating in the 1980s. Starting from the design of banking services, it has since slowly grown into a recognised profession serving the development of many physical touchpoints. But nowadays, professionals calling themselves service designers also regularly deal with digital touchpoints.

      In the few visual depictions of the overall field of design, service design is often missing entirely: it does not appear in the well-known illustration of UX design (based on Dan Saffer), and it occupies only a small segment of human-centred design. But I assure you, it still exists, even though it is clearly far out of the spotlight of the more recent disciplines of digital design.

      How about the use of data in this domain? Public examples of data-driven service design are rare. For instance, the Netherlands chapter of the global Service Design Network was apparently among the first to host a session specifically aimed at sharing experiences with data in service design.

      The short story written about the data-driven service design event gives an opinion I can readily agree with: quantitative data must complement, challenge and give a foundation for qualitative data.

      Service design requires a mix of research inputs

      The long-term experience design specialist Kerry Bodine puts it as “service design requires a mix of research inputs.” She has expressed great concern about over-reliance on big data methods without complementary qualitative insights. This relationship has previously been highlighted by Pamela Pavliscak under the terms big and thick data, in order to emphasise their complementary nature.

      In other words, data-driven design means using more data, particularly quantitative, in the design process.

      A side note: a term that may seem relevant to data-driven service design is service analytics. Service analytics, in my opinion, are a subset of traditional analytics areas: web analytics, market intelligence and business intelligence. For instance, in Sumeet Wadhwa’s article on the topic, service analytics are presented foremost as a tool to quantify, track and manage service design efforts, not so much to inspire or to help find new design opportunities. Thus they are not a creative driver for the design process.

      “Data” for transformational design is embodied in designers, not the customer

      Data can’t solve, or even easily be used to support, all design decisions. Given that people are naturally resistant to change, defending any major change using backward-looking data is not going to be easy. In a recent post, frog founder Hartmut Esslinger offered strong criticism of misinterpretations of “big” data.

      His examples very neatly illustrate the conservative interpretation bias of data. For instance, in a 2001 Motorola case, the company discarded a touchscreen smartphone concept (later known as the iPhone) because market intelligence data clearly showed people wanted to buy phones akin to those designed by Nokia! Clearly, the data-based insight was inferior to a “designer-based” insight about what you should create.

      Solving this challenge is not easy. I’ve personally helped to articulate one user acceptance testing approach called resonance testing, originating from the American design company Continuum. This method presents a quite specific procedure for quantitatively investigating consumers’ reactions to ‘what’ questions. However, the method depends on face-to-face interactions and thus does not really fall within the domain of data-driven design as defined at the start.

      Tools for data-driven service design

      Data-driven or data-informed design does not denote any particular design approach. However, I see that it requires a certain prototypical process to support it. First and foremost, it always requires real data: representative data must be collected and analysed, and the inferences drawn from it brought to bear upon design decisions and new designs.

      Data-driven design always requires real data

      What kind of data and which tools of analysis will help service designers decide what needs to be created? In my previous writing, I’ve proposed a taxonomy of the different types of tools available for data-driven design. Starting from there, we can observe three categories of tools that hold promise in this direction: active data collection solutions, user recordings and heat maps.

      Once more, the origin of these tools is within the digital domain, in web and mobile apps, but it is more important to bear in mind that they relate first and foremost to the revision or assessment of existing features. They can give a glimpse of what else your customers might love, what they fail to achieve or which parts of the service they neglect.

      Passive records from use sessions on digital or physical touchpoints can be revealing, but active data collection – from co-design to all manner of classical qualitative research – has been the core of service design research. But are there any qualitative research methods that can scale, to provide the automation aspect I attach to the data-driven design approach?

      Different types of surveys naturally scale well. Digital environments in particular offer unprecedented opportunities to target and trigger surveys, making them much more powerful than they were in the past. Of course, they are limited by the structure of their insight. But free, open-ended responses can be very intuitive and applicable in data-driven design if we can also provide tools that automate the analysis of the inputs, not just their collection. Sentiment analysis alone, as criticised by Bodine above, is a weak method; segmentation and automated summaries can add value beyond aggregate figures alone. This is a bit futuristic but already feasible (see also Zendesk’s approach to data in automating customer service).
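
      To make the idea of automating open-ended analysis more tangible, here is a deliberately simple sketch (using scikit-learn, with invented example answers) that clusters free-text survey responses into segments and surfaces a representative answer per segment – a real project would add proper preprocessing and a dedicated sentiment model:

      ```python
      # Sketch: segment open-ended survey answers automatically.
      # The responses below are invented examples for illustration.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.cluster import KMeans

      responses = [
          "The onboarding was confusing and slow",
          "Loved how quickly I could get started",
          "Support never answered my question",
          "Getting started was effortless",
          "I could not find where to change my details",
      ]

      # Turn free text into TF-IDF vectors and group similar answers.
      vectors = TfidfVectorizer(stop_words="english").fit_transform(responses)
      kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(vectors)

      # Print one representative answer per segment as an automated summary.
      for cluster in range(kmeans.n_clusters):
          members = [r for r, label in zip(responses, kmeans.labels_) if label == cluster]
          print(f"Segment {cluster} ({len(members)} answers): {members[0]}")
      ```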

      Insights from the local industry insiders

      I had a chance to talk with Petteri Hertto, a long-term specialist in quantitative research, about the topic. He is a service designer currently working at Palmu agency in Helsinki, Finland. He says that too many projects feel obliged to gather quantitative data without good reasons. They end up with data that is non-actionable from a design point of view.

      Petteri has personally transformed from a quantitative data specialist into a designer who sees value in both types of data. “The best uses of quantitative data lie in proofing new ideas and verifying a business case around it,” he believes. Petteri has documented the model of value measurement his agency prefers in a Touchpoint article (Touchpoint magazine is the journal published by the Service Design Network).

      Are there any new tools specifically for data-driven service design?

      I further pressed Petteri on whether any (quantitative) design research tools have appeared in the past 10 years that would resemble my definition of data-driven design.

      He recounted that there are few radically new developments. In the design approach favoured by their agency, they use the same tools as UX designers, including those data-intensive ones. However, he named one novel survey tool made possible by mobile technologies. It addresses several deficiencies of validity in traditional research.

      Crowst is a Finnish startup which provides surveys targeted on verified user behaviour in the physical world, improving the quality of input.

      Then again, this is an incremental improvement over existing tools, not a radically novel approach with unforeseen data masses, new level of insight or scalability.

      Can data reveal what the customer needs?

      Are we back to square one in terms of answering the question of what the customer wants? Yes and no. I believe a thoughtful analysis of big data can serve three purposes in service design:

      1. Identify opportunities for new experiences & features
      2. Inspire solution creation
      3. Validate solutions*

      * difficult to validate without a detailed implementation and answering the how question

      However, the data about yesterday can’t really tell us what is going to happen tomorrow. We have to more or less make the future available today through scenarios and prototypes which can generate the data that illustrates the future.

      Recap

      Data-driven design at the user interface level is progressing at a good pace, but the need for qualitative insight still dominates service design. Contemporary service designers acknowledge the potential – and danger – in big data, but the tools to transform that potential into a revolution in ways of working are still missing.

      It is evident that service designers must be comfortable working with data as big as it comes. However, ready-made tools and methods are far fewer than in user interface design. Answering the fundamental question of “what to design” is notoriously difficult with data that describes things of the past.

      I believe it is possible, and within a couple of years it will be possible to an even greater extent than we can imagine today. Join the revolution today!


        How to build a business case for the cloud in financial services


        The financial services sector is well known for its caution around technological change, due to its history of regulation, legacy systems and money handling. However, with constant pressure on IT budgets and the need for digital innovation, financial services firms can no longer ignore the benefits the public cloud has to offer.

        In a recent survey, 74 percent of banks reported that the cloud will become a major factor in the industry within the next five years. Why? Because the cloud offers financial services firms a cheaper, more scalable and more agile way to improve their business operations without having to compromise on security or compliance.

        In this blog post, we’ll discuss the business case for moving your financial business to the cloud. Here are four benefits that may help solidify your decision to embrace digital transformation:

        1. Cost Reductions:

        In today’s unsteady economy, financial services firms are not just looking for ways to cope with cost pressures; they’re looking to invest in innovative technology to help them:

        • Compete with FinTech insurgents
        • Improve operational efficiency
        • Accelerate modelling and analysis
        • Serve customers better
        • Differentiate themselves in the market with innovative services

        It’s no secret that well-architected environments in the cloud are a cheaper alternative to co-location or on-premise solutions. In fact, 88 percent of financial institutions believe that the reduction in TCO (total cost of ownership) is the biggest benefit of a cloud-based infrastructure. By embracing the cloud, money that was once spent on capital infrastructure and keeping servers cool can instead be used to focus on operations and opportunities that matter to individual businesses and their customers.

         

        2. Integrated Security:

        Although traditional data centres offer businesses a physical sense of security, they aren’t always the best long-term or most economical solution. For financial services firms that are planning to scale their business and use new, innovative technologies, moving some infrastructure to the cloud is almost inevitable.

        It’s important to note that this move does not make your data less secure. Despite the security myths surrounding cloud technology, your sensitive information will remain safe in the cloud, provided your financial business has a clear and strictly implemented security policy. Your level of security also depends on your cloud service provider.

        Here are some examples of security features your provider can offer:

        • Access control that allows you to choose who in your business can, or cannot, access sensitive data. For instance, Azure Active Directory helps ensure that only authorised individuals can access your applications, environments and data.
        • Behaviour analytics that can detect threats and anomalies and report any unusual behaviour or unauthorised access.
        • Integrated security across all of your business’ applications, meaning that employees are safe to work from anywhere.
        • Continuous monitoring of your servers, applications and networks to detect threats. Businesses also have the option to deploy third-party security solutions within their cloud environment, such as firewalls and anti-malware.
        • Physical security. For example, Azure’s regional data centres include fencing, CCTV and security teams.
        • Shared security with your cloud service provider. In order to keep your data as secure as possible, your business needs to take responsibility for its own security practices with regard to your applications and employees.

        3. Quick & Economic Scalability:

        Growth is important to all businesses, particularly when you’re handling new customers and data on a daily basis. With the cloud, companies can scale and deploy releases quickly and continuously (either automatically or manually) according to demand, or reduce resources if needed. This means that you only need to pay for the platforms that you actively use across your business or IT infrastructure.

        For example, an insurance firm may need to run complex risk models to respond to market changes. Doing these calculations in the cloud lets them access hundreds or even thousands of processors to complete the modelling quickly. Yet, this performance isn’t required all the time so delivering this kind of high-performance computing cluster in-house is prohibitively expensive.

        4. Big Data, automation, and analytics:

        Storing large amounts of data isn’t much use if you don’t know how to handle it. Unlike traditional storage solutions, the cloud delivers a better, cheaper and more personalised approach to big data and analytics. The intelligent insights you can harness allow you to deliver the best internal and customer engagement actions for your business.

        Big data and integrated customer relationship management tools can allow you to:

        • Gain a 360-degree view of customer information and profiles. This can allow financial services to gain a better insight into customer trends and risks, as well as offer an opportunity to improve customer services.
        • Analyse transactions and operations quickly without having to manually look through masses of documents and memos. Ultimately, this allows the financial services to gain better insight into market trends, provide better customer service and help to automate some time-consuming workloads.
        • Access structured and unstructured data quickly across one integrated platform. Financial services are able to analyse this data to gain insights into regulatory risks across all disparate sources.

        Embracing Change

        Due to the regulations surrounding financial institutions, these organisations have historically been slower at adopting the cloud than other verticals.

        However, the competition is not going away. The financial services industry is being challenged by more digitally focused banks that are able to offer their services quicker and more efficiently. This is because they have built their infrastructure in the cloud.

        Using an experienced partner can help you to achieve the maximum benefits the cloud has to offer. So, if your financial business is ready to make the move to the cloud, contact us here. We’d love to hear from you.
