2020 Blog, Blog, Cloud Blog, Featured

Amazon WorkSpaces is a simple-to-use, cloud-based, managed, secure desktop solution. It is a one-click deployment product available with Windows and Linux operating systems. The main advantages of using Amazon WorkSpaces are as follows.

  • Easy-to-provision Desktop as a Service (DaaS)
  • Provisioning, de-provisioning and lifecycle management using your existing ITSM tools (ServiceNow, Jira Service Desk or Freshservice)
  • Extend your existing on-premises desktops/laptops with Amazon WorkSpaces and manage them centrally
  • Secure data with a reliable, highly available desktop solution
  • Cost-effective and on-demand flexibility
  • Manage and scale up or down based on business need in a centralized way
  • Accelerate deployment at scale

Need for a Secure and Effective Cloud End User Computing Model

Amazon WorkSpaces helps you adopt a secure, managed, cloud-based virtual desktop model to meet your End User Computing (EUC) requirements. It also helps organizations move away from the pain of procuring, deploying, and managing a complex desktop environment. The traditional model has the added challenge that hardware and licenses can be scaled up at additional cost when needed, but cannot be scaled down, leaving unwanted cost after a seasonal spike. Amazon WorkSpaces helps organizations scale up and down based on demand and deploy at scale with a few-click deployment model and enhanced security for your cloud desktops. Relevance Lab’s pre-baked solution helps IT teams with minimal AWS knowledge adopt DaaS using existing ITSM platforms or a custom cloud portal.

Best Practices for Network Design for Amazon WorkSpaces


  • VPC – It is recommended to use a separate VPC for your WorkSpaces implementation. This helps you define the required governance and security guardrails by creating traffic separation.
  • Directory Service – Each AWS Directory Service deployment requires a pair of subnets for high availability across Availability Zones.
  • Subnet size – Subnet sizes are permanent and cannot be modified, so plan for future capacity. You can define a default security group for your directory service, which applies to all the WorkSpaces under that directory. Additionally, multiple directory services can use the same subnet.
  • Network Connectivity – Whether you are looking for a pure cloud solution for your Amazon WorkSpaces or planning to integrate with your existing on-premises setup, AWS helps you achieve both using the options below (a short provisioning sketch follows these options).
Option 1 – Extend your existing directory to the AWS Cloud.
Option 2 – Utilize your existing on-premises Microsoft Active Directory by using the AWS Directory Service AD Connector.
Option 3 – Integrate your on-premises server with AD Connector to provide multi-factor authentication (MFA) to your WorkSpaces.
Option 4 – Create a managed directory with AWS Directory Service, Microsoft AD or Simple AD, to manage your users and WorkSpaces.
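To make the directory and subnet recommendations above concrete, here is a minimal sketch in Python (boto3) that registers an existing directory for Amazon WorkSpaces across a pair of subnets in two Availability Zones. The region, directory ID and subnet IDs are placeholder values, not part of the original solution; adapt them to your own environment.

    import boto3

    workspaces = boto3.client("workspaces", region_name="us-east-1")

    # Placeholder IDs: replace with your own directory and the two subnets
    # created in different Availability Zones inside the dedicated WorkSpaces VPC.
    DIRECTORY_ID = "d-1234567890"
    SUBNET_IDS = ["subnet-0aaa1111bbbb2222c", "subnet-0ddd3333eeee4444f"]

    # Register the directory so WorkSpaces can be launched into these subnets.
    workspaces.register_workspace_directory(
        DirectoryId=DIRECTORY_ID,
        SubnetIds=SUBNET_IDS,
        EnableWorkDocs=False,
        EnableSelfService=False,
    )

Once the directory is registered, WorkSpaces can be provisioned against it from the console, the CLI or an ITSM connector.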

Observability of AWS WorkSpaces

This deals with managing the WorkSpaces lifecycle from creation through usage to termination in an optimal manner. It covers the following three areas.

  1. Security and Governance
     As per AWS best practices, every individual user account should be set up with IAM roles carrying the right permissions and with multi-factor authentication (MFA) enabled. Different WorkSpaces on the same physical host are isolated from each other through the hypervisor, as though they were on separate physical hosts.

  2. Health Monitoring
     CloudWatch metrics for WorkSpaces give insight into the overall health and connection status of all WorkSpaces, either per desktop or aggregated for all WorkSpaces within a directory. Apart from the default metrics, you can also enable additional metrics (see the monitoring sketch after this list).

  3. Cost Optimization
     Amazon WorkSpaces billing is based on usage, and there are two running modes to choose from by default.

    • AlwaysOn – This is the best option when you are on monthly billing and your usage is typically around 6 to 9 hours a day.
    • AutoStop – This is the ideal option when you are on hourly billing. You can have the WorkSpaces stop after a specified period of inactivity, which stops the billing.
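As a simple illustration of the health monitoring point above, the sketch below (Python with boto3, placeholder IDs) pulls the Unhealthy metric from the AWS/WorkSpaces CloudWatch namespace, aggregated for all WorkSpaces in a directory; the same pattern works for per-WorkSpace metrics such as ConnectionAttempt or InSessionLatency.

    import boto3
    from datetime import datetime, timedelta

    cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

    # Placeholder directory ID: metrics are aggregated for all WorkSpaces in it.
    DIRECTORY_ID = "d-1234567890"

    # Maximum number of WorkSpaces reported unhealthy in each 5-minute window
    # over the last 24 hours.
    stats = cloudwatch.get_metric_statistics(
        Namespace="AWS/WorkSpaces",
        MetricName="Unhealthy",
        Dimensions=[{"Name": "DirectoryId", "Value": DIRECTORY_ID}],
        StartTime=datetime.utcnow() - timedelta(hours=24),
        EndTime=datetime.utcnow(),
        Period=300,
        Statistics=["Maximum"],
    )

    for point in sorted(stats["Datapoints"], key=lambda p: p["Timestamp"]):
        print(point["Timestamp"], point["Maximum"])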

One of the best practices is to monitor the usage of the WorkSpaces running mode using the Amazon WorkSpaces Cost Optimizer. This solution uses an Amazon CloudWatch event that invokes an AWS Lambda function every 24 hours, which can then convert your WorkSpaces to the most cost-effective model (hourly to monthly, or monthly to hourly) from the next billing cycle, based on your usage pattern.
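The Cost Optimizer referenced above is a packaged AWS solution; as a rough sketch of the underlying idea only, the snippet below switches a single WorkSpace between AUTO_STOP and ALWAYS_ON running modes based on an assumed monthly usage threshold. The threshold, WorkSpace ID and decision logic are illustrative assumptions, not the solution's actual algorithm.

    import boto3

    workspaces = boto3.client("workspaces", region_name="us-east-1")

    # Placeholder WorkSpace ID and an assumed monthly-usage threshold (hours)
    # above which monthly (ALWAYS_ON) billing tends to be cheaper than hourly.
    WORKSPACE_ID = "ws-abc123def"
    MONTHLY_USAGE_THRESHOLD_HOURS = 80

    def set_running_mode(workspace_id, hours_used_this_month):
        """Switch the running mode so the next billing cycle uses the cheaper model."""
        mode = (
            "ALWAYS_ON"
            if hours_used_this_month >= MONTHLY_USAGE_THRESHOLD_HOURS
            else "AUTO_STOP"
        )
        properties = {"RunningMode": mode}
        if mode == "AUTO_STOP":
            # Stop the WorkSpace (and its hourly billing) after 60 idle minutes.
            properties["RunningModeAutoStopTimeoutInMinutes"] = 60
        workspaces.modify_workspace_properties(
            WorkspaceId=workspace_id,
            WorkspaceProperties=properties,
        )

    set_running_mode(WORKSPACE_ID, hours_used_this_month=95)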



Automation

WorkSpaces provisioning can be automated using your existing ITSM platforms like ServiceNow, Jira Service Desk or Freshservice. Existing connectors like the AWS Service Management Connector and the RLCatalyst Service Management Connector provide end-to-end automation.
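As a hedged sketch of what such end-to-end automation can look like behind an ITSM request, the AWS Lambda-style handler below provisions a WorkSpace with boto3 once a catalog item is approved. The event field names (user_name, bundle_id) and the directory ID are hypothetical; the actual connectors map ServiceNow request variables to these parameters in their own way.

    import boto3

    workspaces = boto3.client("workspaces")

    # Placeholder ID of the directory already registered for WorkSpaces.
    DIRECTORY_ID = "d-1234567890"

    def handler(event, context):
        """Provision a WorkSpace for the requesting user after ITSM approval."""
        # Hypothetical event fields populated by the ITSM connector.
        user_name = event["user_name"]
        bundle_id = event["bundle_id"]

        result = workspaces.create_workspaces(
            Workspaces=[
                {
                    "DirectoryId": DIRECTORY_ID,
                    "UserName": user_name,
                    "BundleId": bundle_id,
                    "WorkspaceProperties": {
                        "RunningMode": "AUTO_STOP",
                        "RunningModeAutoStopTimeoutInMinutes": 60,
                    },
                }
            ]
        )
        # FailedRequests is non-empty when a request could not be queued.
        return {
            "pending": [w["WorkspaceId"] for w in result["PendingRequests"]],
            "failed": result["FailedRequests"],
        }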


AWS Products Used


Relevance Lab is a specialist AWS partner for Desktop as a Service using Amazon WorkSpaces. It has implemented WorkSpaces for its clients with pre-integrated, secure and mature solutions using their existing ITSM tools. This has helped customers adopt the cloud faster and accelerated their cost optimization journey. Relevance Lab’s DaaS solution offering starts with an assessment questionnaire that can help your organization understand the need to migrate to a secure, scalable and mature solution. Based on the assessment scorecard, we recommend the right solution based on automation, security, governance and compliance models.

This blog refers to the standard Desktop as a Service offering using Amazon WorkSpaces. In more advanced scenarios, adoption of DaaS also involves additional steps like storage, log monitoring, security analytics (SIEM, SOAR), mail and office suite options, container deployment and application security signing, which will be covered in a separate blog.


For more details or for the assessment questionnaire, please reach out to marketing@relevancelab.com




2020 Blog, Blog, Featured, RLCatalyst Blog, ServiceNow

Relevance Lab, in partnership with ServiceNow and AWS, has launched a new solution (a ServiceNow scoped application) to consume Intelligent Automation BOTs from within the ServiceNow self-service Portal, with 1-click automation of assets and service requests using the Information Technology Service Management (ITSM) governance framework. This RLCatalyst BOTs Service Management (RLCatalyst BSM) connector is available for private preview and will soon also be available on the ServiceNow Marketplace. It integrates with the ServiceNow self-service Portal and Service Catalog to dynamically publish an enterprise library of BOTs for achieving end-to-end automation across infrastructure, applications, service delivery and workflows. This solution builds on the concept of an “Automation Service Bus” architecture explained in an earlier blog.

The biggest benefit of this solution is a transition to a “touchless” model for automation within the ServiceNow Self Service Portal, with a dynamic sync of enterprise automation libraries. It provides the ability to add new automation without the need to build custom forms or workflows inside ServiceNow. This makes the creation, publishing and lifecycle management of BOTs automation within the existing ITSM and cloud governance models frictionless, leading to faster rollout and ROI. Customers adopting this solution can optimize ServiceNow and cloud operations costs significantly with self-service models. A typical large enterprise Service Desk team gets a huge volume of inbound tickets on a daily basis, and more than 50% of these can be re-routed to self-service requests with a proper design of the service catalog, automation and user training. With every ticket (fulfilment cost normally US $5-7) now handled by BOTs, there is a significant and measurable ROI along with faster fulfilment, better user experience and system-based compliance that helps in audits.

Following are the key highlights of this solution

  • Rendering of RLCatalyst BOTs under ServiceNow Service Catalog for 1-Click order and automation with built in workflow approval models.
  • Ability of ServiceNow Self Service users to order any Automated Service Request from this standard catalog, covering common workflows such as the following.
    • Password Reset Requests.
    • User Onboarding.
    • User Offboarding.
    • AD/SSO/IDAM integration.
    • Access and Control for apps, tools, and data.
    • G-Suite/O365/Exchange Workflows.
    • Installation of new software.
    • Any standard service request made available by enterprise IT in a standard catalog.
  • Security and approvals integrated with existing ServiceNow and AD user profiles.
  • Ability to invoke any BOT from the RLCatalyst BOTs server, which provides integration with agent-based, agentless, Lambda function, script, API-based and UI-based automation functionality.
  • A pre-built library of 100+ BOTs provided as out-of-the-box solution.

As a complementary solution to the AWS Service Management Connector, customers can achieve complete automation for their asset and service requests with secure governance. For assets consumed on non-AWS footprints like VMware, Azure and on-prem systems, the solution supports automation with Terraform templates to address hybrid-cloud platforms.

What are BOTs?
BOTs are automation functionality dealing with common DevOps, TechOps, ServiceOps, SecurityOps and BusinessOps tasks. BOTs follow an Intelligent Automation maturity model, as explained in an earlier blog.

  • BOTs Intelligent Maturity Model
    • Task Automation.
    • Process Automation.
    • Decisioning Driven Automation.
    • AI/ML Based Automation.

BOTs vs Traditional Automation

  • BOTs are reusable – separation of Data and Logic.
  • BOTs support multiple models – AWS Lambda functions, scripts, agent/agentless, UI BOTs, etc., with better coverage.
  • BOTs are managed in a Code repository with Config Management (Git Repo) – this allows the changes to be “Managed” vs “Unmanaged scripts”.
  • BOTs are wrapped in YAML definitions and exposed as service APIs – this allows BOTs to be invoked from third-party apps (like ServiceNow); see the sketch after this list.
  • BOTs are “Managed & Supervised Runs” – BOT Orchestrator manages the lifecycle to bring in Security, Compliance, Error Handling and Insights.
  • BOTs have a Lifecycle for Intelligent Maturity.
  • Open Source Platform that can be extended and integrated with existing tools on a journey to achieve AIOps Maturity.
  • Very deeply embedded with ServiceNow and leverages data and transaction integration in a bi-directional way.
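To illustrate the "exposed as service APIs" point in the list above, here is a minimal, hypothetical sketch of how a third-party application such as ServiceNow could invoke a BOT over REST from Python. The endpoint path, payload shape and authentication scheme are assumptions made for illustration; the actual RLCatalyst BOTs Server API may differ.

    import requests

    # Hypothetical values: the real BOTs Server URL, BOT id and auth scheme may differ.
    BOTS_SERVER_URL = "https://bots.example.com/api/v1"
    BOT_ID = "user-onboarding"

    def invoke_bot(bot_id, parameters, api_token):
        """Invoke a BOT exposed as a service API and return the run details."""
        response = requests.post(
            f"{BOTS_SERVER_URL}/bots/{bot_id}/executions",   # assumed endpoint
            json={"parameters": parameters},                  # assumed payload shape
            headers={"Authorization": f"Bearer {api_token}"},
            timeout=30,
        )
        response.raise_for_status()
        return response.json()

    # Example: a ServiceNow workflow step could call this after approval.
    run = invoke_bot(BOT_ID, {"user_email": "jane.doe@example.com"}, api_token="***")
    print(run)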

The following image explains the RLCatalyst BOTs Service Management Architecture.

How does RLCatalyst BOTs Service Management work?
Integrating your ServiceNow instance with the RLCatalyst BOTs Server helps you publish self-service driven automation to your ServiceNow Service Portal without the need for custom coding or form design. Your users can order items from the Service Catalog, which are then fulfilled by BOTs while maintaining a record of the transactions in ServiceNow via Service Requests.

The ServiceNow administrator first downloads the scoped application and installs it in their ServiceNow instance. The application can be deployed from the GitHub repository provided by Relevance Lab. In the near future, this application will also be available from the ServiceNow Application Store.

Once installed, the application is configured by the ServiceNow administrator, who fills in the “BOTs Server Configuration” form. The required parameters are the BOTs Server URL, Server Name, Is Default, Username and Password. This information is stored in the ServiceNow instance and is then used to discover and publish BOTs from the RLCatalyst BOTs Server.

The application administrator clicks on the Discover BOTs screen to retrieve the list of the latest BOTs available on the BOTs Server. Once this list is displayed, the administrator can choose the BOTs they want to publish and select the kind of workflow to associate with each BOT (none, single or multi-level approvals). When the administrator clicks the Publish button, the BOTs are published to the Service Portal along with all the input forms associated with each BOT.

End-users can then use the self-service Catalog items to request fulfilment by BOTs.

What is the standard library of RLCatalyst BOTs available with this solution?
RLCatalyst provides a library of 100+ BOTs for common Service Management tickets and can help achieve up to 30-50% automation with out-of-the-box functionality across multiple functional areas, as explained in the diagram below.

  • User Onboarding and Offboarding.
  • Cloud Management.
  • DevOps.
  • Notification Services.
  • Asset Management.
  • Software and Applications Access Management.
  • Monitoring and Remediation.
  • Infrastructure Provisioning with integration to AWS Service Catalog.

Summary of Solution benefits
The RLCatalyst BOTs Service Management connector provides an enterprise-wide automation solution integrating ServiceNow with hybrid cloud assets, with the ability to support self-service models. The automation of asset and service requests provides significant productivity gains for enterprises; in our own experience it has resulted in 10 FTE of productivity gains, 70% automation of inbound requests and more than US $500K of annual savings on operations costs (including reduced headcount), ITSM license costs and optimized cloud asset usage with compliance, along with 50% efficiency gains on internal IT workflows.

Following are some key blogs with details of solutions addressed with this RLCatalyst BSM connector.


For more details, please feel free to reach out to marketing@relevancelab.com




2020 Blog, Blog, Featured

Relevance Lab, in partnership with AWS, has launched a new solution to enable self-service collaboration for Scientific Computing using AWS Cloud resources. Scientific research is enabling new innovations and discoveries in multiple fields to make human life better. There are large and complex programs funded by governments, the public sector and private organizations, and higher education institutions and universities globally have a specialized focus on research programs.

Some research institutions already use an existing ITSM portal for self-service, and our previous blog explains the solution integrated with popular ITSM tools like ServiceNow (AWS Research Workbench). In this blog we cover the common scenario of research institutions needing an open-source based custom self-service platform to integrate a community within the institution, and also with outside organizations, in a federated manner.

Why do we need an RLCatalyst Research Gateway cloud solution?
Research is a specialized field, with the community focusing on using “Science” to find common solutions to human problems in areas such as health and medicine, space and earth sciences. The need to drive frictionless research across geographies requires the ability to focus on “Science” while addressing the specific needs of People-Programs-Resources interactions. The “RLCatalyst Research Gateway” acts as a bridge, providing seamless and secure interactions and access to programs and budgets, with the ability to consume and manage the lifecycle of research-related computational and data resources.


  • PEOPLE – A specialized group of researchers collaborating across organizations, disciplines and countries with open collaboration needs.
  • PROGRAMS – Specialized research programs, funding, grants, governance, reporting, publishing outcomes, etc.
  • RESOURCES – High Performance Computing resources, large data for studies, analytics models, data security and privacy considerations, sharing and collaboration, common federated identity and access management, etc.

The key requirements for the cloud-based RLCatalyst Research Gateway are the following.

  • Standard Research Needs
    • Roles, Workflows, Research Tools, Governance, Access and Security, Integration.
    • People-Programs-Resources Interactions.
    • Intramural and Extramural Research.
    • Infrastructure, Applications, Data, and Analytics.
  • Built on Cloud
    • Easy to deploy, consume, manage and extend – should align with existing infrastructure, applications, and cloud governance.
    • Leverage AWS Research products.
  • Leverage Open-Source with an enterprise support model
    • Supports both Self-hosting and Managed Hosting options.
    • Cost effective – pre-built IP and packaged service offerings.

The diagram below explains the RLCatalyst Research Gateway cloud solution. The solution provides researchers with one-click access to collaborative computing environments operating across teams, research institutions, and datasets while enabling internal IT stakeholders to provide standard computing resources based on a Service Catalog, manage, monitor, and control spending, apply security best practices, and comply with corporate governance.

Using the RLCatalyst Research Gateway cloud solution
The basic setup models a research institution or university that needs support for different research departments, principal investigators, researchers, project catalogs and budgets. The diagram below shows a typical setup of the key stakeholders and the different entities inside the RLCatalyst Research Gateway.

  • Research Organization/Institution.
  • Research Departments.
  • Principal Investigators.
  • Researchers.
  • Site Administrator.
  • Project Catalog of Cloud Products.
  • Budget for Project and Researcher.

RLCatalyst Research Gateway solution map
There are three key role-based sets of functionality built into the RLCatalyst Research Gateway solution, related to the following.

  • Researcher Workflows.
  • Principal Investigator Workflows.
  • Site Administrator Workflows.

The RLCatalyst Research Gateway solution components
A number of AWS components have been used to build the RLCatalyst Research Gateway solution to make it easier for the research community to focus on science and not on the headaches of managing cloud infrastructure. At the same time, existing investments of research institutions on AWS are leveraged and best practices are integrated without the need for custom or proprietary solutions. Following is a sample list of AWS products used in RLCatalyst Research Gateway; more products can be easily integrated.

  • AWS Service Catalog – Core products available for research consumption (see the provisioning sketch after this list).
    • AWS SageMaker Notebook.
    • AWS EC2 Instances.
    • AWS S3 Buckets.
    • AWS WorkSpaces.
    • AWS RDS Data Store.
    • AWS HPC (high performance computing).
    • AWS EMR.
  • AWS Cognito for Access and Control.
  • AWS Control Tower for Account management and governance.
  • AWS Cost Explorer and Billing for Project and Researcher budget tracking.
  • AWS SNS and AWS EventBridge for Notification Services.
  • AWS CloudFormation for template designs.
  • AWS Lambda for Serverless computing.
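To show how the Service Catalog piece above can translate into code, here is a minimal boto3 sketch that provisions a catalog product (for example, a SageMaker notebook product) on behalf of a researcher. The product and artifact IDs, provisioned product name and parameters are placeholder values; the Research Gateway drives this orchestration through its own portal rather than through direct API calls.

    import boto3

    servicecatalog = boto3.client("servicecatalog", region_name="us-east-1")

    # Placeholder IDs: the product and provisioning artifact (version) shared
    # with the researcher's account via an AWS Service Catalog portfolio.
    PRODUCT_ID = "prod-abcd1234efgh"
    PROVISIONING_ARTIFACT_ID = "pa-abcd1234efgh"

    response = servicecatalog.provision_product(
        ProductId=PRODUCT_ID,
        ProvisioningArtifactId=PROVISIONING_ARTIFACT_ID,
        ProvisionedProductName="research-notebook-jane-doe",
        ProvisioningParameters=[
            # Hypothetical parameter exposed by the underlying CloudFormation template.
            {"Key": "InstanceType", "Value": "ml.t3.medium"},
        ],
    )

    record = response["RecordDetail"]
    print(record["RecordId"], record["Status"])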

The RLCatalyst Research Gateway solution, created in partnership with AWS, is available in an open-source model with enterprise support options. The solution can be deployed in a self-hosted cloud or used in a managed hosting model, with customization options available as needed.

For a demo video, please click here

For more details, please feel free to reach out to marketing@relevancelab.com


