Introduction
Automation testing refers to testing software in which the tester writes a test script once, with the help of testing tools and frameworks, and runs it against the software. The test script then tests the software automatically, without human intervention, and reports the results.
Prerequisites for Automation: To begin automation testing using Selenium, there are a few prerequisites that should be in place:
Programming Language: Choose the programming language in which you will write your test scripts. Java, Python, C#, and Ruby are common choices for Selenium automation, and you should have a good grasp of the chosen language. In this blog, we have chosen Java.
Integrated Development Environment (IDE): Install an integrated development environment (IDE) such as Eclipse, IntelliJ IDEA, or Visual Studio Code to write and manage your Selenium scripts. We have chosen Eclipse as IDE platform.
Java Development Kit (JDK): If you opt for Java, you’ll need to install the Java Development Kit (JDK) on your system.
Selenium WebDriver: Download the Selenium WebDriver for your preferred programming language. You can add the WebDriver libraries to your project using build tools like Maven.
Web Browsers: Make sure you have the web browsers you intend to automate (e.g., Chrome, Firefox) installed on your system.
Web Drivers: Selenium interacts with browsers through web drivers. You should have the appropriate web drivers for the browsers you plan to test with (e.g., Chrome Driver, Gecko Driver). These should be downloaded and configured. Ensure that your chosen IDE is integrated with the Selenium WebDriver, making it easier to write, run, and debug test scripts.
Test Framework: Select a test framework such as TestNG, or a hybrid framework. Test frameworks help structure your tests and provide reporting capabilities.
By meeting these prerequisites, you’ll be well prepared to start automation testing using Selenium and create efficient, maintainable, and effective test scripts.
The following video showcases a small presentation on automation testing:
Tools used for Automation testing
Eclipse: Eclipse is a digital workspace for automation testing: a software tool that makes creating and running automated tests easier. Testers use it because it is very flexible and works with different programming languages. It helps in writing, organizing, and running tests to find problems or bugs in software, making sure everything works smoothly.
Selenium: Selenium is open source and supports multiple languages such as Java, Python, and Ruby. Scripts can be run on multiple browsers such as Chrome, Firefox, IE, and Microsoft Edge, and on multiple operating systems, including Linux, Windows, and macOS. It can also be integrated with third-party tools such as TestNG and Cucumber.
TestNG: TestNG is a testing framework used with Selenium for automating tests. It allows you to organize test cases and run tests in a specific sequence. It offers annotations to manage the test execution flow, such as @Test for defining test cases, @BeforeMethod and @AfterMethod for pre- and post-test setup, and @DataProvider for parameterization. The combination of Selenium and TestNG enables efficient, structured automation testing, letting testers create, manage, and execute test cases reliably. TestNG also generates HTML reports showing test execution details.
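As an illustration of how these annotations fit together, a minimal TestNG test class might look like the following sketch. It assumes the selenium-java and testng dependencies are on the classpath (for example, via Maven) and that a ChromeDriver executable is available; the URL and the assertion are illustrative only.

```java
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.testng.Assert;
import org.testng.annotations.AfterMethod;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Test;

public class LoginPageTest {
    private WebDriver driver;

    @BeforeMethod
    public void setUp() {
        // Runs before every @Test method: start a fresh browser session
        driver = new ChromeDriver();
    }

    @Test
    public void titleShouldContainExample() {
        // The actual test case: open a page and check its title
        driver.get("https://example.com");
        Assert.assertTrue(driver.getTitle().contains("Example"));
    }

    @AfterMethod
    public void tearDown() {
        // Runs after every @Test method: close the browser
        driver.quit();
    }
}
```

TestNG discovers the @Test method, wraps it with the @BeforeMethod/@AfterMethod lifecycle calls, and includes the result in its HTML report.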
Maven: Maven is an open-source build tool. Maven downloads all the JARs and dependencies and manages the lifecycle of a Selenium Java project. This makes it easier for QA to configure dependencies for Selenium Java, because the JARs are downloaded automatically from the Maven repository.
Advantages of Automation testing
Efficiency: Automation testing can execute repetitive and complex tasks faster than manual testing.
Accuracy: Automated tests perform the same steps precisely every time, reducing the chance of human error.
Reusability: Test scripts can be reused across different phases of development and in various testing scenarios.
Faster Feedback: Automated tests provide rapid feedback on the software’s stability and functionality.
Let us see how we can execute the test cases in the below video:
Transforms allow you to manipulate attribute values while provisioning to a source. They help in manipulating any incoming data from the source as per the requirement.
Transforms are configurable objects that define easy ways to manipulate attribute data without requiring you to write code. Transforms are configurable building blocks with sets of inputs and outputs.
As we can see in the diagram, the flow is Input – Transform – Output. The input value comes from an identity attribute or an account attribute, the transform is written according to the requirement, and the output shows the result.
The Transform syntax has the following properties:
The basic required properties of a transform are name, type, and attributes.
For name, you can choose any value; it will be reflected in your identity profile.
For type, specify the transform that matches your requirement.
For input, the developer can decide whether to take the value from an identity attribute or an account attribute.
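Putting the three properties together, a minimal concatenation transform that joins a first and last name might look like the sketch below; the attribute names firstname and lastname are illustrative and depend on your identity profile.

```json
{
  "name": "Full Name",
  "type": "concat",
  "attributes": {
    "values": [
      { "type": "identityAttribute", "attributes": { "name": "firstname" } },
      " ",
      { "type": "identityAttribute", "attributes": { "name": "lastname" } }
    ]
  }
}
```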
Basic String Operations
These are the basic string operations; there are 18 transforms in this category, and they are commonly used in most operations. Each transform is discussed below.
1) Base64 Decode – This transform converts a Base64-encoded value back to a string; it is used for decoding.
2) Base64 Encode – This transform converts a string to Base64; it is used for encoding.
3) Concatenation – The concatenation transform combines two strings; it is commonly used to combine a first name and a last name.
4) Index Of – The index-of transform gets the location of a specific substring. If the given substring is found, the transform returns its index number; if it is not found, it returns -1.
5) Substring – The substring transform takes a specific part of a string, given a begin index and an end index.
6) Split – The split transform splits a string based on a provided delimiter. It is often useful when you want to split combined names into their constituent parts, or when you want to simplify an ordered list of values into a single attribute.
7) Left Pad – The left-pad transform pads the left side of a string with a user-supplied character out to a specific number of characters. It is often useful in data-normalization situations, such as when user IDs are not uniform in length.
8) Right Pad – The right-pad transform pads the right side of a string with a user-supplied character out to a specific number of characters. Like left pad, it is useful for data normalization.
9) Replace – The replace transform replaces a specific part of a string based on a provided regex.
10) Replace All – The replace-all transform replaces parts of a string based on a table of key-value pairs provided as an argument.
11) Upper – The upper transform converts a string to uppercase letters.
12) Lower – The lower transform converts a string to lowercase letters.
13) Static – The static transform returns a fixed string value or, more commonly, evaluates Velocity. It can also take other dynamically provided variables as inputs to its value attribute.
14) Last Index Of – The last-index-of transform gets the last location of a specific substring.
15) Trim – The trim transform trims whitespace from both the beginning and the end of input strings.
16) Get End of String – The get-end-of-string transform is an out-of-the-box rule transform provided through SailPoint’s Cloud Services Deployment Utility rule. It allows you to get the rightmost N characters of a string.
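The transforms above require no code, but their behavior maps closely onto ordinary string operations. As a rough illustration in plain Java (the language chosen earlier in this post), here is what several of them do under the hood:

```java
import java.util.Base64;

public class StringTransformDemo {
    public static void main(String[] args) {
        // Base64 encode then decode (transforms 1 and 2)
        String encoded = Base64.getEncoder().encodeToString("Hello".getBytes());
        String decoded = new String(Base64.getDecoder().decode(encoded));

        // Concatenation of a first and last name (transform 3)
        String fullName = "John" + " " + "Doe";

        // Index of: returns -1 when the substring is absent (transform 4)
        int found = "identity".indexOf("tit");   // substring starts at index 4
        int missing = "identity".indexOf("xyz"); // not found, so -1

        // Substring with a begin and end index (transform 5)
        String part = "SailPoint".substring(0, 4);

        // Left pad a user ID out to 6 characters with '0' (transform 7)
        String padded = String.format("%6s", "42").replace(' ', '0');

        // Replace via regex, upper case, and trim (transforms 9, 11, 15)
        String replaced = "a-b-c".replaceAll("-", ".");
        String upper = "name".toUpperCase();
        String trimmed = "  text  ".trim();

        System.out.println(decoded + "," + fullName + "," + found + "," + missing
                + "," + part + "," + padded + "," + replaced + "," + upper + "," + trimmed);
    }
}
```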
2) Date Format – The date-format transform converts datetime strings from one format to another. It is useful when you are syncing data from one system to another, because each application may use a different format for date and time data.
3) Date Math – The date-math transform performs mathematical operations such as addition, subtraction, and rounding on a timestamp.
It also allows you to work with a referential value of “now” to run operations against the current date and time instead of a fixed value.
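As a plain-Java illustration of what these two transforms do (the input format and the 30-day offset are examples only):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class DateTransformDemo {
    public static void main(String[] args) {
        // Date format: convert MM/dd/yyyy into ISO-8601 (yyyy-MM-dd)
        DateTimeFormatter in = DateTimeFormatter.ofPattern("MM/dd/yyyy");
        LocalDate date = LocalDate.parse("03/05/2024", in);
        String iso = date.format(DateTimeFormatter.ISO_LOCAL_DATE);

        // Date math: add 30 days, the kind of offset a
        // "30 days post termination" calculation would use
        String plus30 = date.plusDays(30).toString();

        System.out.println(iso + " " + plus30);
    }
}
```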
Under the generators category, six different transforms are present.
1) Generate Random String – The generate-random-string transform is provided through SailPoint’s Cloud Services Deployment Utility rule. It allows us to generate a random string of any length.
2) Random Alphanumeric – The random-alphanumeric transform generates a random string of any provided length, comprising both numbers and letters (lowercase and uppercase). If no length is provided, the default output length is 32 characters; the maximum allowable value is 450 characters.
3) Username Generator – Use the username-generator transform to set the logic for determining a unique value for an attribute in an account create profile. The generator’s logic can be as basic as combining elements of an HR record or the user’s name.
4) Name Normalizer – The name-normalizer transform cleans or standardizes the spelling of strings coming in from source systems. Its most common use is for names.
5) Random Numeric – The random-numeric transform generates a random number of any length. The transform’s default length is 10 characters, and the maximum allowable value is 450 characters.
6) UUID Generator – The UUID-generator transform creates a universally unique ID (UUID) in the form of a 36-character string.
Under extending transforms, there are two transforms.
1) Reference Transform – The reference transform reuses a transform that has already been written within another transform. Use it when you want to repeat the same logic multiple times within other transforms.
2) Rule Transform – The rule transform allows you to reuse logic that has already been written for a previous use case. You can use it to reuse code contained within a generic rule.
API stands for Application Programming Interface. APIs are mechanisms that enable two software components to communicate with each other using a set of definitions and protocols.
API architecture is usually explained in terms of client and server. The application sending the request is called the client and the application sending the response is called the server.
Fig. – API dataflow
What is REST API:
REST stands for Representational State Transfer. REST APIs are the most popular and flexible APIs found on the web today. The client sends requests to the server as data; the server uses this client input to start internal functions and returns output data back to the client. REST defines a set of functions like GET, POST, PUT, DELETE, etc. that clients can use to access server data. Clients and servers exchange data using HTTP.
The main feature of REST API is statelessness. Statelessness means that servers do not save client data between requests. Client requests to the server are similar to URLs you type in your browser to visit a website. The response from the server is plain data, without the typical graphical rendering of a web page.
POST operations: A POST request appends data to the endpoint; it is a method used to add information, carried in the request body, to the server. It is commonly used for passing sensitive information.
GET operations: A GET request obtains details from the endpoint and does not have any impact on it; a GET request does not update any endpoint data when it is triggered.
PUT operations: A PUT request passes data to the server to create or modify an endpoint. The difference between POST and PUT is that a POST request is not idempotent, while PUT is.
DELETE operations: A DELETE request deletes a resource already present on the server; the DELETE method asks the server to delete the resource mentioned in the endpoint.
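The four operations can be sketched with the JDK’s built-in HTTP client. The requests below are only constructed, not sent, and the endpoint URL is a placeholder:

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class RestMethodsDemo {
    public static void main(String[] args) {
        String base = "https://api.example.com/v1/users"; // placeholder endpoint

        // GET: read a resource, no side effects on the server
        HttpRequest get = HttpRequest.newBuilder(URI.create(base + "/123")).GET().build();

        // POST: create a resource, data carried in the request body
        HttpRequest post = HttpRequest.newBuilder(URI.create(base))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString("{\"name\":\"John\"}"))
                .build();

        // PUT: create or fully replace a resource (idempotent)
        HttpRequest put = HttpRequest.newBuilder(URI.create(base + "/123"))
                .PUT(HttpRequest.BodyPublishers.ofString("{\"name\":\"Jane\"}"))
                .build();

        // DELETE: remove a resource
        HttpRequest delete = HttpRequest.newBuilder(URI.create(base + "/123")).DELETE().build();

        System.out.println(get.method() + " " + post.method() + " "
                + put.method() + " " + delete.method());
    }
}
```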
Let us understand the usage of REST APIs in SailPoint IdentityNow in the following presentation:
Base URL of the SailPoint tenant.
Client ID and secret key for token generation.
Authentication is the process of determining whether someone or something is, in fact, who or what it says it is. Authentication provides access control for systems by checking to see if a user’s credentials match the credentials in a database of authorized users or in a data authentication server. In doing this, authentication assures secure systems, secure processes and enterprise information security.
OAuth 2.0 is the industry-standard protocol for AUTHORIZATION.
OAuth 2.0 is designed primarily as a means of granting access to a set of resources. Put simply, an OAuth 2.0 access token is a string that the OAuth client uses to make requests to the resource server.
JSON Web Token
JSON Web Token (JWT) authentication is a stateless method of securely transmitting information between two parties as a JSON object. It is often used to authenticate and authorize users in web applications and APIs.
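A JWT is simply three Base64url-encoded, dot-separated parts: header, payload, and signature. The toy example below builds and decodes such a token in plain Java; note that the signature here is fake, so this illustrates only the token structure, not signature verification:

```java
import java.util.Base64;

public class JwtStructureDemo {
    public static void main(String[] args) {
        Base64.Encoder enc = Base64.getUrlEncoder().withoutPadding();

        // header.payload.signature, each part Base64url-encoded
        String header = enc.encodeToString("{\"alg\":\"HS256\",\"typ\":\"JWT\"}".getBytes());
        String payload = enc.encodeToString("{\"sub\":\"jdoe\",\"admin\":false}".getBytes());
        String signature = "fake-signature"; // a real token is signed with a secret or key

        String token = header + "." + payload + "." + signature;

        // Any party holding the token can decode (but not verify!) the payload
        String decoded = new String(Base64.getUrlDecoder().decode(token.split("\\.")[1]));
        System.out.println(decoded);
    }
}
```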
Rest API Authorization in IdentityNow
Authorization in system security is the process of giving a user permission to access a specific resource; in other words, it is the act of validating the user’s permission to access a given resource. The term is often used interchangeably with access control or client privilege.
Personal Access Token in IdentityNow
In IdentityNow a personal access token (PAT) is a method of authenticating to an API as a user without providing a username and password.
Now, let us go through a demo on how we can use these REST API’s in SailPoint IdentityNow.
Features of Rest API in IdentityNow
APIs extend IdentityNow functionality and Usability
Advanced configuration such as
Customization of account profiles
Ranking authoritative source priority
System level changes
Interface with other systems – pull data/initiate processes
Identity management (IDM), also known as identity and access management (IAM), ensures that authorized people and only authorized people have access to the technology resources they need to perform their job functions.
Access is managed by the user lifecycle state in IdentityNow. Identity lifecycle states aim to automate and manage the entire digital identity lifecycle process, and access, throughout the organization.
The identity lifecycle is the set of stages of an identity from its creation to its deactivation or deletion. It includes the creation of an account, assignment of the correct groups and permissions, setting and resetting passwords, and, in the end, deactivation or deletion of the account.
Handling unwanted identities in SailPoint increases processing time and reduces the usability of the SailPoint tenant. To reduce processing and speed up the work, the tenant should hold only the limited set of required identities: handling them is easier, and processing a limited number of identities takes less time. We can therefore delete the identities of unwanted and terminated users from SailPoint.
Now, let us have a look at the SailPoint REST APIs used in the identity deletion process. Below is the list of APIs used for identity deletion in SailPoint IdentityNow:
Authentication: This is used to create an access token (Bearer token).
Search – Perform Search (V3 API): This is used to fetch all identities in the “30daysPostTermination” or “terminated” lifecycle states.
Here, we will be using a personal access token (PAT), which is a method of authenticating to an API as a user without providing a username and password.
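A token request against the authentication endpoint might be constructed as follows; the tenant URL, client ID, and client secret are placeholders, and the request is only built here, not sent:

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class TokenRequestDemo {
    public static void main(String[] args) {
        // Placeholder values: substitute your tenant's base URL and PAT credentials
        String baseUrl = "https://tenant.api.identitynow.com";
        String clientId = "CLIENT_ID";
        String clientSecret = "CLIENT_SECRET";

        // IdentityNow issues a bearer token from its OAuth token endpoint
        HttpRequest tokenRequest = HttpRequest.newBuilder(URI.create(
                baseUrl + "/oauth/token"
                        + "?grant_type=client_credentials"
                        + "&client_id=" + clientId
                        + "&client_secret=" + clientSecret))
                .POST(HttpRequest.BodyPublishers.noBody())
                .build();

        System.out.println(tokenRequest.method() + " " + tokenRequest.uri().getPath());
    }
}
```

The returned token is then passed as an `Authorization: Bearer <token>` header on subsequent search and delete calls.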
Prerequisites for Identity deletion:
SailPoint REST APIs.
Client ID and Client Secret.
Now, let us discuss the use case of Identity deletion.
All the identities in the “30daysPostTermination” lifecycle state will be deleted from IdentityNow.
The deleted identities would be re-aggregated in the next aggregation cycle as “uncorrelated accounts” in the target application; hence they would not affect the new-hire creation logic, and the sAMAccountName would remain unique per the requirement and the logic defined.
A PowerShell script will be developed to call the APIs, identify all the identities in the required lifecycle state (i.e., “30daysPostTermination”), and delete the accounts from the HRMS source for all those identities.
Steps Overview as per the script:
Step 1: The PowerShell script first reads the required details from a property file. In the property file we maintain the client ID, client secret, base URL, search query, deletion limit, log file path, and debug values.
Step 2: Using the client ID and client secret, the script calls the authentication API to generate an access (Bearer) token.
Step 3: Next, the Search API executes and fetches the “30daysPostTermination” lifecycle state identities from the SailPoint tenant.
Step 4: One by one, the identities are passed to the Delete API to be deleted from the SailPoint tenant.
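The property-file step can be sketched in Java (the blog’s script itself is PowerShell); the keys and values below are hypothetical, and in the real script the file lives on disk:

```java
import java.io.StringReader;
import java.util.Properties;

public class PropertyFileDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical property-file contents
        String config = String.join("\n",
                "clientId=CLIENT_ID",
                "clientSecret=CLIENT_SECRET",
                "baseUrl=https://tenant.api.identitynow.com",
                "searchQuery=attributes.cloudLifecycleState:30daysPostTermination",
                "deletionLimit=250");

        // Load key=value pairs, then look up individual settings by key
        Properties props = new Properties();
        props.load(new StringReader(config));

        System.out.println(props.getProperty("deletionLimit") + " "
                + props.getProperty("clientId"));
    }
}
```

Keeping credentials and limits in a property file means the script can be re-pointed at another tenant, or throttled, without editing its code.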
Let us understand identity deletion using SailPoint REST APIs, its use cases, and automation of the script via Windows Task Scheduler in the following presentation:
Advantages of Identity Deletion in SailPoint IdentityNow
It will increase the usability of the tenant.
It decreases the aggregation and identity refresh process time.
It speeds up the backend processes and reduces unwanted identity handling.
It reduces the burden on the tenant.
When a user is terminated or leaves the organization, all access is removed and the accounts are disabled. Now, let us go through a demo on how we can achieve identity deletion in SailPoint IdentityNow.
This blog demonstrates how to automate email notifications for newly on-boarded contractors from IdentityNow. This helps in sending automatic email notifications to users and their managers (if required) to reset their first password. It is enabled by running a PowerShell script placed in a shared folder on the IQService server. In the current process, the IT help desk team needs to reach out to each user for his or her first login; with the help of the PowerShell script, this process can be automated by sharing the password reset link automatically.
Use Case Diagram
The above diagram depicts the overall process flow of the use case, with the point of initiation being the IQService server, followed by the SMTP server.
Current IdentityNow templates don’t include an email notification that instantly sends the email ID, password reset link, and user manual to the end user on his or her first day. To achieve this requirement, we have written a PowerShell script and a rule; the diagram above gives an overview of how we achieved it.
From the UI Request Center, HR or the manager requests an AD account, depending on the license to be assigned to the user being on-boarded.
Once the request is completed, the Active Directory account is created and the after-create rule is triggered. This rule invokes the PowerShell script placed on the IQService server, which sends an email via the SMTP server containing the email ID, password reset link, and user manual. The contents of the email can be edited based on the organization’s requirements.
Detailed discussion on the overall Use case, communication flow and the advantages:
Introduction to IQ Service Server
The IQService, also known as the Integration Service, is a native Windows service that allows IdentityNow to participate in a Windows environment and access information that is only accessible via Windows APIs.
It is a lightweight service that must be installed on any supported Windows server that has connectivity to the target systems you want to manage in IdentityNow.
It also secures all incoming and outgoing communications of the server, ensuring the overall security of the solution and data integrity even at crucial stages.
We can create several instances on the same machine as per the system requirements.
This server is primarily responsible for provisioning in AD from IdentityNow.
IQ Service Communication Flow
IdentityNow always pushes tasks to a VA cluster queue, from which the VA pulls requests based on task priority.
Once a request is fetched by the VA, the VA communicates with the IQService for tasks such as aggregation and creating or modifying accounts.
The IQService server communicates with the domain controller using LDAP/LDAPS.
The IQService receives the data from the domain controller and returns it to the VA (outbound traffic).
Finally, the VA gives the updated results to the tenant and requests the next task.
Rule Execution process in IdentityNow
Rules can be executed in two primary places:
Connector rules are rules that are executed in the IdentityNow virtual appliance, and they are usually extensions of the connector itself. These rules are specific to certain connectors, since they are frequently used to carry out complex connector-related tasks. Because they run within the virtual appliance, they cannot access IdentityNow’s data model or collect data from it.
The basic logic required to initiate a PowerShell script resides in the after-create rule, which hands off most of the subsequent events and/or modifications to the PowerShell script itself. Since this script is stored on the client’s servers, the customer can easily modify it as needed; because the code runs outside the IdentityNow platform, the client can update the PowerShell-scripted functionality without requiring SailPoint to review the code.
Demonstration of the use case in IdentityNow
Use of Powershell script in IdentityNow
Scripting languages with object-oriented capabilities, like PowerShell, are popular because of their simplicity and ease of use.
These languages’ native scripts can access request and result objects quickly and effectively.
The Utils.dll class library bundled with the IQService contains all the classes necessary to access the request and result objects. Process environment variables are presented as inputs to the script.
The environment variables contain XML-based data; using Utils.dll, the script creates the appropriate objects.
Once an object is modified, the script should call the object’s xml() function to convert it to XML, then write the XML to the path given in the script’s single argument.
In the event of an error, the script returns a non-zero value and logs the message in the appropriate file in the specified directory.
Before/ After Scripts for IQ Service
The IQ Service allows function customization by allowing the integration of before/after scripts developed using scripting languages such as PowerShell.
Any required tasks that cannot be automated with the current source functionalities can be automated with scripts.
Native before scripts are scripts that are called before the request is processed; native after scripts are scripts that are called after the request is processed.
The following sources support Before/After Scripts for IQ Service:
About Oracle Fusion: Oracle Fusion Cloud HCM is a complete cloud solution that connects every human resource process and every person across the enterprise.
Benefits of Oracle Fusion:
Oracle HCM cloud enables HR leaders by delivering an end-to-end solution to manage every stage of the employee lifecycle.
Human capital management transforms the traditional administrative functions of human resource departments – recruiting, training, payroll, compensation, and performance management into opportunities to drive engagement, productivity, and business value.
It also offers data efficiency by preserving the history of changes made to the attributes of some objects. As a professional user, you can retrieve and edit past and future versions of an object.
Many HCM objects, including person names, assignments, benefits-plans, grades, jobs, locations, payrolls, and positions are date-effective. Date-effective objects include one or more physical records. Each record has effective start and end dates. One record is current and available to transactions. Others are past or take effect in the future. Together, these records constitute the logical record or object instance.
Base URL: The base URL is used to connect to the web service managed system.
Authentication Method: The supported authentication methods are OAuth2, API Token, Basic Authentication, and No/Custom Authentication.
Schema Attribute for Account Enable status: Attribute name and value required to be provided to check the Enable status. For example: status=Active
Username: Username of the resource owner
Password: Password of the resource owner
Grant Type: The grant type can be selected from: Refresh Token, JWT, Client Credentials, Password, and SAML Bearer Assertion.
Client Id: (Optional for JWT and SAML Bearer Assertion) Client Id for OAuth2 authentication.
Client Secret: (Optional for JWT and SAML Bearer Assertion) Client Secret for OAuth2 authentication.
Token URL: URL for generating access token.
Connecting SailPoint to your web services allows you to configure any supported managed system so that SailPoint can read from and write to it using the managed system’s web services. The Web Services connector supports JSON and XML for reads and writes.
Now, let us have a look at the Oracle Fusion REST APIs used in the Integration process. Below is the list of APIs used for integrating Oracle Fusion to SailPoint:
Get all Workers API: This fetches all the worker records as of the specified date. Worker types include employee, contingent worker, and pending worker. By default, the current date is retained.
Get all User Accounts API: This fetches all the user accounts. We may need to manage user accounts for the workers to assign or revoke Fusion roles.
Get all Roles API: This is used to get the roles assigned to the user accounts.
Create Employee API: This is used to create an Employee record in Oracle Fusion.
Update Employee API: This is used to update an Employee record in Oracle Fusion.
Terminate Worker API: This is used to Disable a Worker record in Oracle Fusion.
Role Revoke API: This is used to revoke an assigned role.
Rehire Employee API: This is used to Enable an Employee record in Oracle Fusion.
Here, we will be using Basic Authentication in the integration process, where the username and password of the resource owner are used to form the connection between Oracle Fusion and SailPoint IdentityIQ.
Prerequisites for integrating Oracle Fusion with SailPoint
Oracle Fusion REST APIs
Oracle Fusion instance
Base URL of the Oracle Fusion instance
As we are using Basic Authentication, we need username and password for the APIs
Now, let us discuss the use cases involved in the integration process.
1. Joiner/Create account Process
The joiner process starts with the creation of an account in the truth-source application. That account is then brought into SailPoint through a scheduled aggregation task.
Then, through a scheduled Refresh Identity Cube task, and using a configured business role and assignment rule, two conditions are checked:
Identity or Account in Truth Source is Active
Business Unit is XXXX
If these two conditions are satisfied, the account creation process for the Oracle Fusion application is triggered. As part of the joiner/account-creation process, basic access is provisioned to the newly created account automatically from the Fusion end.
2. Leaver/Disable Account Process:
The leaver process starts when the last-working-day attribute of an account is populated in the truth-source application.
That last working day is then updated in SailPoint through a scheduled aggregation task. Then, through a scheduled Refresh Identity Cube task, it is checked whether the last working day equals today’s date; in other words, whether the last working day has been reached.
If the last working day has been reached, the account disable process for the Oracle Fusion application is triggered.
As part of the termination process, all the roles that the account has are de-provisioned.
3. Rehire/Enable Account Process:
The rehire process starts by enabling the account in the truth-source application. That change is then updated in SailPoint through a scheduled aggregation task.
Then, through a scheduled Refresh Identity Cube task, the account enable process for the Oracle Fusion application is triggered.
As part of the rehire/enable-account process, a new assignment is created for the account, with the AssignmentName and AssignmentNumber appended with “R”, and basic access is assigned to the enabled account, as we saw in the create-account use case.
Advantages of Integrating Oracle Fusion with Sailpoint IdentityIQ:
In this integration, we have automated the creation of accounts in Oracle Fusion, whereas before the integration, the Oracle Fusion team had to create accounts in Oracle Fusion manually for new joiners.
We have also automated the disable and role-revoke operations: when a user’s last working date is reached, SailPoint disables the leaver’s Oracle Fusion account and revokes the Oracle Fusion roles assigned to that leaver. Before the integration, the Oracle Fusion team had to disable accounts and revoke roles in Oracle Fusion manually for leavers.
We have also automated the update operation. From SailPoint, we update attributes such as LegalEmployerName, Department, Job, Grade, FirstName, LastName, DisplayName, etc. Before the integration, the Oracle Fusion team had to handle these updates manually, which was a tedious task.
We have also automated the rehire process. When a user is rehired, the user’s Oracle Fusion account is enabled and a new work relationship is created from SailPoint.
When a user is rehired, only basic access is given, not the access the user had before the leaver process was initiated. A detailed discussion of the Oracle Fusion introduction, the Oracle Fusion REST APIs, the use cases, and the integration approach follows in this video:
Now, let us have a demo on the integration of Oracle Fusion with SailPoint IdentityIQ in the following video:
Before going through the Integration of Genie with SailPoint IdentityIQ, let us understand what a ticket and a ticketing system is.
What is a ticket?
A Ticket is a special record that represents an incident, request or event that requires action from the IT department. It contains the necessary details of the incident, request or event.
A ticketing system is a software platform designed to manage and track customer support requests. It streamlines the process of resolving customer issues, making it easier for businesses to provide fast and effective support.
Benefits of embracing a ticketing system in your organization:
Control a high volume of requests from a centralized place – Organizations can track and manage inbound support requests with the help of a good ticketing system. Executives can use the solution to manage support cases more efficiently while still attending to all client issues.
Combine interactions into one thread – When offering customer care via a variety of channels, your team can use a ticketing system to combine customer-related conversations into a single thread.
Process automation and workload management – Ticketing systems provide several opportunities for automation. For example, the software gathers assistance requests from several sources and automates the creation of tickets; regardless of the help channel clients select, tickets are created automatically whenever they submit requests.
Adequate team collaboration – Ticketing systems provide a platform for customer-service representatives to collaborate, for example by assigning tickets to senior associates for P0 escalations.
Genie is a ticketing tool that supports different types of tickets, such as Work Order tickets and Incident tickets. SailPoint handles the creation of the following tickets in Genie:
Onboarding ticket
Exit or Off-boarding ticket
Provision failure ticket
Among these, the Onboarding and Off-boarding tickets are Work Order tickets, and the Provision failure ticket is an Incident ticket.
Integration with SailPoint IdentityIQ:
Genie is integrated with SailPoint using REST APIs.
Below is the high-level architecture of Genie – SailPoint Integration
Now, let us have a look at the APIs used in the integration process. Below is the list of APIs used for creating tickets and getting specific ticket details:
Generate Authentication key: We use this API to generate an Authentication key. The generated key expires after a few minutes.
Create Work Order ticket: We use this API to create a Work Order ticket in Genie.
Get specific Work Order ticket: We use this API to get the details of a specific Work Order ticket, i.e., to retrieve the Work Order ID of the submitted request.
Create Incident ticket: We use this API to create an Incident ticket in Genie.
Get specific Incident ticket: We use this API to get the details of a specific Incident ticket, i.e., to retrieve the Incident number of the submitted request.
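The flow above can be sketched in Java. This is a minimal illustration only: the endpoint path, field names, and header scheme are assumptions, since the actual Genie API contract comes from its documentation. The short-lived Authentication key is passed per request because it expires after a few minutes.

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class GenieTicketClient {

    // Hypothetical base URL; the real value comes from the Genie API details.
    static final String BASE_URL = "https://genie.example.com/api";

    // Builds the Work Order payload. Field names are illustrative
    // assumptions, not the actual Genie schema.
    public static String buildWorkOrderPayload(String firstName, String lastName,
                                               String employeeId, String summary) {
        return String.format(
            "{\"firstName\":\"%s\",\"lastName\":\"%s\","
          + "\"employeeId\":\"%s\",\"summary\":\"%s\"}",
            firstName, lastName, employeeId, summary);
    }

    // Prepares the create call, attaching the freshly generated
    // Authentication key; the key is regenerated per call since it
    // expires after a few minutes.
    public static HttpRequest buildCreateRequest(String authKey, String payload) {
        return HttpRequest.newBuilder()
            .uri(URI.create(BASE_URL + "/workorders"))
            .header("Authorization", "Bearer " + authKey)
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(payload))
            .build();
    }

    public static void main(String[] args) {
        String payload = buildWorkOrderPayload("Jane", "Doe", "E1001",
            "Onboarding - New Joiner");
        System.out.println(payload);
        // An actual send would use java.net.http.HttpClient:
        // HttpClient.newHttpClient().send(buildCreateRequest(key, payload), ...);
    }
}
```

The "Get specific" APIs follow the same pattern with a GET request against the ticket's identifier.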
Prerequisites for integrating Genie with SailPoint:
Following are the prerequisites essential for the integration of Genie with SailPoint IdentityIQ:
Genie API details for creating tickets
Base URL and Authentication details of the APIs
Access to Genie Test Instance
Database table to store the ticket details
Connection details of the Database
Now, let us discuss the use cases involved in the integration process.
Onboarding Ticket Creation:
An Onboarding Work Order ticket will be created in Genie from SailPoint when a new joiner joins the organization and his/her account is created in the Truth Source and Active Directory applications.
The ticket contains details of the New Joiner such as Start Date, First Name, Last Name, Employee ID, Employment Type, AD ID, Domain ID, Contact Information, etc.
Once the ticket gets created in Genie successfully, the details of the created ticket such as Work Order ID, Creation Date, Request ID and ticket summary will be added to the database table.
If the ticket creation fails for some reason, then an Email notification containing the New Joiner’s details will be triggered to the respective stakeholders.
Exit or Off-boarding Ticket Creation
An Exit or Off-boarding Work Order ticket will be created in Genie when HR assigns the user’s Last Working Date in the Truth Source and that date is 7 days away from today’s date.
The ticket contains details of the End dated user such as Last Working Date, First Name, Last Name, Employee ID, Employment Type, AD ID, Domain ID, Contact Information, etc.
Success case: Once the ticket is created in Genie successfully, the details of the created ticket, such as Work Order ID, Creation Date, Request ID, and ticket summary, are added to the database table.
Failure case: If the ticket creation fails for some reason, an Email notification containing the end-dated user’s details is triggered to the respective stakeholders.
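The 7-day trigger condition is a simple date comparison. A minimal sketch, assuming the Last Working Date arrives from the Truth Source as a date value:

```java
import java.time.LocalDate;
import java.time.temporal.ChronoUnit;

public class ExitTicketCheck {
    // Returns true when the Last Working Date is exactly 7 days after
    // the given "today", i.e., when the Exit ticket should be raised.
    public static boolean isSevenDaysAway(LocalDate lastWorkingDate, LocalDate today) {
        return ChronoUnit.DAYS.between(today, lastWorkingDate) == 7;
    }

    public static void main(String[] args) {
        LocalDate today = LocalDate.of(2023, 10, 1);
        // Exactly one week out: ticket gets created.
        System.out.println(isSevenDaysAway(LocalDate.of(2023, 10, 8), today));  // true
        // Further out: no ticket yet.
        System.out.println(isSevenDaysAway(LocalDate.of(2023, 10, 20), today)); // false
    }
}
```

Pinning the check to exactly 7 days (rather than "within 7 days") keeps a daily task from raising duplicate tickets for the same user.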
Provisioning Failure (Incident Ticket)
A Provisioning Failure (Incident) ticket per application will be created if provisioning operations failed in SailPoint in the last 24 hours. In this use case, the applications under consideration are Active Directory, G-Suite, and OpenLDAP, and the operations under consideration are Create, Enable, Disable, and Delete.
For example, if the Disable operation failed in Active Directory for an account, then an Incident ticket will be created in Genie containing details such as the type of operation that failed and the account’s Display Name, Employee ID, email, sAMAccountName, userPrincipalName, distinguishedName, etc.
Once the ticket gets created in Genie successfully, the details of the created ticket such as Incident Number, Creation Date, Request ID and ticket summary will be added to the database table.
If the ticket creation fails for some reason, then an Email notification will be triggered to the respective stakeholders specifying the type of operation that failed, the application in which provisioning failed, and the account’s Display Name, Employee ID, email, etc.
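The "one ticket per application" rule amounts to grouping the last 24 hours of failures by application and raising a ticket per non-empty group. A sketch under that assumption (the Failure record is an illustrative shape, not SailPoint's actual provisioning-result type):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class ProvisioningFailureGrouper {
    // Illustrative failure record; field names are assumptions.
    record Failure(String application, String operation, String employeeId) {}

    // Groups last-24-hour failures by application so that one Incident
    // ticket is raised per application. Applications with no failures
    // never appear in the map, so no ticket is created for them.
    public static Map<String, List<Failure>> groupByApplication(List<Failure> failures) {
        return failures.stream()
            .collect(Collectors.groupingBy(Failure::application));
    }

    public static void main(String[] args) {
        List<Failure> failures = List.of(
            new Failure("Active Directory", "Disable", "E1001"),
            new Failure("Active Directory", "Create", "E1002"),
            new Failure("G-Suite", "Delete", "E1003"));
        // Two tickets result: one for Active Directory (two accounts),
        // one for G-Suite; OpenLDAP had no failures, so no ticket.
        System.out.println(groupByApplication(failures).keySet());
    }
}
```

Each group's list then becomes the body of that application's Incident ticket, listing every affected account.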
Now, let us understand the advantages of integrating a ticketing system with SailPoint IdentityIQ when compared with a traditional ITSM.
Advantages of Integrating Genie with SailPoint IdentityIQ:
In this integration, we have automated the creation of onboarding tickets for New Joiners, whereas in traditional ITSMs, the ticket must be created manually by the end user.
While creating tickets in Genie, SailPoint uses data coming from the Truth Source, which is updated by the HR team. In traditional ITSMs, there is every chance that the end user does not provide the mandatory details required by the IT team to act on the ticket, or mistakenly enters incorrect details. The IT team might then need to contact the HR team or the end user, which is a time-consuming process.
In this integration, we use the user’s details as per the Truth Source application while creating the ticket, so the IT team does not need to wait for communication from the concerned team. In traditional ITSMs, when a ticket is created, the IT team must obtain certain required details from the concerned team manually.
The Exit ticket is created automatically when the end-dated user is one week away from the Last Working Date, which gives the IT team ample time to take necessary actions such as collecting assets and disabling access.
In this integration, we are storing the ticket details such as Work Order ID, Incident Number, Creation Date, Request ID and Summary in the database table for all the tickets created from SailPoint. So, we can use this table’s data for auditing purposes which ensures centralized governance.
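A minimal sketch of the audit insert, assuming a table and column names chosen here for illustration (the actual schema is whatever was defined in the prerequisites):

```java
public class TicketAudit {
    // Table and column names are illustrative assumptions; substitute
    // the actual audit table defined for the integration.
    static final String INSERT_SQL =
        "INSERT INTO genie_ticket_audit "
      + "(ticket_ref, creation_date, request_id, summary) "
      + "VALUES (?, ?, ?, ?)";

    public static void main(String[] args) {
        // With a JDBC connection, each created ticket would be stored as:
        // try (PreparedStatement ps = conn.prepareStatement(INSERT_SQL)) {
        //     ps.setString(1, workOrderIdOrIncidentNumber);
        //     ps.setDate(2, creationDate);
        //     ps.setString(3, requestId);
        //     ps.setString(4, summary);
        //     ps.executeUpdate();
        // }
        System.out.println(INSERT_SQL);
    }
}
```

Since the same table holds Work Order IDs and Incident Numbers, a single reference column keeps audit queries uniform across ticket types.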
The Incident tickets are created every day (one ticket per application) from SailPoint in Genie for applications such as Active Directory, G-Suite, and OpenLDAP, and for operations such as Create, Enable, Disable, and Delete, whenever provisioning fails in these applications. Note that if no provisioning operations failed in an application on a given day, no ticket is created for that application for that day. Otherwise, one ticket per application is created, where each ticket contains the information of all the accounts for which provisioning failed. This is one of the critical advantages of integrating Genie with SailPoint IIQ; in traditional ITSMs, this needs to be done manually, which is a tedious task.
The detailed discussion of APIs, use cases and integration approach is discussed in the following video:
Now, let us have a demo on the integration of Genie with SailPoint IdentityIQ in the following video:
SuccessFactors is an SAP product suite that provides cloud-based solutions to manage business alignment, people performance, recruitment, employee central, and learning activities for organizations of all sizes.
SAP SuccessFactors is a cloud-based HCM solution designed on the Software as a Service (SaaS) cloud model. Software as a Service, also known as an on-demand software solution, is software that is licensed on a subscription basis and centrally hosted.
SAP SuccessFactors integration with SailPoint IdentityNow Blog
At least one Virtual Appliance needs to be configured to enable communication between the IdentityNow cloud and the SAP source; however, SailPoint recommends having two virtual appliances in a cluster.
Permissions required:
• Test connection: To test the connectivity from the IDN cloud to the SAP SuccessFactors source.
• Account Aggregation: To aggregate account details to the IDN cloud.
• To perform connection tasks, the user must have the following permissions:
a. SFAPI User Login
b. Employee Central HRIS SOAP API
• For example, the SuccessFactors source aggregates employee data from the SuccessFactors managed system based on the Picklist configuration. A picklist is a configurable set of options, or a selection list, used to populate a data input field with one of a number of predefined values in SuccessFactors.
Next, for aggregation, we require the following permissions:
Manage User: Employee Export
Metadata Framework: Admin access to MDF OData API
Manage System Properties: Picklist Management and Picklists Mappings Setup
Employee Central API: Employee Central Foundation OData API (read-only), Employee Central HRIS OData API (read-only), Manage Role-Based Permission Access
This type of trigger is used to give the custom application an ability to answer back to a trigger event sent by the trigger service. This integration is bi-directional. A response from the custom application is required for a trigger invocation to be considered complete and successful.
This type of trigger is used to notify the custom application of a particular occurrence of an event. This integration is uni-directional. Trigger invocation is successful the moment the trigger service notifies the external application, and it does not require a response from the custom application.
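The two trigger styles described above differ only in whether the custom application must answer back. A minimal sketch of the contrast, with names chosen here for illustration rather than taken from SailPoint's actual trigger-service API:

```java
public class TriggerStyles {

    public interface RequestResponseHandler {
        // Bi-directional: the invocation completes only when the
        // custom application returns a response to the trigger service.
        String handle(String eventPayload);
    }

    public interface FireAndForgetHandler {
        // Uni-directional: delivery of the notification is enough;
        // no response is expected from the custom application.
        void notify(String eventPayload);
    }

    public static void main(String[] args) {
        RequestResponseHandler approver = payload -> "{\"approved\": true}";
        FireAndForgetHandler auditLogger = payload ->
            System.out.println("event received: " + payload);

        // The trigger service waits on this return value...
        System.out.println(approver.handle("{\"event\":\"access-request\"}"));
        // ...but considers this invocation successful on delivery alone.
        auditLogger.notify("{\"event\":\"identity-created\"}");
    }
}
```

The request-response style suits approval or veto decisions, while fire-and-forget suits notifications such as audit logging.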
For example, a user from IT department is able to see Jira, Bitbucket, Administrative / Privileged access across applications like Active Directory, ServiceNow and various other applications in the request center. For a user from Marketing department, the above access is not relevant and with segments, we are abstracting those items. The relevant access for marketing users would be Salesforce CRM and the same will be visible for the users.
In the presentation below, we will discuss the segments feature in detail:
In the below video, we will provide a practical demonstration of how to configure segments and how it affects the end user’s perspective, using a practical use case:
Limit end user visibility for applicable access
Using segments, only the access that is applicable and relevant to a subset of identities is displayed. This helps avoid confusion in finding the right role/access profile while making an access request.
Reduce incorrect access requests
End users will not make incorrect access requests, because the only access items they see in the request center are already fine-tuned and configured according to organizational requirements.
Limit accidental provisioning
If presented with a lot of access items, users might request something that they don’t need. This can be avoided by creating segments and assigning users to them based on certain criteria.
Reduce cost of software licensing
Due to accidental access provisioning, users might consume additional licenses for access that they do not need, which is a major cost risk. This can be avoided by configuring segments.