Free PAS-C01 Practice Test – 50 Real Exam Questions to Boost Your Confidence
Preparing for the PAS-C01 exam? Start with our free PAS-C01 practice test – a set of 50 high-quality, exam-style questions crafted to help you assess your knowledge and improve your chances of passing on the first try.
Taking a free PAS-C01 practice test is one of the smartest ways to:
- Get familiar with the real exam format and question types
- Evaluate your strengths and spot knowledge gaps
- Gain the confidence you need to succeed on exam day
Below, you will find 50 free PAS-C01 practice questions to help you prepare for the exam. These questions are designed to reflect the real exam structure and difficulty level. You can click on each question to explore the details.
A company is running SAP ERP Central Component (SAP ECC) on SAP HANA on premises. The current landscape runs on four application servers that use an SAP HANA database. The company is migrating this environment to the AWS Cloud. The cloud environment must minimize downtime during business operations and must not allow inbound access from the internet. Which solution will meet these requirements?
A. Design a Multi-AZ solution. In each Availability Zone, create a private subnet where Amazon EC2 instances that host the SAP HANA database and the application servers will reside. Use EC2 instances that are the same size to host the primary database and the secondary database. Use SAP HANA system replication in synchronous replication mode.
B. Design a Single-AZ solution. Create a private subnet where a single SAP HANA database and application servers will run on Amazon EC2 instances.
C. Design a Multi-AZ solution. In each Availability Zone, create a private subnet where Amazon EC2 instances that host the SAP HANA database and the application servers will reside. Shut down the EC2 instance that runs the secondary database node. Turn on this EC2 instance only when the primary database node or the primary database node’s underlying EC2 instance is unavailable.
D. Design a Single-AZ solution. Create two public subnets where Amazon EC2 instances that host the SAP HANA database and the application servers will reside as two separate instances. Use EC2 instances that are the same size to host the primary database and the secondary database. Use SAP HANA system replication in synchronous replication mode.
A company wants to migrate a native SAP HANA database to AWS. The database ingests large amounts of data every month, and the size of the database is growing rapidly. The company needs to store data for 10 years to meet a regulatory requirement. The company uses data from the last 2 years frequently in several reports. This recent data is critical and must be accessed quickly. The data that is 3-6 years old is used a few times a year and can be accessed in a longer time frame. The data that is more than 6 years old is rarely used and also can be accessed in a longer time frame. Which combination of steps will meet these requirements? (Choose three.)
A. Keep the frequently accessed data from the last 2 years in a hot tier on an SAP HANA certified Amazon EC2 instance.
B. Move the frequently accessed data from the last 2 years to SAP Information Life Cycle Management (ILM) with SAP IQ.
C. Move the less frequently accessed data that is 3-6 years old to a warm tier on Amazon Elastic File System (Amazon EFS) by using SAP HANA dynamic tiering.
D. Move the less frequently accessed data that is 3-6 years old to a warm tier on Amazon Elastic File System (Amazon EFS) by using data aging.
E. Move the rarely accessed data that is more than 6 years old to a cold tier on Amazon S3 by using SAP Data Hub.
F. Move the rarely accessed data that is more than 6 years old to a cold tier on SAP BW Near Line Storage (NLS) with Apache Hadoop.
A company is running its on-premises SAP ERP Central Component (SAP ECC) production system on an Oracle database. The company needs to migrate the system to AWS and change the database to SAP HANA on AWS. The system must be highly available. The company also needs a failover system to be available in a different AWS Region to support disaster recovery (DR). The DR solution must meet an RTO of 4 hours and an RPO of 30 minutes. The sizing estimate for the SAP HANA database on AWS is 4 TB. Which combination of steps should the company take to meet these requirements? (Choose two.)
A. Deploy the production system and the DR system in two Availability Zones in the same Region.
B. Deploy the production system across two Availability Zones in one Region. Deploy the DR system in a third Availability Zone in the same Region.
C. Deploy the production system across two Availability Zones in the primary Region. Deploy the DR system in a single Availability Zone in another Region.
D. Create an Amazon Elastic File System (Amazon EFS) file system in the primary Region for the SAP global file system. Deploy a second EFS file system in the DR Region. Configure EFS replication between the file systems.
E. Set up Amazon Elastic Block Store (Amazon EBS) to store the shared file system data. Configure AWS Backup for DR.
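For the cross-Region SAP global file system described in option D, Amazon EFS replication can be set up with a single API call. Below is a minimal boto3 sketch under assumed values: the file system ID, source Region, and DR Region are placeholders.

```python
import boto3

efs = boto3.client("efs", region_name="us-east-1")  # primary Region (assumed)

# Replicate the SAP global file system (e.g. /sapmnt) to a DR Region.
# "fs-0123456789abcdef0" is a placeholder file system ID.
response = efs.create_replication_configuration(
    SourceFileSystemId="fs-0123456789abcdef0",
    Destinations=[
        {"Region": "us-west-2"},  # DR Region (assumed); EFS creates the replica file system
    ],
)
print(response["Destinations"][0]["Status"])  # e.g. ENABLING
```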
A company has grown rapidly in a short period of time. This growth has led to an increase in the volume of data, the performance requirements for storage, and the memory and vCPU requirements for the company's SAP HANA database that runs on AWS. The SAP HANA database is a scale-up installation. Because of the increased requirements, the company plans to change the Amazon EC2 instance type to a virtual EC2 High Memory instance and plans to change the Amazon Elastic Block Store (Amazon EBS) volume type to a higher performance volume type for the SAP HANA database. The EC2 instance is a current-generation instance, both before and after the change. Additionally, the EC2 instance and the EBS volume meet all the prerequisites for instance type change and EBS volume type change. An SAP basis administrator must advise the company about whether these changes will require downtime for the SAP system. Which guidance should the SAP basis administrator provide to the company?
A. The change in EC2 instance type does not require SAP system downtime, but the change in EBS volume type requires SAP system downtime.
B. The change in EC2 instance type requires SAP system downtime, but the change in EBS volume type does not require SAP system downtime.
C. Neither the change in EC2 instance type nor the change in EBS volume type requires SAP system downtime.
D. Both the change in EC2 instance type and the change in EBS volume type require SAP system downtime.
A company is running its SAP S/4HANA production system on AWS. The system is 5 TB in size and has a high performance and IOPS demand for the SAP HANA data storage. The company is using Amazon Elastic Block Store (Amazon EBS) General Purpose SSD (gp2) storage with burstable IOPS to meet this demand. An SAP solutions architect needs to review the current storage layout and recommend a more cost-effective solution without compromising storage performance. What should the SAP solutions architect recommend to meet these requirements?
A. Switch from burstable IOPS to allocated IOPS for the gp2 storage.
B. Replace all the gp2 storage with Provisioned IOPS SSD (io2) storage.
C. Replace all the gp2 storage with gp3 storage. Configure the required IOPS.
D. Replace all the gp2 storage with gp3 storage at baseline IOPS.
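For reference, converting an existing gp2 volume to gp3 with explicitly provisioned IOPS and throughput (as in option C) is an online operation. The sketch below uses a hypothetical volume ID and example performance values; size your own figures from the SAP HANA storage KPIs.

```python
import boto3

ec2 = boto3.client("ec2")

# Convert an SAP HANA data volume from gp2 to gp3 and provision IOPS/throughput.
# Volume ID and performance figures are placeholders.
ec2.modify_volume(
    VolumeId="vol-0123456789abcdef0",
    VolumeType="gp3",
    Iops=16000,        # gp3 supports up to 16,000 IOPS per volume
    Throughput=1000,   # MiB/s; gp3 supports up to 1,000 MiB/s per volume
)
# The modification proceeds while the volume stays attached and in use.
```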
A company is hosting an SAP HANA database on AWS. The company is automating operational tasks, including backup and system refreshes. The company wants to use SAP HANA Studio to perform a data backup of an SAP HANA tenant database to a Backint interface. The SAP HANA database is running in multi-tenant database container (MDC) mode. The company receives an error message during an attempt to perform the backup. What should an SAP solutions architect do to resolve this issue?
A. Set the execute permission for AWS Backint agent binary aws-backint-agent and for the launcher script aws-backint-agent-launcher.sh in the installation directory.
B. Verify the installation steps. Create symbolic links (symlinks).
C. Ensure that the catalog_backup_using_backint SAP HANA parameter is set to true. Ensure that the data_backup_parameter_file and log_backup_parameter_file parameters have the correct path location in the global.ini file.
D. Add the SAP HANA system to SAP HANA Studio. Select multiple container mode, and then try to initiate the backup again.
A company is running an SAP HANA database on AWS. The company is running AWS Backint Agent for SAP HANA (AWS Backint agent) on an Amazon EC2 instance. AWS Backint agent is configured to back up to an Amazon S3 bucket. The backups are failing with an AccessDenied error in the AWS Backint agent log file. What should an SAP basis administrator do to resolve this error?
A. Assign execute permissions at the operating system level for the AWS Backint agent binary and for AWS Backint agent.
B. Assign an IAM role to an EC2 instance. Attach a policy to the IAM role to grant access to the target S3 bucket.
C. Assign the correct Region ID for the S3BucketAwsRegion parameter in AWS Backint agent for the SAP HANA configuration file.
D. Assign the value for the EnableTagging parameter in AWS Backint agent for the SAP HANA configuration file.
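A minimal sketch of option B, assuming the instance already has an IAM role attached: the inline policy below grants that role access to a hypothetical backup bucket. The exact set of S3 actions AWS Backint agent requires may differ; treat this as an illustration, not the definitive policy.

```python
import json
import boto3

iam = boto3.client("iam")

role_name = "sap-hana-backint-role"       # hypothetical role used by the EC2 instance profile
bucket = "example-sap-hana-backups"       # hypothetical backup bucket

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
            "Resource": f"arn:aws:s3:::{bucket}",
        },
        {
            "Effect": "Allow",
            "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject"],
            "Resource": f"arn:aws:s3:::{bucket}/*",
        },
    ],
}

# Attach the policy inline to the role so the agent can write backups to the bucket.
iam.put_role_policy(
    RoleName=role_name,
    PolicyName="backint-s3-access",
    PolicyDocument=json.dumps(policy),
)
```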
A global enterprise is running SAP ERP Central Component (SAP ECC) workloads on Oracle in an on-premises environment. The enterprise plans to migrate to SAP S/4HANA on AWS. The enterprise recently acquired two other companies. One of the acquired companies is running SAP ECC on Oracle as its ERP system. The other acquired company is running an ERP system that is not from SAP. The enterprise wants to consolidate the three ERP systems into one ERP system on SAP S/4HANA on AWS. Not all the data from the acquired companies needs to be migrated to the final ERP system. The enterprise needs to complete this migration with a solution that minimizes cost and maximizes operational efficiency. Which solution will meet these requirements?
A. Perform a lift-and-shift migration of all the systems to AWS. Migrate the ERP system that is not from SAP to SAP ECC. Convert all three systems to SAP S/4HANA by using SAP Software Update Manager (SUM) Database Migration Option (DMO). Consolidate all three SAP S/4HANA systems into a final SAP S/4HANA system. Decommission the other systems.
B. Perform a lift-and-shift migration of all the systems to AWS. Migrate the enterprise’s initial system to SAP HANA, and then perform a conversion to SAP S/4HANA. Consolidate the two systems from the acquired companies with this SAP S/4HANA system by using the Selective Data Transition approach with SAP Data Management and Landscape Transformation (DMLT).
C. Use SAP Software Update Manager (SUM) Database Migration Option (DMO) with System Move to re-architect the enterprise’s initial system to SAP S/4HANA and to change the platform to AWS. Consolidate the two systems from the acquired companies with this SAP S/4HANA system by using the Selective Data Transition approach with SAP Data Management and Landscape Transformation (DMLT).
D. Use SAP Software Update Manager (SUM) Database Migration Option (DMO) with System Move to re-architect all the systems to SAP S/4HANA and to change the platform to AWS. Consolidate all three SAP S/4HANA systems into a final SAP S/4HANA system. Decommission the other systems.
A company has deployed SAP workloads on AWS. The company's SAP applications use an IBM Db2 database and an SAP HANA database. An SAP solutions architect needs to create a solution to back up the company's databases. Which solution will meet these requirements MOST cost-effectively?
A. Use an Amazon Elastic Block Store (Amazon EBS) volume to store backups for the databases. Run a periodic script to move the backups to Amazon S3 and to delete the backups from the EBS volume.
B. Use AWS Backint Agent for SAP HANA to move the backups for the databases directly to Amazon S3.
C. Use an Amazon Elastic Block Store (Amazon EBS) volume to store backups for the Db2 database. Run periodic scripts to move the backups to Amazon S3 and to delete the backups from the EBS volume. For the SAP HANA database, use AWS Backint Agent for SAP HANA to move the backups directly to Amazon S3.
D. Use Amazon Elastic File System (Amazon EFS) to store backups for the databases.
An SAP engineer has deployed an SAP S/4HANA system on an Amazon EC2 instance that runs Linux. The SAP license key has been installed. After a while, the newly installed SAP instance presents an error that indicates that the SAP license key is not valid because the SAP system’s hardware key changed. There have been no changes to the EC2 instance or its configuration. Which solution will permanently resolve this issue?
A. Perform SAP kernel patching.
B. Apply a new SAP license that uses a new hardware key. Install the new key.
C. Set the SLIC_HW_VERSION Linux environment variable.
D. Reboot the EC2 instance.
A company plans to migrate its SAP NetWeaver environment from its on-premises data center to AWS. An SAP solutions architect needs to deploy the AWS resources for an SAP S/4HANA-based system in a Multi-AZ configuration without manually identifying and provisioning individual AWS resources. The SAP solutions architect's task includes the sizing, configuration, and deployment of the SAP S/4HANA system. What is the QUICKEST way to provision the SAP S/4HANA landscape on AWS to meet these requirements?
A. Use the SAP HANA Quick Start reference deployment.
B. Use AWS Launch Wizard for SAP.
C. Create AWS CloudFormation templates to automate the deployment.
D. Manually deploy SAP HANA on AWS.
A company wants to migrate its SAP environments to AWS. The SAP environments include SAP ERP Central Component (SAP ECC), SAP Business Warehouse (SAP BW), and SAP Process Integration (SAP PI) systems. As part of the migration, the company wants to do a system transformation to SAP S/4HANA. The company wants to implement SAP Fiori by using an SAP Gateway hub deployment and an internet-facing SAP Web Dispatcher for this SAP S/4HANA system only. Employees around the world will access the SAP Fiori launchpad. The company needs to allow access to only the URLs that are required for running SAP Fiori. How should an SAP security engineer design the security architecture to meet these requirements?
A. Deploy the SAP Web Dispatcher in a public subnet. Allow access to only the IP addresses that employees use to access the SAP Fiori server.
B. Deploy the SAP Web Dispatcher in a private subnet. Allow access to only the ports that are required for running SAP Fiori.
C. Deploy the SAP Web Dispatcher in a public subnet. Allow access to only the paths that are required for running SAP Fiori.
D. Deploy the SAP Web Dispatcher in a private subnet. Allow access to only the SAP S/4HANA system that serves as the SAP Fiori backend system for the SAP Gateway hub.
A company is planning to implement a new SAP workload on SUSE Linux Enterprise Server on AWS. The company needs to use AWS Key Management Service (AWS KMS) to encrypt every file at rest. The company also requires that its production SAP workloads and non-production SAP workloads are separated into different AWS accounts. The production account and the non-production account share a common SAP transport directory, /usr/sap/trans. The two accounts are connected by VPC peering. What should the company do to achieve the data encryption at rest for the new SAP workload?
A. Create an asymmetric KMS customer managed key in the production account. Create Amazon Elastic Block Store (Amazon EBS) and Amazon Elastic File System (Amazon EFS) storage for all root and SAP data. Implement encryption that uses the KMS key. Share the EFS file system from the production account with the non-production account. Import the KMS key into the non-production account to allow the production systems to access the SAP transport directory.
B. Create a symmetric KMS customer managed key in the production account. Create Amazon Elastic Block Store (Amazon EBS) and Amazon Elastic File System (Amazon EFS) storage for all root and SAP data. Implement encryption that uses the KMS key. Share the EFS file system from the production account with the non-production account. Create an IAM role in the non-production account and a key policy for the KMS key in the production account to allow the non-production systems to access the SAP transport directory.
C. Create a symmetric KMS customer managed key in the production account. Create Amazon Elastic Block Store (Amazon EBS) and Amazon Elastic File System (Amazon EFS) storage for all root and SAP data. Implement encryption that uses the KMS key. Share the EFS file system from the production account with the non-production account. Create an IAM role in the production account and a key policy for the KMS key in the production account to allow the non-production systems to access the SAP transport directory.
D. Create an asymmetric KMS customer managed key in the production account. Create Amazon Elastic Block Store (Amazon EBS) and Amazon Elastic File System (Amazon EFS) storage for all root and SAP data. Implement encryption that uses the KMS key. Share the EFS file system from the production account with the non-production account. Create an IAM role in the non-production account and a key policy for the KMS key in the production account to allow the non-production systems to access the SAP transport directory.
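The cross-account piece of options B, C, and D hinges on the KMS key policy in the production account naming a principal from the non-production account. A hedged sketch with made-up account IDs and role names follows; the real policy would typically also allow the services that mount the encrypted EFS file system.

```python
import json
import boto3

kms = boto3.client("kms")

key_id = "1234abcd-12ab-34cd-56ef-1234567890ab"                          # placeholder key ID
nonprod_role = "arn:aws:iam::222222222222:role/sap-transport-access"     # hypothetical role

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {   # Keep full control with the production account itself.
            "Sid": "EnableRootPermissions",
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::111111111111:root"},
            "Action": "kms:*",
            "Resource": "*",
        },
        {   # Let the non-production role use the key for the shared transport file system.
            "Sid": "AllowNonProdUse",
            "Effect": "Allow",
            "Principal": {"AWS": nonprod_role},
            "Action": [
                "kms:Encrypt",
                "kms:Decrypt",
                "kms:ReEncrypt*",
                "kms:GenerateDataKey*",
                "kms:DescribeKey",
            ],
            "Resource": "*",
        },
    ],
}

# A KMS key has exactly one key policy, and its name must be "default".
kms.put_key_policy(KeyId=key_id, PolicyName="default", Policy=json.dumps(policy))
```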
A global retail company is running its SAP S/4HANA workload on AWS. The company's business has grown in the past few years, and user activity has generated a significant amount of data in the SAP S/4HANA system. The company wants to expand into new geographies. Before the company finalizes the expansion plan, the company wants to perform analytics on the historical data from the past few years. The company also wants to generate sales forecasts for potential expansion locations. An SAP solutions architect must implement a solution to extract the data from SAP S/4HANA into Amazon S3. The solution also must perform the required analytics and forecasting tasks. Which solution will meet these requirements with the LEAST custom development effort?
A. Use AWS AppSync to extract the data from SAP S/4HANA and to store the data in Amazon S3. Use AWS Glue to perform analytics. Use Amazon Forecast for sales forecasts.
B. Use the SAP Landscape Transformation (LT) Replication Server SDK to extract the data, to integrate the data with SAP Data Services, and to store the data in Amazon S3. Use Amazon Athena to perform analytics. Use Amazon Forecast for sales forecasts.
C. Use Amazon AppFlow to extract the data from SAP S/4HANA and to store the data in Amazon S3. Use Amazon QuickSight to perform analytics. Use Amazon Forecast for sales forecasts.
D. Integrate AWS Glue and AWS Lambda with the SAP Operational Data Provisioning (ODP) Framework to extract the data from SAP S/4HANA and to store the data in Amazon S3. Use Amazon QuickSight to perform analytics. Use Amazon Forecast for sales forecasts.
A company uses an SAP application that runs batch jobs that are performance sensitive. The batch jobs can be restarted safely. The SAP application has six application servers. The SAP application functions reliably as long as the SAP application availability remains greater than 60%. The company wants to migrate the SAP application to AWS. The company is using a cluster with two Availability Zones. How should the company distribute the SAP application servers to maintain system reliability?
A. Distribute the SAP application servers equally across three partition placement groups.
B. Distribute the SAP application servers equally across three Availability Zones.
C. Distribute the SAP application servers equally across two Availability Zones.
D. Create an Amazon EC2 Auto Scaling group across two Availability Zones. Set a minimum capacity value of 4.
A company hosts multiple SAP applications on Amazon EC2 instances in a VPC. While monitoring the environment, the company notices that multiple port scans are attempting to connect to SAP portals inside the VPC. These port scans are originating from the same IP address block. The company must deny access to the VPC from all the offending IP addresses for the next 24 hours. Which solution will meet this requirement?
A. Modify network ACLs that are associated with all public subnets in the VPC to deny access from the IP address block.
B. Add a rule in the security group of the EC2 instances to deny access from the IP address block.
C. Create a policy in AWS Identity and Access Management (IAM) to deny access from the IP address block.
D. Configure the firewall in the operating system of the EC2 instances to deny access from the IP address block.
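Option A maps to a single deny rule on the subnet network ACL. A sketch with a placeholder ACL ID and an example offending CIDR block; the rule number only has to sort before any allow rule that would otherwise match.

```python
import boto3

ec2 = boto3.client("ec2")

# Deny all traffic from the offending IP address block on the public subnets' network ACL.
# ACL ID and CIDR are placeholders.
ec2.create_network_acl_entry(
    NetworkAclId="acl-0123456789abcdef0",
    RuleNumber=50,          # evaluated before higher-numbered allow rules
    Protocol="-1",          # all protocols
    RuleAction="deny",
    Egress=False,           # inbound rule
    CidrBlock="203.0.113.0/24",
)
# Remove the entry after 24 hours with ec2.delete_network_acl_entry(...).
```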
A company has implemented its ERP system on SAP S/4HANA on AWS. The system is based on Enqueue Standalone Architecture (ENSA2) and is highly available. As part of an availability test, the company failed over its system to secondary nodes in the second Availability Zone. When the system failed over, the initial licenses were no longer valid. What could be the reason for this behavior?
A. The company needs to apply SAP licenses after each failover.
B. The cluster configuration is not correct.
C. The company needs two separate sets of licenses for ASCS instances in each Availability Zone.
D. The company stopped and restarted the secondary node as part of the last maintenance.
A company migrated its SAP environment to AWS 6 months ago. The landscape consists of a few thousand Amazon EC2 instances for production, development, quality, and sandbox environments. The company wants to minimize the operational cost of the landscape without affecting system performance and availability. Which solutions will meet these requirements? (Choose two.)
A. Scale down the EC2 instance size for non-production environments.
B. Create an AWS Systems Manager document to automatically stop and start the SAP systems. Use Amazon CloudWatch to automate the scheduling of this task.
C. Review the billing data for the EC2 instances. Analyze the workload, and choose an EC2 Instance Savings Plan.
D. Create an AWS Systems Manager document to automatically stop and start the SAP systems and EC2 instances for non-production environments outside business hours. Use Amazon EventBridge to automate the scheduling of this task.
E. Create an AWS Systems Manager document to automatically stop and start the SAP systems and EC2 instances. Maintain the schedule in the Systems Manager document to automate this task.
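The scheduling in option D can be expressed as an EventBridge rule that triggers a Systems Manager Automation runbook outside business hours. The sketch below assumes a hypothetical runbook named StopSapNonProd and an existing IAM role that EventBridge can assume; both ARNs are placeholders.

```python
import boto3

events = boto3.client("events")

# Stop non-production SAP systems at 19:00 UTC on weekdays.
events.put_rule(
    Name="stop-sap-nonprod",
    ScheduleExpression="cron(0 19 ? * MON-FRI *)",
    State="ENABLED",
)

# Target a hypothetical SSM Automation runbook that stops SAP and then the EC2 instances.
events.put_targets(
    Rule="stop-sap-nonprod",
    Targets=[
        {
            "Id": "stop-sap-automation",
            "Arn": "arn:aws:ssm:eu-west-1:111111111111:automation-definition/StopSapNonProd:$DEFAULT",
            "RoleArn": "arn:aws:iam::111111111111:role/eventbridge-ssm-automation",
        }
    ],
)
# A matching "start" rule, e.g. cron(0 5 ? * MON-FRI *), would bring the systems back up.
```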
An SAP solutions architect is using AWS Systems Manager Distributor to install the AWS Data Provider for SAP on production SAP application servers and SAP HANA database servers. The SAP application servers and the SAP HANA database servers are running on Red Hat Enterprise Linux. The SAP solutions architect chooses instances manually in Systems Manager Distributor and schedules installation. The installation fails with an access and authorization error related to Amazon CloudWatch and Amazon EC2 instances. There is no error related to AWS connectivity. What should the SAP solutions architect do to resolve the error?
A. Install the CloudWatch agent on the servers before installing the AWS Data Provider for SAP.
B. Download the AWS Data Provider for SAP installation package from AWS Marketplace. Use an operating system super user to install the agent manually or through a script.
C. Create an IAM role. Attach the appropriate policy to the role. Attach the role to the appropriate EC2 instances.
D. Wait until Systems Manager Agent is fully installed and ready to use on the EC2 instances. Use Systems Manager Patch Manager to perform the installation.
A company is planning to migrate its on-premises SAP ERP Central Component (SAP ECC) system on SAP HANA to AWS. Each month, the system experiences two peaks in usage. The first peak is on the 21st day of the month when the company runs payroll. The second peak is on the last day of the month when the company processes and exports credit data. Both peak workloads are of high importance and cannot be rescheduled. The current SAP ECC system has six application servers, all of a similar size. During normal operation outside of peak usage, four application servers would suffice. Which purchasing option will meet the company’s requirements MOST cost-effectively on AWS?
A. Four Reserved Instances and two Spot Instances
B. Six On-Demand Instances
C. Six Reserved Instances
D. Four Reserved Instances and two On-Demand Instances
A company is evaluating options to migrate its on-premises SAP ERP Central Component (SAP ECC) EHP 8 system to AWS. The company does not want to make any changes to the SAP versions or database versions. The system runs on SUSE Linux Enterprise Server and SAP HANA 2.0 SPS 05. The existing on-premises system has a 1 TB database. The company has 1 Gbps of internet bandwidth available for the migration. The company must complete the migration with the least possible downtime and disruption to business. Which solution will meet these requirements?
A. Install SAP ECC EHP 8 on Amazon EC2 instances. Use the same SAP SID and kernel version that the source system uses. Install SAP HANA on EC2 instances. Use the same version of SAP HANA that the source system uses. Take a full backup of the source SAP HANA database to disk. Copy the backup by using an AWS Storage Gateway Tape Gateway. Restore the backup on the target SAP HANA instance that is running on Amazon EC2.
B. Install SAP ECC EHP 8 on Amazon EC2 instances. Use the same SAP SID and kernel version that the source system uses. Install SAP HANA on EC2 instances. Use the same version of SAP HANA that the source database uses. Establish replication at the source, and register the SAP HANA instance that is running on Amazon EC2 as secondary. After the systems are synchronized, initiate a takeover so that the SAP HANA instance that is running on Amazon EC2 becomes primary. Shut down the on-premises system. Start SAP on the EC2 instances.
C. Install SAP ECC EHP 8 on Amazon EC2 instances. Use the same SAP SID and kernel version that the source system uses. Install SAP HANA on EC2 instances. Use the same version that the source system uses. Take a full offline backup of the source SAP HANA database. Copy the backup to Amazon S3 by using the AWS CLI. Restore the backup on a target SAP HANA instance that runs on Amazon EC2. Start SAP on the EC2 instances.
D. Take an offline SAP Software Provisioning Manager export of the on-premises system. Use an AWS Storage Gateway File Gateway to transfer the export. Import the export on Amazon EC2 instances to create the target SAP system.
A company wants to improve the RPO and RTO for its SAP disaster recovery (DR) solution by running the DR solution on AWS. The company is running SAP ERP Central Component (SAP ECC) on SAP HANA. The company has set an RPO of 15 minutes and an RTO of 4 hours. The production SAP HANA database is running on a physical appliance that has x86 architecture. The appliance has 1 TB of memory, and the SAP HANA global allocation limit is set to 768 GB. The SAP application servers are running as VMs on VMware, and they store data on an NFS file system. The company does not want to change any existing SAP HANA parameters that are related to data and log backup for its on-premises systems. What should an SAP solutions architect do to meet the DR objectives MOST cost-effectively?
A. For the SAP HANA database, change the log backup frequency to 5 minutes. Move the data and log backups to Amazon S3 by using the AWS CLI or AWS DataSync. Launch the SAP HANA database. For the SAP application servers, export the VMs as AMIs by using the VM Import/Export feature from AWS. For NFS file shares /sapmnt and /usr/sap/trans, establish real-time synchronization from DataSync to Amazon Elastic File System (Amazon EFS).
B. For the SAP HANA database, change the log backup frequency to 5 minutes. Move the data and log backups to Amazon S3 by using AWS Storage Gateway File Gateway. For the SAP application servers, export the VMs as AMIs by using the VM Import/Export feature from AWS. For NFS file shares /sapmnt and /usr/sap/trans, establish real-time synchronization from AWS DataSync to Amazon Elastic File System (Amazon EFS).
C. For the SAP HANA database, SAP application servers, and NFS file shares, use CloudEndure Disaster Recovery to replicate the data continuously from on premises to AWS. Use CloudEndure Disaster Recovery to launch target instances in the event of a disaster.
D. For the SAP HANA database, use a smaller SAP certified Amazon EC2 instance. Use SAP HANA system replication with ASYNC replication mode to replicate the data continuously from on premises to AWS. For the SAP application servers, use CloudEndure Disaster Recovery for continuous data replication. For NFS file shares /sapmnt and /usr/sap/trans, establish real-time synchronization from AWS DataSync to Amazon Elastic File System (Amazon EFS).
A company is planning to migrate its SAP S/4HANA and SAP BW/4HANA workloads to AWS. The company is currently using a third-party solution to back up its SAP HANA database and application. The company wants to retire the third-party backup solution after the migration to AWS. The company needs a backup solution on AWS to manage its SAP HANA database and application backups. The solution must provide secure storage of backups and must optimize cost. Which solution will meet these requirements?
A. Use SAP HANA Studio, SAP HANA HDBSQL, and SAP HANA Cockpit to perform backups to local Amazon Elastic Block Store (Amazon EBS) volumes. Enable EBS volume encryption. Use AWS Backup to perform application backups with AMIs or snapshots to Amazon S3. Enable S3 encryption.
B. Use SAP HANA Cockpit to implement a backup policy and perform SAP HANA database backups to Amazon S3 with AWS Backint Agent for SAP HANA. Enable S3 encryption. Use AWS Backup with backup plans to perform application backups with AMIs or snapshots. Enable S3 encryption.
C. Use AWS Backup with backup plans to perform SAP HANA database backups to Amazon S3 with AWS Backint Agent for SAP HANA. Enable S3 encryption. Use AWS Backup with backup plans to perform application backups with AMIs or snapshots. Enable S3 encryption.
D. Use SAP HANA Studio, SAP HANA HDBSQL, and SAP HANA Cockpit to perform backups to local Amazon Elastic Block Store (Amazon EBS) volumes. Copy the backups to Amazon S3. Use AWS Backup to schedule application backups with AMIs or snapshots to Amazon S3.
A company has a 48 TB SAP application that runs on premises and uses an IBM Db2 database. The company needs to migrate the application to AWS. The company has strict uptime requirements for the application with maximum downtime of 24 hours each weekend. The company has established a 1 Gbps AWS Direct Connect connection but can spare bandwidth for migration only during non-business hours or weekends. How can the company meet these requirements to migrate the application to AWS?
A. Use SAP Software Provisioning Manager to create an export of the data. Move this export to AWS during a weekend by using the Direct Connect connection. On AWS, import the data into the target SAP application. Perform the cutover.
B. Set up database replication from on premises to AWS. On the day of downtime, ensure that the replication finishes. Perform cutover to AWS.
C. Use an AWS Snowball Edge Storage Optimized device to send an initial backup to AWS. Capture incremental backups daily. When the initial backup is on AWS, perform database restore from the initial backup and keep applying incremental backups. On the day of cutover, perform the final incremental backup. Perform cutover to AWS.
D. Use AWS Application Migration Service (CloudEndure Migration) to migrate the database to AWS. On the day of cutover, switch the application to run on AWS servers.
A company is planning to migrate its on-premises SAP application to AWS. The application runs on VMware vSphere. The SAP ERP Central Component (SAP ECC) server runs on an IBM Db2 database that is 2 TB in size. The company wants to migrate the database to SAP HANA. Which migration strategy will meet these requirements?
A. Use AWS Application Migration Service (CloudEndure Migration).
B. Use SAP Software Update Manager (SUM) Database Migration Option (DMO) with System Move.
C. Use AWS Server Migration Service (AWS SMS).
D. Use AWS Database Migration Service (AWS DMS).
A company’s SAP basis team is responsible for database backups in Amazon S3. The company frequently needs to restore the last 3 months of backups into the pre-production SAP system to perform tests and analyze performance. Previously, an employee accidentally deleted backup files from the S3 bucket. The SAP basis team wants to prevent accidental deletion of backup files in the future. Which solution will meet these requirements?
A. Create a new resource-based policy that prevents deletion of the S3 bucket.
B. Enable versioning and multi-factor authentication (MFA) on the S3 bucket.
C. Create signed cookies for the backup files in the S3 bucket. Provide the signed cookies to authorized users only.
D. Apply an S3 Lifecycle policy to move the backup files immediately to S3 Glacier.
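Option B combines S3 Versioning with MFA Delete. MFA Delete can only be enabled through the API or CLI using the bucket owner's (root) credentials and MFA device; the sketch below uses placeholder values for the bucket, device ARN, and token.

```python
import boto3

s3 = boto3.client("s3")  # must be called with the bucket owner's root credentials

# Enable versioning and MFA Delete on the backup bucket.
# Bucket name, MFA device ARN, and token code are placeholders.
s3.put_bucket_versioning(
    Bucket="example-sap-backups",
    MFA="arn:aws:iam::111111111111:mfa/root-account-mfa-device 123456",
    VersioningConfiguration={
        "Status": "Enabled",
        "MFADelete": "Enabled",
    },
)
# With MFA Delete on, permanently deleting an object version or suspending
# versioning requires a valid MFA token.
```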
A company is migrating its SAP S/4HANA landscape from on premises to AWS. An SAP solutions architect is designing a backup solution for the SAP S/4HANA landscape on AWS. The backup solution will use AWS Backint Agent for SAP HANA (AWS Backint agent) to store backups in Amazon S3. The company's backup policy for source systems requires a retention period of 150 days for weekly full online backups. The backup policy requires a retention period of 30 days for daily transaction log backups. The company must keep the same backup policy on AWS while maximizing data resiliency. The company needs the ability to retrieve the backup data one or two times each year within 10 hours of the retrieval request. The SAP solutions architect must configure AWS Backint agent and S3 Lifecycle rules according to these parameters. Which solution will meet these requirements MOST cost-effectively?
A. Configure the target S3 bucket to use S3 Glacier Deep Archive for the backup files. Create S3 Lifecycle rules on the S3 bucket to delete full online backup files that are older than 150 days and to delete log backup files that are older than 30 days.
B. Configure the target S3 bucket to use S3 Standard storage for the backup files. Create an S3 Lifecycle rule on the S3 bucket to move all the backup files to S3 Glacier Instant Retrieval. Create additional S3 Lifecycle rules to delete full online backup files that are older than 150 days and to delete log backup files that are older than 30 days.
C. Configure the target S3 bucket to use S3 One Zone-Infrequent Access (S3 One Zone-IA) for the backup files. Create S3 Lifecycle rules on the S3 bucket to move full online backup files that are older than 30 days to S3 Glacier Flexible Retrieval and to delete log backup files that are older than 30 days. Create an additional S3 Lifecycle rule to delete full online backup files that are older than 150 days.
D. Configure the target S3 bucket to use S3 Standard-Infrequent Access (S3 Standard-IA) for the backup files. Create S3 Lifecycle rules on the S3 bucket to move full online backup files that are older than 30 days to S3 Glacier Flexible Retrieval and to delete log backup files that are older than 30 days. Create an additional S3 Lifecycle rule to delete full online backup files that are older than 150 days.
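The lifecycle rules described in option D translate directly into an S3 lifecycle configuration. A sketch, assuming full backups and log backups are written under separate prefixes; the bucket name and prefixes are hypothetical.

```python
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-sap-hana-backups",   # placeholder bucket name
    LifecycleConfiguration={
        "Rules": [
            {   # Weekly full online backups: Glacier Flexible Retrieval after 30 days,
                # delete after 150 days.
                "ID": "full-backups",
                "Filter": {"Prefix": "full/"},
                "Status": "Enabled",
                "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
                "Expiration": {"Days": 150},
            },
            {   # Daily transaction log backups: delete after 30 days.
                "ID": "log-backups",
                "Filter": {"Prefix": "log/"},
                "Status": "Enabled",
                "Expiration": {"Days": 30},
            },
        ]
    },
)
```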
A company wants to migrate its SAP landscape from on premises to AWS. What are the MINIMUM requirements that the company must meet to ensure full support of SAP on AWS? (Choose three.)
A. Enable detailed monitoring for Amazon CloudWatch on each instance in the landscape.
B. Deploy the infrastructure by using SAP Cloud Appliance Library.
C. Install, configure, and run the AWS Data Provider for SAP on each instance in the landscape.
D. Protect all production instances by using Amazon EC2 automatic recovery.
E. Deploy the infrastructure for the SAP landscape by using AWS Launch Wizard for SAP.
F. Deploy the SAP landscape on an AWS account that has either an AWS Business Support plan or an AWS Enterprise Support plan.
An SAP specialist is building an SAP environment. The SAP environment contains Amazon EC2 instances that run in a private subnet in a VPC. The VPC includes a NAT gateway. The SAP specialist is setting up IBM Db2 high availability disaster recovery for the SAP cluster. After configuration of overlay IP address routing, traffic is not routing to the database EC2 instances. What should the SAP specialist do to resolve this issue?
A. Open a security group for SAP ports to allow traffic on port 443.
B. Create route table entries to allow traffic from the database EC2 instances to the NAT gateway.
C. Turn off the source/destination check for the database EC2 instances.
D. Create an IAM role that has permission to access network traffic. Associate the role with the database EC2 instances.
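Overlay IP routing sends traffic to an address outside the VPC CIDR, so the database instances must be allowed to receive packets that are not addressed to their primary private IP. That is what disabling the source/destination check does (option C); a one-call sketch with a placeholder instance ID:

```python
import boto3

ec2 = boto3.client("ec2")

# Disable the source/destination check so the instance can receive traffic
# routed to the overlay IP address. Instance ID is a placeholder.
ec2.modify_instance_attribute(
    InstanceId="i-0123456789abcdef0",
    SourceDestCheck={"Value": False},
)
```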
A company is planning to move its on-premises SAP HANA database to AWS. The company needs to migrate this environment to AWS as quickly as possible. An SAP solutions architect will use AWS Launch Wizard for SAP to deploy this SAP HANA workload. Which combination of steps should the SAP solutions architect follow to start the deployment of this workload on AWS? (Choose three.)
A. Download the SAP HANA software.
B. Download the AWS CloudFormation template for the SAP HANA deployment.
C. Download and extract the SAP HANA software. Upload the SAP HANA software to an FTP server that Launch Wizard can access.
D. Upload the unextracted SAP HANA software to an Amazon S3 destination bucket. Follow the S3 file path syntax for the software in accordance with Launch Wizard recommendations.
E. Bring the operating system AMI by using the Bring Your Own Image (BYOI) model, or purchase the subscription for the operating system AMI from AWS Marketplace.
F. Create the SAP file system by using Amazon Elastic Block Store (Amazon EBS) before the deployment.
An SAP consultant is planning a migration of an on-premises SAP landscape to AWS. The landscape includes databases from Oracle, IBM Db2, and Microsoft SQL Server. The system copy procedure accesses the copied data on the destination system to complete the copy. Which password must the SAP consultant obtain from the source system before the SAP consultant initiates the export or backup?
A. The password of the <sid>adm operating system user
B. The password of the SAP* user in client 000
C. The password of the administrator user of the database
D. The password of the DDIC user in client 000
A company runs its SAP Business Suite on SAP HANA systems on AWS. The company's production SAP ERP Central Component (SAP ECC) system uses an x1e.32xlarge (memory optimized) Amazon EC2 instance and is 3.5 TB in size. Because of expected future growth, the company needs to resize the production system to use a u-* EC2 High Memory instance. The company must resize the system as quickly as possible and must minimize downtime during the resize activities. Which solution will meet these requirements?
A. Resize the instance by using the AWS Management Console or the AWS CLI.
B. Create an AMI of the source system. Launch a new EC2 High Memory instance that is based on that AMI.
C. Launch a new EC2 High Memory instance. Install and configure SAP HANA on the new instance by using AWS Launch Wizard for SAP. Use SAP HANA system replication to migrate the data to the new instance.
D. Launch a new EC2 High Memory instance. Install and configure SAP HANA on the new instance by using AWS Launch Wizard for SAP. Use SAP HANA backup and restore to back up the source system directly to Amazon S3 and to migrate the data to the new instance.
An SAP engineer is configuring AWS Backint Agent for SAP HANA (AWS Backint agent) for an SAP HANA database that is running on an Amazon EC2 instance. After the configuration, the backups fail. During investigation, the SAP engineer notices that the AWS Backint agent logs contain numerous AccessDenied messages. Which actions should the SAP engineer take to resolve this issue? (Choose two.)
A. Update the EC2 role permissions to allow S3 bucket access.
B. Verify that the configuration file has the correct formatting of the S3BucketOwnerAccountID.
C. Install AWS Systems Manager Agent (SSM Agent) correctly by using the sudo command.
D. Install the correct version of Python for AWS Backint agent.
E. Add the execute permission to the AWS Backint agent binary.
A company wants to migrate its SAP workloads to AWS from another cloud provider. The company’s landscape consists of SAP S/4HANA, SAP BW/4HANA, SAP Solution Manager, and SAP Web Dispatcher. SAP Solution Manager is running on SAP HANA. The company wants to change the operating system from SUSE Linux Enterprise Server to Red Hat Enterprise Linux as a part of this migration. The company needs a solution that results in the least possible downtime for the SAP S/4HANA and SAP BW/4HANA systems. Which migration solution will meet these requirements?
A. Use SAP Software Provisioning Manager to perform a system export/import for SAP S/4HANA, SAP BW/4HANA, SAP Solution Manager, and SAP Web Dispatcher.
B. Use backup and restore for SAP S/4HANA, SAP BW/4HANA, and SAP Solution Manager. Reinstall SAP Web Dispatcher on AWS with the necessary configuration.
C. Use backup and restore for SAP S/4HANA and SAP BW/4HANA. Use SAP Software Provisioning Manager to perform a system export/import for SAP Solution Manager. Reinstall SAP Web Dispatcher on AWS with the necessary configuration.
D. Use SAP HANA system replication to replicate the data between the source system and the target AWS system for SAP S/4HANA and SAP BW/4HANA. Use SAP Software Provisioning Manager to perform a system export/import for SAP Solution Manager. Reinstall SAP Web Dispatcher on AWS with the necessary configuration.
A company is planning to deploy SAP HANA on AWS. The block storage that hosts the SAP HANA data volume must have at least 64,000 IOPS per volume and must have a maximum throughput of at least 500 MiB/s per volume. Which Amazon Elastic Block Store (Amazon EBS) volume meets these requirements?
A. General Purpose SSD (gp2) EBS volume
B. General Purpose SSD (gp3) EBS volume
C. Provisioned IOPS SSD (io2) EBS volume
D. Throughput Optimized HDD (st1) EBS volume
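As a point of reference, provisioning an io2 volume at the stated 64,000 IOPS looks like the following; the Availability Zone and size are illustrative values.

```python
import boto3

ec2 = boto3.client("ec2")

# Create an io2 volume for the SAP HANA data filesystem with 64,000 provisioned IOPS.
ec2.create_volume(
    AvailabilityZone="eu-west-1a",   # placeholder AZ
    Size=1024,                       # GiB, placeholder size
    VolumeType="io2",
    Iops=64000,
)
# gp3 tops out at 16,000 IOPS per volume, which is why it cannot meet this requirement.
```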
A company needs to migrate its SAP HANA landscape from an on-premises data center to AWS. The company's existing SAP HANA database instance is oversized. The company must resize the database instance as part of the migration. Which combination of steps should the company take to ensure that the target Amazon EC2 instance is sized optimally for the SAP HANA database instance? (Choose two.)
A. Determine the peak memory utilization of the existing on-premises SAP HANA system.
B. Determine the average memory utilization of the existing on-premises SAP HANA system.
C. For the target system, select any SAP certified EC2 instance that provides more memory than the current average memory utilization.
D. For the target system, select the smallest SAP certified EC2 instance that provides more memory than the current peak memory utilization.
E. For the target system, select any current-generation EC2 memory optimized instance.
A company is running an SAP HANA database on AWS. The company wants to manage historical, infrequently accessed warm data for a native SAP HANA use case. An SAP solutions architect needs to recommend a solution that can provide online data storage in extended store, available for queries and updates. The solution must be an integrated component of the SAP HANA database and must allow the storage of up to five times more data in the warm tier than in the hot tier. Which solution will meet these requirements?
A. Use Amazon Data Lifecycle Manager (Amazon DLM) with SAP Data Hub to move data in and out of the SAP HANA database to Amazon S3.
B. Use an SAP HANA extension node.
C. Use SAP HANA dynamic tiering as an optional add-on to the SAP HANA database.
D. Use Amazon Data Lifecycle Manager (Amazon DLM) with SAP HANA spark controller so that SAP HANA can access the data through the Spark SQL SDA adapter.
A company's SAP solutions architect is configuring a network architecture for an SAP HANA multi-node environment. The company requires isolation of the logical network zones: client, internal, and storage. The database runs on X1 (memory optimized) Amazon EC2 instances and uses Amazon Elastic Block Store (Amazon EBS) volumes for persistent storage. Which combination of actions will provide the required isolation? (Choose three.)
A. Attach an AWS Network Firewall policy for each zone to the subnet for the node cluster.
B. Attach a secondary elastic network interface to each instance for the internal communications between nodes.
C. Attach a secondary elastic network interface to each instance for the storage communications.
D. Configure a security group with rules that allow only TCP connections within the security group on the ports that are assigned for the internal network connections. Associate the security group with the appropriate elastic network interface on each instance.
E. Configure a security group with rules that allow only TCP connections with the external customer network on the ports that are assigned for the client connections. Associate the security group with the appropriate elastic network interface.
F. Configure a security group with rules that allow Non-Volatile Memory Express (NVMe) connections within the subnet range. Associate the security group with the appropriate elastic network interface on each instance.
A company is running its production SAP HANA system on AWS. The SAP HANA system is hosted on an Amazon EC2 instance that runs SUSE Linux Enterprise Server 12. The operating system patch version is out of date, and SAP has identified some critical security vulnerabilities. SUSE publishes a critical patch update that requires a system restart to fix the issue. The company must apply this patch along with many prerequisite patches. Which solution will meet these requirements with the LEAST system downtime?
A. Use the SUSE Linux Enterprise Server patching update process and SUSE tools to apply the required patches to the existing instance.
B. Use AWS Systems Manager Patch Manager to automatically apply the patches to the existing instance.
C. Use AWS Launch Wizard for SAP to provision a second SAP HANA instance with an AMI that contains the required patches. Use SAP HANA system replication to copy the data from the original SAP HANA instance to the newly launched SAP HANA instance. Perform SAP HANA system replication takeover.
D. Use AWS Launch Wizard for SAP to provision a second SAP HANA instance with an AMI that contains the required patches. Use SAP HANA native backup and restore to copy the data from the original SAP HANA instance to the newly launched SAP HANA instance.
A company migrated its SAP ERP Central Component (SAP ECC) environment to an m4.large Amazon EC2 instance (Xen based) in 2016. The company changed the instance type to m5.xlarge (KVM based). Since the change, users are receiving a pop-up box that indicates that the SAP license will expire soon. What could be the cause of this issue?
A. The change from the Xen-based m4.large instance type to the KVM-based m5.xlarge instance type is not allowed.
B. The Xen-based m4.large instance was running with a lower kernel patch level (SAP Kernel 7.49 Patch Level 401). When the change to a KVM-based instance occurred, the hardware key changed. The instance requires a new license.
C. The Xen-based m4.large instance was running with a higher kernel patch level (SAP Kernel 7.49 Patch Level 500). When the change to a KVM-based instance occurred, the hardware key changed. The instance requires a new license.
D. Whenever an instance type changes, the change requires a new license.
A company wants to migrate its SAP S/4HANA software from on premises to AWS in a few weeks. An SAP solutions architect plans to use AWS Launch Wizard for SAP to automate the SAP deployment on AWS. Which combination of steps must the SAP solutions architect take to use Launch Wizard to meet these requirements? (Choose two.)
A. Download the SAP software files from the SAP Support Portal. Upload the SAP software files to Amazon S3. Provide the S3 bucket path as an input to Launch Wizard.
B. Provide the SAP S-user ID and password as inputs to Launch Wizard to download the software automatically.
C. Format the S3 file path syntax according to the Launch Wizard deployment recommendation.
D. Use an AWS CloudFormation template for the automated deployment of the SAP landscape.
E. Provision Amazon EC2 instances. Tag the instances to install SAP S/4HANA on them.
A company is preparing a greenfield deployment of SAP S/4HANA on AWS. The company wants to ensure that this new SAP S/4HANA landscape is fully supported by SAP. The company's SAP solutions architect needs to set up a new SAProuter connection directly to SAP from the company's landscape within the VPC. Which combination of steps must the SAP solutions architect take to accomplish this goal? (Choose three.)
A. Launch the instance that the SAProuter software will be installed on into a private subnet of the VPC. Assign the instance an Elastic IP address.
B. Launch the instance that the SAProuter software will be installed on into a public subnet of the VPC. Assign the VPC an Elastic IP address.
C. Launch the instance that the SAProuter software will be installed on into a public subnet of the VPC. Assign the instance an overlay IP address.
D. Create a specific security group for the SAProuter instance. Configure rules to allow the required inbound and outbound access to the SAP support network. Include a rule that allows inbound traffic to TCP port 3299.
E. Create a specific security group for the SAProuter instance. Configure rules to allow the required inbound and outbound access to the SAP support network. Include a rule that denies inbound traffic to TCP port 3299.
F. Use a Secure Network Communication (SNC) internet connection.
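The security-group rule from option D can be expressed as a single ingress authorization on TCP port 3299, the standard SAProuter port. The group ID and the SAP support network CIDR below are placeholders; use the addresses SAP publishes for your connection type.

```python
import boto3

ec2 = boto3.client("ec2")

# Allow inbound SAProuter traffic (TCP 3299) from the SAP support network.
# Security group ID and CIDR are placeholders.
ec2.authorize_security_group_ingress(
    GroupId="sg-0123456789abcdef0",
    IpPermissions=[
        {
            "IpProtocol": "tcp",
            "FromPort": 3299,
            "ToPort": 3299,
            "IpRanges": [
                {"CidrIp": "198.51.100.0/24", "Description": "SAP support network (example)"}
            ],
        }
    ],
)
```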
A global retail company is running its SAP landscape on AWS. Recently, the company made changes to its SAP Web Dispatcher architecture. The company added an additional SAP Web Dispatcher for high availability with an Application Load Balancer (ALB) to balance the load between the two SAP Web Dispatchers. When users try to access SAP through the ALB, the system is reachable. However, the SAP backend system is showing an error message. An investigation reveals that the issue is related to SAP session handling and distribution of requests. The company confirmed that the system was working as expected with one SAP Web Dispatcher. The company replicated the configuration of that SAP Web Dispatcher to the new SAP Web Dispatcher. How can the company resolve the error?
A. Maintain persistence by using session cookies. Enable session stickiness (session affinity) on the SAP Web Dispatchers by setting the wdisp/HTTP/esid_support parameter to True.
B. Maintain persistence by using session cookies. Enable session stickiness (session affinity) on the ALB.
C. Turn on host-based routing on the ALB to route traffic between the SAP Web Dispatchers.
D. Turn on URL-based routing on the ALB to route traffic to the application based on URL.
A company wants to implement SAP HANA on AWS with the Multi-AZ deployment option by using AWS Launch Wizard for SAP. The solution will use SUSE Linux Enterprise High Availability Extension for the high availability deployment. An SAP solutions architect must ensure that all the prerequisites are met. The SAP solutions architect also must ensure that the user inputs to start the guided deployment of Launch Wizard are valid. Which combination of steps should the SAP solutions architect take to meet these requirements? (Choose two.)
A. Before starting the Launch Wizard deployment, create the underlying Amazon Elastic Block Store (Amazon EBS) volume types to use for SAP HANA data and log volumes based on the performance requirements.
B. Use a value for the PaceMakerTag parameter that is not used by any other Amazon EC2 instances in the AWS Region where the system is being deployed.
C. Ensure that the virtual hostname for the SAP HANA database that is used for the SUSE Linux Enterprise High Availability Extension configuration is not used in any other deployed accounts.
D. Ensure that the VirtualIPAddress parameter is outside the VPC CIDR and is not being used in the route table that is associated with the subnets where the primary and secondary SAP HANA instances will be deployed.
E. Before starting the Launch Wizard deployment, set up the SUSE Linux Enterprise High Availability Extension network configuration and security group.
A company is planning to migrate its on-premises SAP applications to AWS. The applications are based on Windows operating systems. A file share stores the transport directories and third-party application data on the network-attached storage of the company’s on-premises data center. The company’s plan is to lift and shift the SAP applications and the file share to AWS. The company must follow AWS best practices for the migration. Which AWS service should the company use to host the transport directories and third-party application data on AWS?
A. Amazon Elastic Block Store (Amazon EBS)
B. AWS Storage Gateway
C. Amazon Elastic File System (Amazon EFS)
D. Amazon FSx for Windows File Server
A company hosts its SAP NetWeaver workload on SAP HANA in the AWS Cloud. The SAP NetWeaver application is protected by a cluster solution that uses Red Hat Enterprise Linux High Availability Add-On. The cluster solution uses an overlay IP address to ensure that the high availability cluster is still accessible during failover scenarios. An SAP solutions architect needs to facilitate the network connection to this overlay IP address from multiple locations. These locations include more than 25 VPCs, other AWS Regions, and the on-premises environment. The company already has set up an AWS Direct Connect connection between the on-premises environment and AWS. What should the SAP solutions architect do to meet these requirements in the MOST scalable manner?
A. Use VPC peering between the VPCs to route traffic between them.
B. Use AWS Transit Gateway to connect the VPCs and on-premises networks together.
C. Use a Network Load Balancer to route connections to various targets within VPCs.
D. Deploy a Direct Connect gateway to connect the Direct Connect connection over a private VIF to one or more VPCs in any accounts.
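Option B's hub-and-spoke model starts with a transit gateway and one VPC attachment per spoke. A minimal sketch with placeholder IDs; the route-table entries toward the overlay IP address and the Direct Connect gateway association are additional steps not shown here.

```python
import boto3

ec2 = boto3.client("ec2")

# Create the transit gateway that will interconnect the VPCs and the on-premises network.
tgw = ec2.create_transit_gateway(
    Description="SAP landscape hub",
    Options={
        "AutoAcceptSharedAttachments": "enable",
        "DefaultRouteTableAssociation": "enable",
        "DefaultRouteTablePropagation": "enable",
    },
)
tgw_id = tgw["TransitGateway"]["TransitGatewayId"]

# Attach one of the SAP VPCs (VPC and subnet IDs are placeholders).
ec2.create_transit_gateway_vpc_attachment(
    TransitGatewayId=tgw_id,
    VpcId="vpc-0123456789abcdef0",
    SubnetIds=["subnet-0123456789abcdef0"],
)
# Repeat the attachment per VPC, then add a route for the overlay IP address
# in each spoke VPC route table that points at the transit gateway.
```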
An SAP database analyst installs AWS Backint Agent for SAP HANA (AWS Backint agent) by using AWS Systems Manager. The SAP database analyst runs an initial test to perform a database backup for a 512 GB SAP HANA database. The database runs on an SAP certified Amazon EC2 instance type with General Purpose SSD (gp2) Amazon Elastic Block Store (Amazon EBS) volumes for all disk storage. The backup is running too slowly. Which actions should the SAP database analyst take to improve the performance of AWS Backint agent? (Choose two.)
A. Set the parallel_data_backup_backint_channels parameter to a number greater than 1.
B. Select a Provisioned IOPS SSD (io2) volume as the backup target for AWS Backint agent.
C. Delete unnecessary older backup files from backups that AWS Backint agent performed.
D. Change the existing gp2-based SAP HANA data volumes to the Provisioned IOPS SSD (io2) EBS volume type.
E. Reinstall AWS Backint agent by using the AWS Backint installer rather than the Systems Manager document.
A company is using a multi-account strategy for SAP HANA and SAP BW/4HANA instances across development, QA, and production systems in the same AWS Region. Each system is hosted in its own VPC. The company needs to establish cross-VPC communication between the SAP systems. The company might add more SAP systems in the future. The company must create connectivity across the SAP systems and hundreds of AWS accounts. The solution must maximize scalability and reliability. Which solution will meet these requirements?
A. Create an AWS Transit Gateway in a central networking account. Attach the transit gateway to the AWS accounts. Set up routing and a network ACL to establish communication.
B. Set up VPC peering between the accounts. Configure routing in each VPC to use the VPC peering links.
C. Create a transit VPC that uses the hub-and-spoke model. Set up routing to use the transit VPC for communication between the SAP systems.
D. Create a VPC link for each SAP system. Use the VPC links to connect the SAP systems.
A company decides to deploy SAP non-production systems on AWS by using the standard installation model in a single Availability Zone. The company will use Amazon Elastic File System (Amazon EFS) to host SAP file systems such as /sapmnt and /usr/sap/trans. The company launches the required Amazon EC2 instances to host these systems. However, the company cannot mount the EFS file systems to the respective EC2 instances. An SAP engineer needs to adjust the security groups that are assigned to the EC2 instances and EFS file systems to allow traffic between the EC2 instances and the EFS file systems. Which combination of steps should the SAP engineer take to meet these requirements? (Choose two.)
A. Configure the security groups that are associated with the EFS file systems to allow inbound access for the TCP protocol on the NFS port (TCP 2049) from all EC2 instances where the file systems are mounted.
B. Configure the security groups that are associated with the EFS file systems to allow outbound access for the TCP protocol on the NFS port (TCP 2049) from all EC2 instances where the file systems are mounted.
C. Configure the security groups that are associated with the EFS file systems to allow outbound access from the security group of the corresponding EC2 instances on the NFS port (TCP 2049).
D. Configure the security groups that are associated with the EC2 instances to allow inbound access to the EFS file systems on the NFS port (TCP 2049).
E. Configure the security groups that are associated with the EC2 instances to allow outbound access to the EFS file systems on the NFS port (TCP 2049).
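Options A and E pair up: the EFS mount targets' security group needs an inbound NFS rule that references the instances' security group, and the instances need the matching outbound rule. A sketch with placeholder group IDs; outbound is usually already open by default, so the second call may be unnecessary.

```python
import boto3

ec2 = boto3.client("ec2")

efs_sg = "sg-0aaaaaaaaaaaaaaaa"   # security group on the EFS mount targets (placeholder)
ec2_sg = "sg-0bbbbbbbbbbbbbbbb"   # security group on the SAP EC2 instances (placeholder)

# Inbound NFS (TCP 2049) on the EFS security group, sourced from the EC2 security group.
ec2.authorize_security_group_ingress(
    GroupId=efs_sg,
    IpPermissions=[
        {
            "IpProtocol": "tcp",
            "FromPort": 2049,
            "ToPort": 2049,
            "UserIdGroupPairs": [{"GroupId": ec2_sg}],
        }
    ],
)

# Outbound NFS from the EC2 security group to the EFS security group
# (only needed if the default allow-all egress rule was removed).
ec2.authorize_security_group_egress(
    GroupId=ec2_sg,
    IpPermissions=[
        {
            "IpProtocol": "tcp",
            "FromPort": 2049,
            "ToPort": 2049,
            "UserIdGroupPairs": [{"GroupId": efs_sg}],
        }
    ],
)
```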
An SAP technology consultant needs to scale up a primary application server (PAS) instance. The PAS currently runs on a c5a.xlarge Amazon EC2 instance. The SAP technology consultant needs to change the instance type to c5a.2xlarge. How can the SAP technology consultant meet this requirement?
A. Stop the complete SAP system. Stop the EC2 instance. Use the AWS Management Console or the AWS CLI to change the instance type. Start the EC2 instance. Start the complete SAP system.
B. While SAP is running, use the AWS Management Console or the AWS CLI to change the instance type without stopping the EC2 instance.
C. Stop the complete SAP system. Terminate the EC2 instance. Use the AWS Management Console or the AWS CLI to change the instance type. Start the EC2 instance. Start the complete SAP system.
D. While SAP is running, log in to the EC2 instance. Run the following AWS CLI command: aws ec2 modify-instance-attribute --instance-id <instance-id> --instance-type "{\"Value\": \"c5a.2xlarge\"}".
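The stop/resize/start sequence from option A, expressed with boto3; the instance ID is a placeholder, and stopping the SAP system cleanly beforehand happens at the OS/SAP level and is not shown.

```python
import boto3

ec2 = boto3.client("ec2")
instance_id = "i-0123456789abcdef0"  # placeholder PAS instance ID

# 1. Stop the EC2 instance (after the SAP system has been stopped cleanly).
ec2.stop_instances(InstanceIds=[instance_id])
ec2.get_waiter("instance_stopped").wait(InstanceIds=[instance_id])

# 2. Change the instance type; this is only possible while the instance is stopped.
ec2.modify_instance_attribute(
    InstanceId=instance_id,
    InstanceType={"Value": "c5a.2xlarge"},
)

# 3. Start the instance again, then restart the SAP system.
ec2.start_instances(InstanceIds=[instance_id])
ec2.get_waiter("instance_running").wait(InstanceIds=[instance_id])
```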
Free Access to the Full PAS-C01 Practice Test
If you're looking for more free PAS-C01 practice questions, click here to access the full PAS-C01 practice test.
We regularly update this page with new practice questions, so be sure to check back frequently.
Good luck with your PAS-C01 certification journey!