Practice Test Free

AI-100 Practice Questions Free


AI-100 Practice Questions Free – 50 Exam-Style Questions to Sharpen Your Skills

Are you preparing for the AI-100 certification exam? Kickstart your success with our AI-100 Practice Questions Free – a carefully selected set of 50 real exam-style questions to help you test your knowledge and identify areas for improvement.

Practicing with AI-100 practice questions free gives you a powerful edge by allowing you to:

  • Understand the exam structure and question formats
  • Discover your strong and weak areas
  • Build the confidence you need for test day success

Below, you will find 50 free AI-100 practice questions designed to match the real exam in both difficulty and topic coverage. They’re ideal for self-assessment or final review. You can click on each question to explore the details.

Question 1

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You create several AI models in Azure Machine Learning Studio.
You deploy the models to a production environment.
You need to monitor the compute performance of the models.
Solution: You enable Model data collection.
Does this meet the goal?

A. Yes

B. No

 


Suggested Answer: A

You need to enable Model data collection.
References:
https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-enable-data-collection

Question 2

You plan to deploy Azure IoT Edge devices that will each store more than 10,000 images locally and classify the images by using a Custom Vision Service classifier.
Each image is approximately 5 MB.
You need to ensure that the images persist on the devices for 14 days.
What should you use?

A. The device cache

B. Azure Blob storage on the IoT Edge devices

C. Azure Stream Analytics on the IoT Edge devices

D. Microsoft SQL Server on the IoT Edge devices

 


Suggested Answer: B

References:
https://docs.microsoft.com/en-us/azure/iot-edge/how-to-store-data-blob

Question 3

HOTSPOT -
You plan to create a bot that will support five languages. The bot will be used by users located in three different countries. The bot will answer common customer questions. The bot will use Language Understanding (LUIS) to identify which skill to use and to detect the language of the customer.
You need to identify the minimum number of Azure resources that must be created for the planned bot.
How many QnA Maker, LUIS and Text Analytics instances should you create? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
 Image

 


Suggested Answer:
Correct Answer Image

QnA Maker: 5 –
If the user plans to support multiple languages, they need to have a new QnA Maker resource for each language.
LUIS: 5 –
If you need a multi-language LUIS client application such as a chatbot, you have a few options. If LUIS supports all the languages, you develop a LUIS app for each language. Each LUIS app has a unique app ID, and endpoint log. If you need to provide language understanding for a language LUIS does not support, you can use Microsoft Translator API to translate the utterance into a supported language, submit the utterance to the LUIS endpoint, and receive the resulting scores.
Language detection: 1 –
The Language Detection feature of the Azure Text Analytics REST API evaluates text input for each document and returns language identifiers with a score that indicates the strength of the analysis.
This capability is useful for content stores that collect arbitrary text, where language is unknown. You can parse the results of this analysis to determine which language is used in the input document. The response also returns a score that reflects the confidence of the model. The score value is between 0 and 1.
The Language Detection feature can detect a wide range of languages, variants, dialects, and some regional or cultural languages. The exact list of languages for this feature isn’t published.
Reference:
https://docs.microsoft.com/en-us/azure/cognitive-services/qnamaker/overview/language-support
https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-language-support
https://docs.microsoft.com/en-us/azure/cognitive-services/text-analytics/how-tos/text-analytics-how-to-language-detection
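Consuming the language detection result described above takes only a few lines. A minimal sketch: the field names follow the documented REST response shape, and the sample documents and scores are invented for illustration.

```python
import json

# Hypothetical sample of a Text Analytics language detection response;
# field names follow the documented REST shape, values are made up.
sample_response = json.loads("""
{
  "documents": [
    {"id": "1", "detectedLanguage": {"name": "French", "iso6391Name": "fr", "confidenceScore": 0.98}},
    {"id": "2", "detectedLanguage": {"name": "Spanish", "iso6391Name": "es", "confidenceScore": 0.87}}
  ]
}
""")

def detected_languages(response):
    """Map each document id to its detected ISO 639-1 code and confidence."""
    return {
        doc["id"]: (doc["detectedLanguage"]["iso6391Name"],
                    doc["detectedLanguage"]["confidenceScore"])
        for doc in response["documents"]
    }

print(detected_languages(sample_response))
```

Because the score is between 0 and 1, a caller can route low-confidence documents to a fallback (for example, asking the user to confirm their language) before picking a LUIS app.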

Question 4

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have Azure IoT Edge devices that generate streaming data.
On the devices, you need to detect anomalies in the data by using Azure Machine Learning models. Once an anomaly is detected, the devices must add information about the anomaly to the Azure IoT Hub stream.
Solution: You deploy an Azure Machine Learning model as an IoT Edge module.
Does this meet the goal?

A. Yes

B. No

 


Suggested Answer: A

You can use IoT Edge modules to deploy code that implements your business logic directly to your IoT Edge devices. For example, you can deploy an Azure
Machine Learning module that predicts when a device fails based on simulated machine temperature data.
References:
https://docs.microsoft.com/bs-latn-ba/azure/iot-edge/tutorial-deploy-machine-learning

Question 5

DRAG DROP -
You are designing an AI solution that will use IoT devices to gather data from conference attendees, and then later analyze the data. The IoT devices will connect to an Azure IoT hub.
You need to design a solution to anonymize the data before the data is sent to the IoT hub.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Select and Place:
 Image

 


Suggested Answer:
Correct Answer Image

Step 1: Create a storage container
ASA Edge jobs run in containers deployed to Azure IoT Edge devices.
Step 2: Create an Azure Stream Analytics Edge Job
Azure Stream Analytics (ASA) on IoT Edge empowers developers to deploy near-real-time analytical intelligence closer to IoT devices so that they can unlock the full value of device-generated data.
Scenario overview:
Reference Image
Step 3: Add the job to the IoT devices in IoT
References:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-edge

Question 6

HOTSPOT -
You are designing an application to parse images of business forms and upload the data to a database. The upload process will occur once a week.
You need to recommend which services to use for the application. The solution must minimize infrastructure costs.
Which services should you recommend? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
 Image

 


Suggested Answer:
Correct Answer Image

Box 1: Azure Cognitive Services –
Azure Cognitive Services include image-processing algorithms to smartly identify, caption, index, and moderate your pictures and videos.
Not: Azure Linguistic Analytics API, which provides advanced natural language processing over raw text.
Box 2: Azure Data Factory –
The Azure Data Factory (ADF) is a service designed to allow developers to integrate disparate data sources. It is a platform somewhat like SSIS in the cloud to manage the data you have both on-prem and in the cloud.
It provides access to on-premises data in SQL Server and cloud data in Azure Storage (Blob and Tables) and Azure SQL Database.
Reference:
https://azure.microsoft.com/en-us/services/cognitive-services/
https://www.jamesserra.com/archive/2014/11/what-is-azure-data-factory/

Question 7

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are deploying an Azure Machine Learning model to an Azure Kubernetes Service (AKS) container.
You need to monitor the scoring accuracy of each run of the model.
Solution: You modify the scoring file.
Does this meet the goal?

A. Yes

B. No

 


Suggested Answer: B

 

Question 8

You plan to implement a new data warehouse for a planned AI solution.
You have the following information regarding the data warehouse:
✑ The data files will be available in one week.
✑ Most queries that will be executed against the data warehouse will be ad-hoc queries.
✑ The schemas of data files that will be loaded to the data warehouse will change often.
✑ One month after the planned implementation, the data warehouse will contain 15 TB of data.
You need to recommend a database solution to support the planned implementation.
Which two solutions should you include in the recommendation? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.

A. Apache Hadoop

B. Apache Spark

C. A Microsoft Azure SQL database

D. An Azure virtual machine that runs Microsoft SQL Server

 


Suggested Answer: AB

 

Question 9

You have Azure IoT Edge devices that collect measurements every 30 seconds.
You plan to send the measurements to an Azure IoT hub.
You need to ensure that every event is processed as quickly as possible.
What should you use?

A. Apache Kafka

B. Azure Stream Analytics record functions

C. Azure Stream Analytics windowing functions

D. Azure Machine Learning on the IoT Edge devices

 


Suggested Answer: D

Use Azure Notebooks to develop a machine learning module and deploy it to a Linux device running Azure IoT Edge.
You can use IoT Edge modules to deploy code that implements your business logic directly to your IoT Edge devices.
References:
https://docs.microsoft.com/en-us/azure/iot-edge/tutorial-deploy-machine-learning

Question 10

You need to meet the greeting requirements for Butler.
Which type of authentication should you use?

A. AdaptiveCard

B. SigninCard

C. CardCarousel

D. HeroCard

 


Suggested Answer: D

Scenario: Butler must greet users by name when they first connect.
HeroCard defines a card with a large image, title, text, and action buttons.
Incorrect Answers:
B: SigninCard defines a card that lets a user sign in to a service.
References:
https://docs.microsoft.com/en-us/azure/bot-service/bot-builder-send-welcome-message

Question 11

You are designing an AI application that will perform real-time processing by using Microsoft Azure Stream Analytics.
You need to identify the valid outputs of a Stream Analytics job.
What are three possible outputs? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.

A. A Hive table in Azure HDInsight

B. Azure SQL Database

C. Azure Cosmos DB

D. Azure Blob storage

E. Azure Redis Cache

 


Suggested Answer: BCD

References:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-define-outputs

Question 12

HOTSPOT -
You have an app that uses the Language Understanding (LUIS) API as shown in the following exhibit.
 Image
Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic.
NOTE: Each correct selection is worth one point.
Hot Area:
 Image

 


Suggested Answer:
Correct Answer Image

Box 1: train –
Utterances are input from the user that your app needs to interpret. To train LUIS to extract intents and entities from them, it’s important to capture a variety of different example utterances for each intent. Active learning, or the process of continuing to train on new utterances, is essential to machine-learned intelligence that LUIS provides.
Box 2: creating intents –
Each intent needs to have example utterances, at least 15. If you have an intent that does not have any example utterances, you will not be able to train LUIS. If you have an intent with one or very few example utterances, LUIS will not accurately predict the intent.
Box 3: never published –
In each iteration of the model, do not add a large quantity of utterances. Add utterances in quantities of 15. Train, publish, and test again.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-concept-utterance
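The 15-utterance guideline above is easy to enforce before training. A toy sketch, with invented intent names and example utterances:

```python
MIN_UTTERANCES = 15

# Hypothetical intent -> example utterances mapping for a LUIS app.
intents = {
    "BookFlight": ["book a flight to Paris", "I need a plane ticket"]
                  + [f"fly me to destination {i}" for i in range(13)],  # 15 total
    "CancelBooking": ["cancel my trip"],  # far below the recommended minimum
}

def intents_needing_more_examples(app_intents, minimum=MIN_UTTERANCES):
    """Return intents whose example-utterance count is below the recommended minimum."""
    return [name for name, utterances in app_intents.items()
            if len(utterances) < minimum]

print(intents_needing_more_examples(intents))
```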

Question 13

You need to evaluate trends in fuel prices during a period of 10 years. The solution must identify unusual fluctuations in prices and produce visual representations.
Which Azure Cognitive Services API should you use?

A. Anomaly Detector

B. Computer Vision

C. Text Analytics

D. Bing Autosuggest

 


Suggested Answer: A

The Anomaly Detector API enables you to monitor and detect abnormalities in your time series data with machine learning. The Anomaly Detector API adapts by automatically identifying and applying the best-fitting models to your data, regardless of industry, scenario, or data volume. Using your time series data, the API determines boundaries for anomaly detection, expected values, and which data points are anomalies.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/anomaly-detector/overview
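As a rough intuition for what the service automates, a simple z-score detector over a price series looks like this. The prices are invented, and the real API selects far more sophisticated, best-fitting models automatically:

```python
from statistics import mean, stdev

# Hypothetical fuel prices; the spike is the anomaly we want flagged.
prices = [1.20, 1.22, 1.19, 1.21, 1.23, 1.95, 1.22, 1.20]

def zscore_anomalies(series, threshold=2.0):
    """Flag points more than `threshold` standard deviations from the mean.
    A crude stand-in for the boundaries the Anomaly Detector API computes."""
    mu, sigma = mean(series), stdev(series)
    return [i for i, x in enumerate(series) if abs(x - mu) / sigma > threshold]

print(zscore_anomalies(prices))  # flags the spike at index 5
```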

Question 14

You have a database that contains sales data.
You plan to process the sales data by using two data streams named Stream1 and Stream2. Stream1 will be used for purchase order data. Stream2 will be used for reference data.
The reference data is stored in CSV files.
You need to recommend an ingestion solution for each data stream.
Which two solutions should you recommend? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.

A. an Azure event hub for Stream1 and Azure Blob storage for Stream2

B. Azure Blob storage for Stream1 and Stream2

C. an Azure event hub for Stream1 and Stream2

D. Azure Blob storage for Stream1 and Azure Cosmos DB for Stream2

E. Azure Cosmos DB for Stream1 and an Azure event hub for Stream2

 


Suggested Answer: AB

Stream1 – Azure Event –
Stream2 – Blob Storage –
Azure Event Hubs is a highly scalable data streaming platform and event ingestion service, capable of receiving and processing millions of events per second.
Event Hubs can process and store events, data, or telemetry produced by distributed software and devices. Data sent to an event hub can be transformed and stored using any real-time analytics provider or batching/storage adapters. Event Hubs provides publish-subscribe capabilities with low latency at massive scale, which makes it appropriate for big data scenarios.
Stream1, Stream2 – Blob Storage –
Stream Analytics has first-class integration with Azure data streams as inputs from three kinds of resources:
Azure Event Hubs –
Azure IoT Hub –
Azure Blob storage –
These input resources can live in the same Azure subscription as your Stream Analytics job or a different subscription.
References:
https://docs.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/real-time-ingestion

Question 15

Your company's developers have created an Azure Data Factory pipeline that moves data from an on-premises server to Azure Storage. The pipeline consumes Azure Cognitive Services APIs.
You need to deploy the pipeline. Your solution must minimize custom code.
You use Integration Runtime to move data to the cloud and Azure API Management to consume Cognitive Services APIs.
Does this action accomplish your objective?

A. Yes, it does

B. No, it does not

 


Suggested Answer: B

Azure API Management is a turnkey solution for publishing APIs to external and internal customers.
Reference:
https://docs.microsoft.com/en-us/azure/data-factory/concepts-integration-runtime
https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-examples-and-scenarios

Question 16

The development team at your company builds a bot by using C# and .NET.
You need to deploy the bot to Azure.
Which tool should you use?

A. the .NET Core CLI

B. the Azure CLI

C. the Git CLI

D. the AzCopy tool

 


Suggested Answer: B

The deployment process documented here uses one of the ARM templates to provision required resources for the bot in Azure by using the Azure CLI.
Note: When you create a bot using the Visual Studio template, Yeoman template, or Cookiecutter template the source code generated includes a deploymentTemplates folder that contains ARM templates.
References:
https://docs.microsoft.com/en-us/azure/bot-service/bot-builder-deploy-az-cli

Question 17

You are developing a Microsoft Bot Framework application. The application consumes structured NoSQL data that must be stored in the cloud.
You implement Azure Blob storage for the application. You want access to the blob store to be controlled by using a role.
You implement On-premises Active Directory Domain Services (AD DS).
Does this action accomplish your objective?

A. Yes, it does

B. No, it does not

 


Suggested Answer: B

On-premises Active Directory Domain Services (AD DS) authorization is not supported for Azure Blob storage.
Reference:
https://docs.microsoft.com/en-us/azure/storage/common/storage-auth

Question 18

You create an Azure Cognitive Services resource.
A developer needs to be able to retrieve the keys used by the resource. The solution must use the principle of least privilege.
What is the best role to assign to the developer?

A. Security Manager

B. Security Reader

C. Cognitive Services Contributor

D. Cognitive Services User

 


Suggested Answer: D

The Cognitive Services User lets you read and list keys of Cognitive Services.
Reference:
https://docs.microsoft.com/en-us/azure/role-based-access-control/built-in-roles

Question 19

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure SQL database, an Azure Data Lake Storage Gen 2 account, and an API developed by using Azure Machine Learning Studio.
You need to ingest data once daily from the database, score each row by using the API, and write the data to the storage account.
Solution: You create an Azure Data Factory pipeline that contains the Machine Learning Batch Execution activity.
Does this meet the goal?

A. Yes

B. No

 


Suggested Answer: A

Using the Batch Execution Activity in an Azure Data Factory pipeline, you can invoke an Azure Machine Learning Studio (classic) web service to make predictions on the data in batch
Reference:
https://docs.microsoft.com/en-us/azure/data-factory/transform-data-using-machine-learning

Question 20

You are designing a business application that will use Azure Cognitive Services to parse images of business forms. You have the following requirements:
Parsed image data must be uploaded to Azure Storage once a week.
The solution must minimize infrastructure costs.
What should you do?

A. Use Azure API Apps to upload the data.

B. Use Azure Bot Service to upload the data.

C. Use Azure Data Factory (ADF) to upload the data.

D. Use Azure Machine Learning to upload the data.

 


Suggested Answer: C

The Azure Data Factory (ADF) is a service designed to allow developers to integrate disparate data sources. It is a platform somewhat like SSIS in the cloud to manage the data you have both on-premises and in the cloud.
It provides access to on-premises data in SQL Server and cloud data in Azure Storage (Blob and Tables) and Azure SQL Database.
Reference:
https://docs.microsoft.com/en-us/azure/data-factory/introduction
https://www.jamesserra.com/archive/2014/11/what-is-azure-data-factory/

Question 21

You have an Azure Machine Learning experiment.
You need to validate that the experiment meets GDPR regulation requirements and stores documentation about the experiment.
What should you use?

A. Compliance Manager

B. an Azure Log Analytics workspace

C. Azure Table storage

D. Azure Security Center

 


Suggested Answer: A

Compliance Manager for Azure helps you assess and manage GDPR compliance. Compliance Manager is a free, Microsoft cloud services solution designed to help organizations meet complex compliance obligations, including the GDPR, ISO 27001, ISO 27018, and NIST 800-53. Generally available today for Azure customers, the Compliance Manager GDPR dashboard enables you to assign, track, and record your GDPR compliance activities so you can collaborate across teams and manage your documents for creating audit reports more easily.
References:
https://azure.microsoft.com/en-us/blog/new-capabilities-to-enable-robust-gdpr-compliance/

Question 22

Your company creates a popular mobile game.
The company tracks usage patterns of the game.
You need to provide special offers to users when there is a significant change in the usage patterns.
Which Azure Cognitive Services service should you use?

A. Form Recognizer

B. Bing Autosuggest

C. Text Analytics

D. Anomaly Detector

 


Suggested Answer: D

Reference:
https://azure.microsoft.com/en-gb/services/cognitive-services/anomaly-detector/#features

Question 23

DRAG DROP -
You need to integrate the new Bookings app and the Butler chatbot.
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Select and Place:
 Image

 


Suggested Answer:
Correct Answer Image

References:
https://docs.microsoft.com/en-us/azure/bot-service/bot-service-channel-connect-webchat?view=azure-bot-service-4.0

Question 24

DRAG DROP -
You need to build an AI solution that will be shared between several developers and customers.
You plan to write code, host code, and document the runtime all within a single user experience.
You build the environment to host the solution.
Which three actions should you perform in sequence next? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Select and Place:
 Image

 


Suggested Answer:
Correct Answer Image

Step 1: Create an Azure Machine Learning Studio workspace
Step 2: Create a notebook –
You can manage notebooks using the UI, the CLI, and by invoking the Workspace API.
To create a notebook –
1. Click the Workspace button Workspace Icon or the Home button Home Icon in the sidebar. Do one of the following:
Next to any folder, click the Menu Dropdown on the right side of the text and select Create > Notebook. Create Notebook
In the Workspace or a user folder, click Down Caret and select Create > Notebook.
2. In the Create Notebook dialog, enter a name and select the notebook’s primary language.
3. If there are running clusters, the Cluster drop-down displays. Select the cluster to attach the notebook to.
4. Click Create.
Step 3: Create a new experiment –
Create a new experiment by clicking +NEW at the bottom of the Machine Learning Studio window. Select EXPERIMENT > Blank Experiment.
References:
https://docs.azuredatabricks.net/user-guide/notebooks/notebook-manage.html
https://docs.microsoft.com/en-us/azure/machine-learning/service/quickstart-run-cloud-notebook

Question 25

HOTSPOT -
You need to configure security for an Azure Machine Learning service used by groups of data scientists. The groups must have access to only their own experiments and must be able to grant permissions to the members of their team.
What should you do? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
 Image

 


Suggested Answer:
Correct Answer Image

References:
https://docs.microsoft.com/en-us/machine-learning-server/operationalize/configure-roles#how-are-roles-assigned
https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-assign-roles

Question 26

You are designing an AI solution that will analyze millions of pictures by using Azure HDInsight Hadoop cluster.
You need to recommend a solution for storing the pictures. The solution must minimize costs.
Which storage solution should you recommend?

A. Azure Table storage

B. Azure File Storage

C. Azure Data Lake Storage Gen2

D. Azure Databricks File System

 


Suggested Answer: C

Azure Data Lake Store is optimized for storing large amounts of data for reporting and analytical and is geared towards storing data in its native format, making it a great store for non-relational data.
Reference:
https://stackify.com/store-data-azure-understand-azure-data-storage-options/

Question 27

You plan to deploy a bot that will use the following Azure Cognitive Services:
✑ Language Understanding (LUIS)
✑ Text Analytics
Your company's compliance policy states that all data used by the bot must be stored in the on-premises network.
You need to recommend a compute solution to support the planned bot.
What should you include in the recommendation?

A. an Azure Databricks cluster

B. a Docker container

C. Microsoft Machine Learning Server

D. the Azure Machine Learning service

 


Suggested Answer: B

You can deploy LUIS on-premise as Docker Image in a container.
Note: The Azure Cognitive LUIS service can be deployed on any hardware and on any host OS (Linux, Windows, and iOS). This feature allows enterprises to quickly train the LUIS model in the cloud and deploy it anywhere, which makes Cognitive Services truly available to every person and every organization – "Democratizing AI".
Reference:
https://www.linkedin.com/pulse/deploying-microsoft-azure-cognitive-luis-service-on-premise-s

Question 28

You need to design an application that will analyze real-time data from financial feeds.
The data will be ingested into Azure IoT Hub. The data must be processed as quickly as possible in the order in which it is ingested.
Which service should you include in the design?

A. Azure Data Factory

B. Azure Queue storage

C. Azure Stream Analytics

D. Azure Notification Hubs

E. Apache Kafka

F. Azure Event Hubs

 


Suggested Answer: C

Stream processing can be handled by Azure Stream Analytics. Azure Stream Analytics can run perpetual queries against an unbounded stream of data. These queries consume streams of data from storage or message brokers, filter and aggregate the data based on temporal windows, and write the results to sinks such as storage, databases, or directly to reports in Power BI. Stream Analytics uses a SQL-based query language that supports temporal and geospatial constructs, and can be extended using JavaScript.
Incorrect Answers:
E: Apache Kafka is used for ingestion, not for stream processing.
F: Azure Event Hubs is used for ingestion, not for stream processing.
Reference:
https://docs.microsoft.com/en-us/azure/architecture/data-guide/big-data/real-time-processing
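The temporal-window aggregation mentioned above is the key capability that separates Stream Analytics from pure ingestion services like Event Hubs and Kafka. A stdlib-only sketch of a tumbling-window average, mimicking what an ASA `TumblingWindow` aggregate computes (the event data is invented):

```python
from collections import defaultdict

def tumbling_window_avg(events, window_seconds=30):
    """Group (timestamp_seconds, value) events into fixed, non-overlapping
    windows and average each window -- the same idea as a TumblingWindow
    aggregate in a Stream Analytics SQL query."""
    windows = defaultdict(list)
    for ts, value in events:
        windows[int(ts // window_seconds)].append(value)
    return {w * window_seconds: sum(vs) / len(vs)
            for w, vs in sorted(windows.items())}

# Hypothetical financial feed: (seconds since start, price)
events = [(1, 100.0), (12, 102.0), (31, 99.0), (45, 101.0)]
print(tumbling_window_avg(events))
```

In a real ASA job the same logic is a one-line SQL query over the IoT Hub input, so ordering and late-arrival policies are handled by the service rather than by application code.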

Question 29

You plan to deploy Azure IoT Edge devices. Each device will store more than 10,000 images locally. Each image is approximately 5 MB.
You need to ensure that the images persist on the devices for 14 days.
What should you use?

A. Azure Stream Analytics on the IoT Edge devices

B. Azure Database for PostgreSQL

C. Azure Blob storage on the IoT Edge devices

D. Microsoft SQL Server on the IoT Edge devices

 


Suggested Answer: C

Azure Blob Storage on IoT Edge provides a block blob and append blob storage solution at the edge. A blob storage module on your IoT Edge device behaves like an Azure blob service, except the blobs are stored locally on your IoT Edge device.
This is useful when data needs to be stored locally until it can be processed or transferred to the cloud. This data can be videos, images, finance data, hospital data, or any other unstructured data.
References:
https://docs.microsoft.com/en-us/azure/iot-edge/how-to-store-data-blob
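The numbers in the question also make the sizing concrete, and the 14-day retention maps to the blob module's time-based auto-delete setting. A sketch of the arithmetic; note that the `deviceAutoDeleteProperties` field names below are an assumption based on my recollection of the module's desired properties, so verify them against the current docs:

```python
# Back-of-envelope sizing for this scenario: 10,000 images at
# roughly 5 MB each must persist locally for 14 days.
image_count = 10_000
image_size_mb = 5
required_gb = image_count * image_size_mb / 1024
print(f"Local storage needed: ~{required_gb:.0f} GB")

# Time-based auto-deletion via the module's desired properties.
# Property names are an assumption -- check the blob-on-IoT-Edge docs.
retention_minutes = 14 * 24 * 60  # 14 days
desired_properties = {
    "deviceAutoDeleteProperties": {
        "deleteOn": True,
        "deleteAfterMinutes": retention_minutes,
    }
}
print(desired_properties)
```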

Question 30

You design an AI workflow that combines data from multiple data sources for analysis. The data sources are composed of:
✑ JSON files uploaded to an Azure Storage account
✑ On-premises Oracle databases
✑ Azure SQL databases
Which service should you use to ingest the data?

A. Azure Data Factory

B. Azure SQL Data Warehouse

C. Azure Data Lake Storage

D. Azure Databricks

 


Suggested Answer: A

References:
https://docs.microsoft.com/en-us/azure/data-factory/introduction

Question 31

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an app named App1 that uses the Face API.
App1 contains several PersonGroup objects.
You discover that a PersonGroup object for an individual named Ben Smith cannot accept additional entries. The PersonGroup object for Ben Smith contains
10,000 entries.
You need to ensure that additional entries can be added to the PersonGroup object for Ben Smith. The solution must ensure that Ben Smith can be identified by all the entries.
Solution: You modify the custom time interval for the training phase of App1.
Does this meet the goal?

A. Yes

B. No

 


Suggested Answer: B

Instead, use a LargePersonGroup. LargePersonGroup and LargeFaceList are collectively referred to as large-scale operations. LargePersonGroup can contain up to 1 million persons, each with a maximum of 248 faces. LargeFaceList can contain up to 1 million faces. The large-scale operations are similar to the conventional PersonGroup and FaceList but have some differences because of the new architecture.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/face/face-api-how-to-topics/how-to-use-large-scale

Question 32

Your company plans to implement an AI solution that will analyze data from IoT devices.
Data from the devices will be analyzed in real time. The results of the analysis will be stored in a SQL database.
You need to recommend a data processing solution that uses the Transact-SQL language.
Which data processing solution should you recommend?

A. Azure Stream Analytics

B. SQL Server Integration Services (SSIS)

C. Azure Event Hubs

D. Azure Machine Learning

 


Suggested Answer: A

References:
https://www.linkedin.com/pulse/getting-started-azure-iot-services-stream-analytics-rob-tiffany

Question 33

A data scientist deploys a deep learning model on an Fsv2 virtual machine.
Data analysis is slow.
You need to recommend which virtual machine series the data scientist must use to ensure that data analysis occurs as quickly as possible.
Which series should you recommend?

A. ND

B. B

C. DC

D. Ev3

 


Suggested Answer: A

The N-series is a family of Azure Virtual Machines with GPU capabilities. GPUs are ideal for compute and graphics-intensive workloads, helping customers to fuel innovation through scenarios like high-end remote visualisation, deep learning and predictive analytics.
The ND-series is focused on training and inference scenarios for deep learning. It uses the NVIDIA Tesla P40 GPUs. The latest version – NDv2 – features the
NVIDIA Tesla V100 GPUs.
References:
https://azure.microsoft.com/en-in/pricing/details/virtual-machines/series/

Question 34

You have an app that records meetings by using speech-to-text capabilities from the Speech Services API.
You discover that when action items are listed at the end of each meeting, the app transcribes the text inaccurately when industry terms are used.
You need to improve the accuracy of the meeting records.
What should you do?

A. Add a phrase list

B. Create a custom wake word

C. Parse the text by using the Language Understanding (LUIS) API

D. Train a custom model by using Custom Translator

 


Suggested Answer: A

Phrase Lists are used to identify known phrases in audio data, like a person’s name or a specific location. By providing a list of phrases, you improve the accuracy of speech recognition.
Reference:
https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/get-started-speech-to-text?tabs=script%2Cbrowser%2Cwindowsinstall&pivots=programming-language-csharp
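In the Speech SDK a phrase list is attached to a recognizer via `PhraseListGrammar`, which biases recognition toward the listed terms. The local sketch below only illustrates that biasing idea with a toy re-scorer; the candidate scores, boost value, and term list are invented for the example and do not reflect the service's internal scoring.

```python
# Illustrative only: a toy re-scorer that prefers transcription candidates
# containing known industry phrases, mimicking the effect of a phrase list.
# The phrase set, scores, and boost value are made up for this sketch.

PHRASES = {"kubernetes", "azure cognitive services"}

def rescore(candidates: list[tuple[str, float]], boost: float = 0.2) -> str:
    """Add a boost for each known phrase found in a candidate transcript,
    then return the highest-scoring candidate."""
    rescored = []
    for text, score in candidates:
        bonus = sum(boost for p in PHRASES if p in text.lower())
        rescored.append((text, score + bonus))
    return max(rescored, key=lambda c: c[1])[0]

best = rescore([
    ("deploy to communities", 0.55),  # acoustically plausible mishearing
    ("deploy to Kubernetes", 0.50),   # contains a known industry term
])
print(best)  # -> deploy to Kubernetes
```

Without the phrase boost the mishearing would win; with it, the known term is recognized, which is the behavior the question is after.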

Question 35

Your company plans to develop a mobile app to provide meeting transcripts by using speech-to-text. Audio from the meetings will be streamed to provide real-time transcription.
You need to recommend which task each meeting participant must perform to ensure that the transcripts of the meetings can identify all participants.
Which task should you recommend?

A. Record the meeting as an MP4.

B. Create a voice signature.

C. Sign up for Azure Speech Services.

D. Sign up as a guest in Azure Active Directory (Azure AD)

 


Suggested Answer: B

The first step is to create voice signatures for the conversation participants. Creating voice signatures is required for efficient speaker identification.
Note: In addition to the standard baseline model used by the Speech Services, you can customize models to your needs with available data, to overcome speech recognition barriers such as speaking style, vocabulary and background noise.
References:
https://docs.microsoft.com/bs-latn-ba/azure/cognitive-services/speech-service/how-to-use-conversation-transcription-service

Question 36

Your company's developers have created an Azure Data Factory pipeline that moves data from an on-premises server to Azure Storage. The pipeline consumes
Azure Cognitive Services APIs.
You need to deploy the pipeline. Your solution must minimize custom code.
You use Self-hosted Integration Runtime to move data to the cloud and Azure Logic Apps to consume Cognitive Services APIs.
Does this action accomplish your objective?

A. Yes, it does

B. No, it does not

 


Suggested Answer: A

A self-hosted integration runtime is capable of running copy activities between cloud data stores and a data store in a private network.
Azure Logic Apps helps you orchestrate and integrate different services by providing 100+ ready-to-use connectors, ranging from on-premises SQL Server or SAP to Microsoft Cognitive Services.
Reference:
https://docs.microsoft.com/en-us/azure/data-factory/concepts-integration-runtime
https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-examples-and-scenarios

Question 37

You are developing a Microsoft Bot Framework application. The application consumes structured NoSQL data that must be stored in the cloud.
You implement Azure Blob storage for the application. You want access to the blob store to be controlled by using a role.
You implement Azure Active Directory (Azure AD) integration on the storage account.
Does this action accomplish your objective?

A. Yes, it does

B. No, it does not

 


Suggested Answer: A

Azure Active Directory (Azure AD) integration for blobs and queues provides Azure role-based access control (Azure RBAC) for control over a client’s access to resources in a storage account.
Reference:
https://docs.microsoft.com/en-us/azure/storage/common/storage-auth

Question 38

Your company is developing an AI solution that will identify inappropriate text in multiple languages.
You need to implement a Cognitive Services API that meets this requirement.
You use the Azure Content Moderator API to identify inappropriate text.
Does this action accomplish your objective?

A. Yes, it does

B. No, it does not

 


Suggested Answer: A

The Azure Content Moderator API is a cognitive service that checks text, image, and video content for material that is potentially offensive, risky, or otherwise undesirable. When such material is found, the service applies appropriate labels (flags) to the content. Your app can then handle flagged content in order to comply with regulations or maintain the intended environment for users.
Reference:
https://docs.microsoft.com/bs-latn-ba/azure/cognitive-services/content-moderator/overview

Question 39

HOTSPOT -
You plan to create an intelligent bot to handle internal user chats to the help desk of your company. The bot has the following requirements:
✑ Must be able to interpret what a user means.
✑ Must be able to perform multiple tasks for a user.
✑ Must be able to answer questions from an existing knowledge base.
 Image
You need to recommend which solutions meet the requirements.
Which solution should you recommend for each requirement? To answer, drag the appropriate solutions to the correct requirements. Each solution may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Hot Area:
 Image

 


Suggested Answer:
Correct Answer Image

Box 1: The Language Understanding (LUIS) service
Language Understanding (LUIS) is a cloud-based API service that applies custom machine-learning intelligence to a user’s conversational, natural language text to predict overall meaning, and pull out relevant, detailed information.
Box 2: Text Analytics API –
The Text Analytics API is a cloud-based service that provides advanced natural language processing over raw text, and includes four main functions: sentiment analysis, key phrase extraction, named entity recognition, and language detection.
Box 3: The QnA Maker service –
QnA Maker is a cloud-based Natural Language Processing (NLP) service that easily creates a natural conversational layer over your data. It can be used to find the most appropriate answer for any given natural language input, from your custom knowledge base (KB) of information.
Incorrect Answers:
Dispatch tool library:
If a bot uses multiple LUIS models and QnA Maker knowledge bases, you can use the Dispatch tool to determine which LUIS model or QnA Maker knowledge base best matches the user input. The Dispatch tool does this by creating a single LUIS app to route user input to the correct model.
Reference:
https://docs.microsoft.com/en-us/azure/bot-service/bot-builder-tutorial-dispatch
https://docs.microsoft.com/en-us/azure/cognitive-services/qnamaker/overview/overview

Question 40

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You create several API models in Azure Machine Learning Studio.
You deploy the models to a production environment.
You need to monitor the compute performance of the models.
Solution: You create environment files.
Does this meet the goal?

A. Yes

B. No

 


Suggested Answer: B

You need to enable Model data collection.
References:
https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-enable-data-collection

Question 41

You have deployed several Azure IoT Edge devices for an AI solution. The Azure IoT Edge devices generate measurement data from temperature sensors.
You need a solution to process the sensor data. Your solution must be able to write configuration changes back to the devices.
You make use of Microsoft Azure IoT Hub.
Does this action accomplish your objective?

A. Yes, it does

B. No, it does not

 


Suggested Answer: A

Reference:
https://azure.microsoft.com/en-us/resources/samples/functions-js-iot-hub-processing/

Question 42

You are designing a solution that will integrate the Bing Web Search API and will return a JSON response. The development team at your company uses C# as its primary development language.
You provide developers with the Bing endpoint.
Which additional component do the developers need to prepare and to retrieve data by using an API call?

A. the subscription ID

B. the API key

C. a query

D. the resource group ID

 


Suggested Answer: C

The Bing Web Search SDK makes it easy to integrate Bing Web Search into your C# application. You instantiate a client, send a request, and receive a response.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/bing-web-search/web-search-sdk-quickstart

Question 43

Your company has 1,000 AI developers who are responsible for provisioning environments in Azure.
You need to control the type, size, and location of the resources that the developers can provision.
What should you use?

A. Azure Key Vault

B. Azure service principals

C. Azure managed identities

D. Azure Security Center

E. Azure Policy

 


Suggested Answer: E

Azure Policy lets you define and assign policies that control which resource types, SKUs (sizes), and locations are allowed within a subscription or resource group, which is exactly what is needed to govern what the 1,000 developers can provision.
Incorrect Answers:
B: A service principal is a credential that an application uses to authenticate and receive delegated permissions; it does not restrict the type, size, or location of the resources that can be created.
References:
https://docs.microsoft.com/en-us/azure/governance/policy/overview

Question 44

DRAG DROP -
You develop a custom application that uses a token to connect to Azure Cognitive Services resources.
A new security policy requires that all access keys are changed every 30 days.
You need to recommend a solution to implement the security policy.
Which three actions should you recommend be performed every 30 days? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Select and Place:
 Image

 


Suggested Answer:
Correct Answer Image

Step 1: Generate new keys in the Cognitive Service resources
Step 2: Retrieve a token from the Cognitive Services endpoint
Step 3: Update the custom application to use the new authorization
Each request to an Azure Cognitive Service must include an authentication header. This header passes along a subscription key or access token, which is used to validate your subscription for a service or group of services.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/authentication
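The three rotation steps can be sketched as request construction. The token endpoint form shown (`/sts/v1.0/issueToken` with the `Ocp-Apim-Subscription-Key` header) follows the authentication documentation; the region, key value, and `use_new_key` helper are placeholders for the example, and no HTTP call is made.

```python
# Sketch of the 30-day rotation flow: after new keys are generated for the
# Cognitive Services resource (step 1), the app exchanges a new key for a
# token (step 2) and switches to the new authorization (step 3).
# REGION and the key values are placeholders; nothing is actually sent.

REGION = "westus"  # hypothetical region

def build_issue_token_request(subscription_key: str) -> dict:
    """Request shape for the Cognitive Services token endpoint (step 2)."""
    return {
        "method": "POST",
        "url": f"https://{REGION}.api.cognitive.microsoft.com/sts/v1.0/issueToken",
        "headers": {"Ocp-Apim-Subscription-Key": subscription_key},
    }

def use_new_key(app_config: dict, new_key: str) -> dict:
    """Step 3: point the custom application at the new authorization."""
    app_config = dict(app_config)
    app_config["token_request"] = build_issue_token_request(new_key)
    return app_config

config = use_new_key({"name": "custom-app"}, "<regenerated-key>")
print(config["token_request"]["url"])
```

Because tokens issued this way are short-lived, only the key exchange needs to change on each 30-day rotation; the rest of the application keeps requesting tokens as before.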

Question 45

DRAG DROP -
You are designing an Azure Batch AI solution that will be used to train many different Azure Machine Learning models. The solution will perform the following:
✑ Image recognition
✑ Deep learning that uses convolutional neural networks.
You need to select a compute infrastructure for each model. The solution must minimize the processing time.
What should you use for each model? To answer, drag the appropriate compute infrastructures to the correct models. Each compute infrastructure may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place:
 Image

 


Suggested Answer:
Correct Answer Image

References:
https://docs.microsoft.com/en-us/azure/virtual-machines/windows/sizes-gpu

Question 46

You need to build a solution to monitor Twitter. The solution must meet the following requirements:
✑ Send an email message to the marketing department when negative Twitter messages are detected.
✑ Run sentiment analysis on Twitter messages that mention specific tags.
✑ Use the least amount of custom code possible.
Which two services should you include in the solution? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. Azure Databricks

B. Azure Stream Analytics

C. Azure Functions

D. Azure Cognitive Services

E. Azure Logic Apps

 


Suggested Answer: BE

References:
https://docs.microsoft.com/en-us/azure/stream-analytics/streaming-technologies
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-twitter-sentiment-analysis-trends

Question 47

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are developing an application that uses an Azure Kubernetes Service (AKS) cluster.
You are troubleshooting a node issue.
You need to connect to an AKS node by using SSH.
Solution: You change the permissions of the AKS resource group, and then you create an SSH connection.
Does this meet the goal?

A. Yes

B. No

 


Suggested Answer: B

Instead add an SSH key to the node, and then you create an SSH connection.
References:
https://docs.microsoft.com/en-us/azure/aks/ssh

Question 48

You have deployed 1,000 sensors for an AI application that you are developing. The sensors generate large amounts of data that is ingested on an hourly basis.
You want your application to analyze the data generated by the sensors in real-time.
Which of the following actions should you take?

A. Make use of Azure Kubernetes Service (AKS)

B. Make use of Azure Cosmos DB

C. Make use of an Azure HDInsight Hadoop cluster

D. Make use of Azure Data Factory

 


Suggested Answer: C

Azure HDInsight makes it easy, fast, and cost-effective to process massive amounts of data.
You can use HDInsight to process streaming data that’s received in real time from a variety of devices.
Reference:
https://docs.microsoft.com/en-us/azure/hdinsight/hadoop/apache-hadoop-introduction

Question 49

You have a Face API solution that updates in real time. A pilot of the solution runs successfully on a small dataset.
When you attempt to use the solution on a larger dataset that continually changes, the performance degrades, slowing how long it takes to recognize existing faces.
You need to recommend changes to reduce the time it takes to recognize existing faces without increasing costs.
What should you recommend?

A. Change the solution to use the Computer Vision API instead of the Face API.

B. Separate training into an independent pipeline and schedule the pipeline to run daily.

C. Change the solution to use the Bing Image Search API instead of the Face API.

D. Distribute the face recognition inference process across many Azure Cognitive Services instances.

 


Suggested Answer: B

Incorrect Answers:
A: The purpose of Computer Vision is to inspect each image associated with an incoming article to (1) scrape out written words from the image and (2) determine what types of objects are present in the image.
C: The Bing API provides an experience similar to Bing.com/search by returning search results that Bing determines are relevant to a user’s query. The results include Web pages and may also include images, videos, and more.
D: That would increase cost.
References:
https://github.com/Azure/cognitive-services

Question 50

You are developing a bot for an ecommerce application. The bot will support five languages.
The bot will use Language Understanding (LUIS) to detect the language of the customer, and QnA Maker to answer common customer questions. LUIS supports all the languages.
You need to determine the minimum number of Azure resources that you must create for the bot.
You create one instance of QnA Maker and five instances of Language Understanding (LUIS).
Does this action accomplish your objective?

A. Yes, it does

B. No, it does not

 


Suggested Answer: B

You need to have a new QnA Maker resource for each language.
If LUIS supports all the languages, you develop a LUIS app for each language. Each LUIS app has a unique app ID, and endpoint log. If you need to provide language understanding for a language LUIS does not support, you can use Microsoft Translator API to translate the utterance into a supported language, submit the utterance to the LUIS endpoint, and receive the resulting scores.
Reference:
https://docs.microsoft.com/en-us/azure/cognitive-services/qnamaker/overview/language-support
https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-language-support
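The per-language design described above can be sketched as simple routing: one LUIS app ID per language, with a translation fallback for languages LUIS does not support. The app IDs and the supported-language set below are hypothetical.

```python
# Illustrative routing for a multi-language bot: one LUIS app per supported
# language, translating unsupported input to a fallback language first.
# The language codes and app IDs are placeholders for this sketch.

LUIS_APPS = {  # language code -> LUIS app ID (hypothetical values)
    "en": "app-en", "fr": "app-fr", "de": "app-de",
    "es": "app-es", "it": "app-it",
}
FALLBACK_LANGUAGE = "en"

def pick_luis_app(language: str) -> tuple[str, bool]:
    """Return (app_id, needs_translation). Unsupported languages are
    translated to the fallback language before hitting LUIS."""
    if language in LUIS_APPS:
        return LUIS_APPS[language], False
    return LUIS_APPS[FALLBACK_LANGUAGE], True

print(pick_luis_app("fr"))  # -> ('app-fr', False)
print(pick_luis_app("ja"))  # -> ('app-en', True)
```

The same one-resource-per-language pattern applies on the QnA Maker side, which is why a single QnA Maker instance does not meet the goal.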

Free Access Full AI-100 Practice Questions Free

Want more hands-on practice? Click here to access the full bank of AI-100 practice questions free and reinforce your understanding of all exam objectives.

We update our question sets regularly, so check back often for new and relevant content.

Good luck with your AI-100 certification journey!
