Practice Test Free
AI-100 Practice Test Free

AI-100 Practice Test Free – 50 Real Exam Questions to Boost Your Confidence

Preparing for the AI-100 exam? Start with our AI-100 Practice Test Free – a set of 50 high-quality, exam-style questions crafted to help you assess your knowledge and improve your chances of passing on the first try.

Taking a free AI-100 practice test is one of the smartest ways to:

  • Get familiar with the real exam format and question types
  • Evaluate your strengths and spot knowledge gaps
  • Gain the confidence you need to succeed on exam day

Below, you will find 50 free AI-100 practice questions to help you prepare for the exam. These questions are designed to reflect the real exam's structure and difficulty level. Click on each question to explore the details.

Question 1

You are designing an AI application that will perform real-time processing by using Microsoft Azure Stream Analytics.
You need to identify the valid outputs of a Stream Analytics job.
What are three possible outputs? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.

A. A Hive table in Azure HDInsight

B. Azure SQL Database

C. Azure Cosmos DB

D. Azure Blob storage

E. Azure Redis Cache

 


Suggested Answer: BCD

References:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-define-outputs
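The answer above can be sanity-checked in a few lines. This is a minimal sketch: the set of supported sinks is abbreviated from the referenced documentation page, and the letter-to-choice mapping mirrors the question.

```python
# Documented output sinks for a Stream Analytics job (abbreviated list),
# used to check each answer choice from the question above.
supported_outputs = {
    "Azure SQL Database",
    "Azure Cosmos DB",
    "Azure Blob storage",
    "Event Hubs",
    "Power BI",
    "Table storage",
    "Service Bus queues",
}

choices = {
    "A": "A Hive table in Azure HDInsight",
    "B": "Azure SQL Database",
    "C": "Azure Cosmos DB",
    "D": "Azure Blob storage",
    "E": "Azure Redis Cache",
}

valid = sorted(k for k, v in choices.items() if v in supported_outputs)
print(valid)  # ['B', 'C', 'D']
```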

Question 2

Your company has deployed 1,000 Internet-connected sensors for an AI application. The sensors generate large amounts of new data on an hourly basis.
The data generated by the sensors are currently stored on an on-premises server.
You must meet the following requirements:
Move the data to Azure so that you can perform advanced analytics on the data.
Ensure data persistence.
Keep costs at a minimum.
Which of the following actions should you take?

A. Make use of Azure Blob storage

B. Make use of Azure Cosmos DB

C. Make use of Azure Databricks

D. Make use of Azure Table storage

 


Suggested Answer: A

Reference:
https://docs.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/data-storage

Question 3

You need to build a reputation monitoring solution that reviews Twitter activity about your company. The solution must identify negative tweets and tweets that contain inappropriate images.
You plan to use Azure Logic Apps to build the solution.
Which two additional Azure services should you include in the solution? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. Computer Vision

B. Azure Blueprint

C. Content Moderator

D. Text Analytics

E. Azure Machine Learning Service

F. Form Recognizer

 


Suggested Answer: CD

C: You can filter your tweets using Azure Logic Apps & Content Moderation. Azure Content Moderator is a cognitive service that checks text, image, and video content for material that is potentially offensive, risky, or otherwise undesirable. When this material is found, the service applies appropriate labels (flags) to the content. Your app can then handle flagged content in order to comply with regulations or maintain the intended environment for users.
D: You can write an application so that when a user tweets with configured Twitter Hashtag, Logic App gets triggered and passed to Cognitive Text Analytics
Connector for detecting the sentiments of the tweet (text). If the tweeted text is found to be harsh or with bad or abusive language, the tweet can be handled appropriately.
Reference:
https://docs.microsoft.com/en-us/azure/cognitive-services/content-moderator/overview
https://www.c-sharpcorner.com/article/role-of-text-analytics-service-as-a-connector-in-azure-logic-apps/
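To make the Text Analytics step concrete, here is a minimal sketch of the JSON document payload a sentiment call consumes (the same shape the Logic Apps connector submits on your behalf). The tweet texts are illustrative, not taken from the question.

```python
# Build a Text Analytics sentiment request body from a batch of tweets.
# Each document needs a unique id, a language, and the text to score.
import json

def build_sentiment_request(tweets):
    """Shape a list of tweet strings into a Text Analytics documents payload."""
    return {
        "documents": [
            {"id": str(i), "language": "en", "text": text}
            for i, text in enumerate(tweets, start=1)
        ]
    }

body = build_sentiment_request(
    ["Great support from @Contoso!", "Worst product ever."]
)
print(json.dumps(body, indent=2))
```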

Question 4

HOTSPOT -
You have an app that uses the Language Understanding (LUIS) API as shown in the following exhibit.
 Image
Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic.
NOTE: Each correct selection is worth one point.
Hot Area:
 Image

 


Suggested Answer:
Correct Answer Image

Box 1: train –
Utterances are input from the user that your app needs to interpret. To train LUIS to extract intents and entities from them, it’s important to capture a variety of different example utterances for each intent. Active learning, or the process of continuing to train on new utterances, is essential to machine-learned intelligence that LUIS provides.
Box 2: creating intents –
Each intent needs to have example utterances, at least 15. If you have an intent that does not have any example utterances, you will not be able to train LUIS. If you have an intent with one or very few example utterances, LUIS will not accurately predict the intent.
Box 3: never published –
In each iteration of the model, do not add a large quantity of utterances. Add utterances in quantities of 15. Train, publish, and test again.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-concept-utterance
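The "at least 15 example utterances per intent" rule from the explanation is easy to enforce before you train. A minimal sketch, with made-up intent names and sample utterances:

```python
# Flag LUIS intents that do not yet have enough example utterances to
# train accurately (the 15-utterance threshold comes from the guidance above).
MIN_UTTERANCES = 15

def intents_below_minimum(intents):
    """Return the intent names that still need more example utterances."""
    return [name for name, utterances in intents.items()
            if len(utterances) < MIN_UTTERANCES]

intents = {
    "BookMeeting": [f"schedule a meeting {i}" for i in range(20)],
    "CancelMeeting": ["cancel my 3pm"],  # too few -> predictions will be poor
}
print(intents_below_minimum(intents))  # ['CancelMeeting']
```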

Question 5

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure SQL database, an Azure Data Lake Storage Gen 2 account, and an API developed by using Azure Machine Learning Studio.
You need to ingest data once daily from the database, score each row by using the API, and write the data to the storage account.
Solution: You create an Azure Data Factory pipeline that contains the Machine Learning Batch Execution activity.
Does this meet the goal?

A. Yes

B. No

 


Suggested Answer: A

Using the Batch Execution activity in an Azure Data Factory pipeline, you can invoke an Azure Machine Learning Studio (classic) web service to make predictions on the data in batch.
Reference:
https://docs.microsoft.com/en-us/azure/data-factory/transform-data-using-machine-learning
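As a sketch of what the Data Factory pipeline contains: the activity type for the Machine Learning Studio (classic) batch scoring step is `AzureMLBatchExecution`, and a daily schedule trigger on the pipeline handles the once-per-day ingestion. The activity and linked-service names below are illustrative.

```python
# Minimal sketch of the Data Factory activity definition that invokes an
# ML Studio (classic) web service in batch. Names are placeholders.
activity = {
    "name": "ScoreDailyRows",
    "type": "AzureMLBatchExecution",  # Machine Learning Batch Execution activity
    "linkedServiceName": {
        "referenceName": "MLStudioWebService",
        "type": "LinkedServiceReference",
    },
}
# A daily schedule trigger on the pipeline meets the "once daily" requirement.
print(activity["type"])
```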

Question 6

You are developing a Microsoft Bot Framework application. The application consumes structured NoSQL data that must be stored in the cloud.
You implement Azure Blob storage for the application. You want access to the blob store to be controlled by using a role.
You implement Azure Active Directory (Azure AD) integration on the storage account.
Does this action accomplish your objective?

A. Yes, it does

B. No, it does not

 


Suggested Answer: A

Azure Active Directory (Azure AD) integration for blobs and queues provides Azure role-based access control (Azure RBAC) for control over a client’s access to resources in a storage account.
Reference:
https://docs.microsoft.com/en-us/azure/storage/common/storage-auth

Question 7

HOTSPOT -
You plan to deploy the Text Analytics and Computer Vision services. The Azure Cognitive Services will be deployed to the West US and East Europe Azure regions.
You need to identify the minimum number of service endpoints and API keys required for the planned deployment.
What should you identify? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
 Image

 


Suggested Answer:
Correct Answer Image

Box 1: 2 –
After creating a Cognitive Service resource in the Azure portal, you’ll get an endpoint and a key for authenticating your applications. You can access Azure
Cognitive Services through two different resources: A multi-service resource, or a single-service one.
Multi-service resource: Access multiple Azure Cognitive Services with a single key and endpoint.
Note: You need a key and endpoint for a Text Analytics resource. Azure Cognitive Services are represented by Azure resources that you subscribe to.
Each request must include your access key and an HTTP endpoint. The endpoint specifies the region you chose during sign up, the service URL, and a resource used on the request.
Box 2: 2 –
You need at least one key per region.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/cognitive-services-apis-create-account
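The arithmetic behind both boxes follows from the multi-service resource note above: one multi-service Cognitive Services resource per region covers both Text Analytics and Computer Vision with a single endpoint and key, so the minimum count scales with the number of regions.

```python
# Minimum endpoints and keys for the planned deployment, assuming one
# multi-service Cognitive Services resource per region (reasoning above).
regions = ["West US", "East Europe"]

endpoints = len(regions)   # one multi-service endpoint per region
api_keys = len(regions)    # at least one key per region

print(endpoints, api_keys)  # 2 2
```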

Question 8

HOTSPOT -

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You plan to deploy an application that will perform image recognition. The application will store image data in two Azure Blob storage stores named Blob1 and
Blob2.
You need to recommend a security solution that meets the following requirements:
✑ Access to Blob1 must be controlled by using a role.
✑ Access to Blob2 must be time-limited and constrained to specific operations.
What should you recommend using to control access to each blob store? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
Image

 


Suggested Answer:
Correct Answer Image

References:
https://docs.microsoft.com/en-us/azure/storage/common/storage-auth

Question 9

You have an existing Language Understanding (LUIS) model for an internal bot.
You need to recommend a solution to add a meeting reminder functionality to the bot by using a prebuilt model. The solution must minimize the size of the model.
Which component of LUIS should you recommend?

A. domain

B. intents

C. entities

 


Suggested Answer: C

LUIS includes a set of prebuilt entities for recognizing common types of information, like dates, times, numbers, measurements, and currency. Prebuilt entity support varies by the culture of your LUIS app.
Note: LUIS provides three types of prebuilt models. Each model can be added to your app at any time.
Model type: Includes –
✑ Domain: Intents, utterances, entities
✑ Intents: Intents, utterances
✑ Entities: Entities only
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-concept-prebuilt-model

Question 10

You are designing an AI solution that will analyze millions of pictures by using Azure HDInsight Hadoop cluster.
You need to recommend a solution for storing the pictures. The solution must minimize costs.
Which storage solution should you recommend?

A. Azure Table storage

B. Azure File Storage

C. Azure Data Lake Storage Gen2

D. Azure Data Lake Storage Gen1

 


Suggested Answer: D

Azure Data Lake Storage Gen1 is adequate and less expensive compared to Gen2.
References:
https://visualbi.com/blogs/microsoft/introduction-azure-data-lake-gen2/

Question 11

Your company plans to create a mobile app that will be used by employees to query the employee handbook.
You need to ensure that the employees can query the handbook by typing or by using speech.
Which core component should you use for the app?

A. Language Understanding (LUIS)

B. QnA Maker

C. Text Analytics

D. Azure Search

 


Suggested Answer: D

Azure Cognitive Search (formerly known as “Azure Search”) is a search-as-a-service cloud solution that gives developers APIs and tools for adding a rich search experience over private, heterogeneous content in web, mobile, and enterprise applications. Your code or a tool invokes data ingestion (indexing) to create and load an index. Optionally, you can add cognitive skills to apply AI processes during indexing. Doing so can add new information and structures useful for search and other scenarios.
Incorrect Answers:
B: QnA Maker is a cloud-based API service that lets you create a conversational question-and-answer layer over your existing data. Use it to build a knowledge base by extracting questions and answers from your semi-structured content, including FAQs, manuals, and documents. Answer users’ questions with the best answers from the QnAs in your knowledge base, automatically.
References:
https://docs.microsoft.com/en-us/azure/search/search-what-is-azure-search

Question 12

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are developing an application that uses an Azure Kubernetes Service (AKS) cluster.
You are troubleshooting a node issue.
You need to connect to an AKS node by using SSH.
Solution: You add an SSH key to the node, and then you create an SSH connection.
Does this meet the goal?

A. Yes

B. No

 


Suggested Answer: A

By default, SSH keys are generated when you create an AKS cluster. If you did not specify your own SSH keys when you created your AKS cluster, add your public SSH keys to the AKS nodes.
You also need to create an SSH connection to the AKS node.
References:
https://docs.microsoft.com/en-us/azure/aks/ssh

Question 13

You are developing a Microsoft Bot Framework app that consumes structured NoSQL data.
The app has the following data storage requirements:
Data must be stored in Azure.
Data persistence must be ensured.
You want to keep costs at a minimum.
Which of the following actions should you take?

A. Make use of Azure Blob storage

B. Make use of Azure Cosmos DB

C. Make use of Azure Databricks

D. Make use of Azure Table storage

 


Suggested Answer: D

Table Storage is a NoSQL key-value store for rapid development using massive semi-structured datasets.
You can develop applications on Cosmos DB using popular NoSQL APIs.
Both services have a different scenario and pricing model.
Azure Table storage is aimed at high capacity in a single region (with an optional read-only secondary region but no failover), indexing by PartitionKey/RowKey, and storage-optimized pricing. The Azure Cosmos DB Table API aims for high throughput (single-digit-millisecond latency), global distribution with multiple failover regions, SLA-backed predictable performance with automatic indexing of every attribute, and a pricing model focused on throughput.
Reference:
https://db-engines.com/en/system/Microsoft+Azure+Cosmos+DB%3BMicrosoft+Azure+Table+Storage
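Because Table storage is a key-value store, every entity the bot persists needs a `PartitionKey` and a `RowKey`. A minimal sketch of how the bot's structured NoSQL state might be shaped (property names are illustrative, not from the question):

```python
# An Azure Table storage entity is a flat property bag whose primary key
# is the (PartitionKey, RowKey) pair.
entity = {
    "PartitionKey": "user-1234",       # groups related entities together
    "RowKey": "conversation-5678",     # unique within the partition
    "LastIntent": "BookMeeting",
    "TurnCount": 7,
}

print(entity["PartitionKey"], entity["RowKey"])
```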

Question 14

DRAG DROP -
You are developing an application for photo classification. Users of the application will include minors. The users will upload photos to the application. The photos will be stored for model training purposes. All the photos must be considered appropriate for minors.
You need to recommend an architecture for the application.
Which Azure services should you recommend using in the architecture? To answer, drag the appropriate services to the correct targets. Each service may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place:
 Image

 


Suggested Answer:
Correct Answer Image

 

Question 15

You plan to design an application that will use data from Azure Data Lake and perform sentiment analysis by using Azure Machine Learning algorithms.
The developers of the application use a mix of Windows- and Linux-based environments. The developers contribute to shared GitHub repositories.
You need all the developers to use the same tool to develop the application.
What is the best tool to use? More than one answer choice may achieve the goal.

A. Microsoft Visual Studio Code

B. Azure Notebooks

C. Azure Machine Learning Studio

D. Microsoft Visual Studio

 


Suggested Answer: C

References:
https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/machine-learning/studio/algorithm-choice.md

Question 16

Your company develops an API application that is orchestrated by using Kubernetes.
You need to deploy the application.
Which three actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. Create a Kubernetes cluster.

B. Create an Azure Container Registry instance.

C. Create a container image file.

D. Create a Web App for Containers.

E. Create an Azure container instance.

 


Suggested Answer: ABC

References:
https://docs.microsoft.com/en-us/azure/aks/tutorial-kubernetes-prepare-app

Question 17

Your company plans to monitor Twitter hashtags and then build a graph of connected people and places that contains the associated sentiment.
The monitored hashtags use several languages, but the graph will be displayed in English.
You need to recommend the required Azure Cognitive Services endpoints for the planned graph.
Which Cognitive Services endpoints should you recommend?

A. Language Detection, Content Moderator, and Key Phrase Extraction

B. Translator Text, Content Moderator, and Key Phrase Extraction

C. Language Detection, Sentiment Analysis, and Key Phrase Extraction

D. Translator Text, Sentiment Analysis, and Named Entity Recognition

 


Suggested Answer: C

Sentiment analysis, which is also called opinion mining, uses social media analytics tools to determine attitudes toward a product or idea.
Translator Text: Translate text in real time across more than 60 languages, powered by the latest innovations in machine translation.
The Key Phrase Extraction skill evaluates unstructured text, and for each record, returns a list of key phrases. This skill uses the machine learning models provided by Text Analytics in Cognitive Services.
This capability is useful if you need to quickly identify the main talking points in the record. For example, given input text “The food was delicious and there were wonderful staff”, the service returns “food” and “wonderful staff”.
Reference:
https://docs.microsoft.com/en-us/azure/cognitive-services/text-analytics/how-tos/text-analytics-how-to-entity-linking
https://docs.microsoft.com/en-us/azure/search/cognitive-search-skill-keyphrases
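The three endpoints in answer C form a simple pipeline: detect the tweet's language, score its sentiment, then extract the key phrases that become graph nodes. A sketch of that flow, with stub functions standing in for the three Cognitive Services calls:

```python
# Pipeline implied by the answer: Language Detection -> Sentiment Analysis
# -> Key Phrase Extraction. The stubs below stand in for the real endpoints.
def detect_language(text):
    return "en"  # stand-in for the Language Detection endpoint

def score_sentiment(text, language):
    return 0.9   # stand-in for the Sentiment Analysis endpoint

def key_phrases(text, language):
    return ["food", "wonderful staff"]  # stand-in for Key Phrase Extraction

def process_tweet(text):
    lang = detect_language(text)
    return {
        "language": lang,
        "sentiment": score_sentiment(text, lang),
        "phrases": key_phrases(text, lang),
    }

result = process_tweet("The food was delicious and there were wonderful staff")
print(result["phrases"])  # ['food', 'wonderful staff']
```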

Question 18

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure SQL database, an Azure Data Lake Storage Gen 2 account, and an API developed by using Azure Machine Learning Studio.
You need to ingest data once daily from the database, score each row by using the API, and write the data to the storage account.
Solution: You create a scheduled Jupyter Notebook in Azure Databricks.
Does this meet the goal?

A. Yes

B. No

 


Suggested Answer: B

We need to schedule the job in Azure Data Factory.

Question 19

You are using Azure Cognitive Services to create an interactive AI application that will be deployed for a world-wide audience.
You want the app to support multiple languages, including English, French, Spanish, Portuguese, and German.
Which of the following actions should you take?

A. Make use of Text Analytics.

B. Make use of Content Moderator.

C. Make use of QnA Maker.

D. Make use of Language API.

 


Suggested Answer: A

The Language Detection feature of the Azure Text Analytics REST API evaluates text input for each document and returns language identifiers with a score that indicates the strength of the analysis.
Reference:
https://docs.microsoft.com/en-us/azure/cognitive-services/text-analytics/how-tos/text-analytics-how-to-language-detection

Question 20

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an app named App1 that uses the Face API.
App1 contains several PersonGroup objects.
You discover that a PersonGroup object for an individual named Ben Smith cannot accept additional entries. The PersonGroup object for Ben Smith contains
10,000 entries.
You need to ensure that additional entries can be added to the PersonGroup object for Ben Smith. The solution must ensure that Ben Smith can be identified by all the entries.
Solution: You create a second PersonGroup object for Ben Smith.
Does this meet the goal?

A. Yes

B. No

 


Suggested Answer: B

Instead, use a LargePersonGroup. LargePersonGroup and LargeFaceList are collectively referred to as large-scale operations. LargePersonGroup can contain up to 1 million persons, each with a maximum of 248 faces. LargeFaceList can contain up to 1 million faces. The large-scale operations are similar to the conventional PersonGroup and FaceList but have some differences because of the new architecture.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/face/face-api-how-to-topics/how-to-use-large-scale
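The capacity figures quoted in the explanation show why migrating to a LargePersonGroup, rather than splitting entries across a second PersonGroup, is the supported fix:

```python
# Limits from the explanation above: Ben Smith's PersonGroup object is full,
# but a LargePersonGroup scales well past it.
CURRENT_ENTRIES = 10_000                 # the full PersonGroup in the question
LARGE_PERSON_GROUP_MAX_PERSONS = 1_000_000
MAX_FACES_PER_PERSON = 248

print(CURRENT_ENTRIES < LARGE_PERSON_GROUP_MAX_PERSONS)  # True
```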

Question 21

HOTSPOT -
You are developing an application that will perform clickstream analysis. The application will ingest and analyze millions of messages in real time.
You need to ensure that communication between the application and devices is bidirectional.
What should you use for data ingestion and stream processing? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
 Image

 


Suggested Answer:
Correct Answer Image

Box 1: Azure IoT Hub –
Azure IoT Hub is the cloud gateway that connects IoT devices to gather data and drive business insights and automation. In addition, IoT Hub includes features that enrich the relationship between your devices and your backend systems. Bi-directional communication capabilities mean that while you receive data from devices you can also send commands and policies back to devices.
Note on why not Azure Event Hubs: An Azure IoT Hub contains an Event Hub and hence essentially is an Event Hub plus additional features. An important additional feature is that an Event Hub can only receive messages, whereas an IoT Hub additionally can also send messages to individual devices. Further, an
Event Hub has access security at the hub level, whereas an IoT Hub is aware of the individual devices and can grant and revoke access at the device level.
Box 2: Azure HDInsight with Azure Machine Learning service
References:
https://docs.microsoft.com/en-us/azure/iot-hub/iot-hub-compare-event-hubs
https://docs.microsoft.com/en-us/azure/hdinsight/hdinsight-machine-learning-overview

Question 22

Your company uses an internal blog to share news with employees.
You use the Translator Text API to translate the text in the blog from English to several other languages used by the employees.
Several employees report that the translations are often inaccurate.
You need to improve the accuracy of the translations.
What should you add to the translation solution?

A. Text Analytics

B. Language Understanding (LUIS)

C. Azure Media Services

D. Custom Translator

 


Suggested Answer: D

Custom Translator is a feature of the Microsoft Translator service. With Custom Translator, enterprises, app developers, and language service providers can build neural translation systems that understand the terminology used in their own business and industry. The customized translation system will then seamlessly integrate into existing applications, workflows and websites.
Custom Translator allows users to customize Microsoft Translator’s advanced neural machine translation for Translator’s supported neural translation languages.
Custom Translator can be used for customizing text when using the Microsoft Translator Text API , and speech translation using the Microsoft Speech services.
References:
https://www.microsoft.com/en-us/translator/business/customization/

Question 23

DRAG DROP -
You are designing an Azure Batch AI solution that will be used to train many different Azure Machine Learning models. The solution will perform the following:
✑ Image recognition
✑ Deep learning that uses convolutional neural networks.
You need to select a compute infrastructure for each model. The solution must minimize the processing time.
What should you use for each model? To answer, drag the appropriate compute infrastructures to the correct models. Each compute infrastructure may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place:
 Image

 


Suggested Answer:
Correct Answer Image

References:
https://docs.microsoft.com/en-us/azure/virtual-machines/windows/sizes-gpu

Question 24

You have an app that records meetings by using speech-to-text capabilities from the Speech Services API.
You discover that when action items are listed at the end of each meeting, the app transcribes the text inaccurately when industry terms are used.
You need to improve the accuracy of the meeting records.
What should you do?

A. Add a phrase list

B. Create a custom wake word

C. Parse the text by using the Language Understanding (LUIS) API

D. Train a custom model by using Custom Translator

 


Suggested Answer: A

Phrase Lists are used to identify known phrases in audio data, like a person’s name or a specific location. By providing a list of phrases, you improve the accuracy of speech recognition.
Reference:
https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/get-started-speech-to-text?tabs=script%2Cbrowser%2Cwindowsinstall&pivots=programming-language-csharp

Question 25

You are designing a Computer Vision AI application.
You need to recommend a deployment solution for the application. The solution must ensure that costs scale linearly without any upfront costs.
What should you recommend?

A. a containerized Computer Vision API on Azure Kubernetes Service (AKS) that has autoscaling configured

B. the Computer Vision API as a single resource

C. an Azure Container Service

D. a containerized Computer Vision API on Azure Kubernetes Service (AKS) that has virtual nodes configured

 


Suggested Answer: A

Containers enable you to run the Computer Vision APIs in your own environment.
Note: The host is a x64-based computer that runs the Docker container. It can be a computer on your premises or a Docker hosting service in Azure, such as:
✑ Azure Container Instances.
✑ Azure Kubernetes Service.
✑ A Kubernetes cluster deployed to Azure Stack.
Reference:
https://docs.microsoft.com/en-us/azure/cognitive-services/computer-vision/computer-vision-how-to-install-containers

Question 26

You are developing the workflow for an Azure Machine Learning solution. The solution must retrieve data from the following on-premises sources:
Windows Server 2016 file servers
Microsoft SQL Server databases
Oracle databases
Which of the following actions should you take?

A. Make use of Azure Data Factory to retrieve the data.

B. Make use of Azure Databricks to retrieve the data.

C. Make use of Azure Stream Analytics to retrieve the data.

D. Make use of Azure Synapse Analytics to retrieve the data.

 


Suggested Answer: A

Reference:
https://docs.microsoft.com/en-us/azure/machine-learning/studio/use-data-from-an-on-premises-sql-server

Question 27

HOTSPOT -
You plan to create a bot that will support five languages. The bot will be used by users located in three different countries. The bot will answer common customer questions. The bot will use Language Understanding (LUIS) to identify which skill to use and to detect the language of the customer.
You need to identify the minimum number of Azure resources that must be created for the planned bot.
How many QnA Maker, LUIS and Text Analytics instances should you create? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
 Image

 


Suggested Answer:
Correct Answer Image

QnA Maker: 5 –
If the user plans to support multiple languages, they need to have a new QnA Maker resource for each language.
LUIS: 5 –
If you need a multi-language LUIS client application such as a chatbot, you have a few options. If LUIS supports all the languages, you develop a LUIS app for each language. Each LUIS app has a unique app ID, and endpoint log. If you need to provide language understanding for a language LUIS does not support, you can use Microsoft Translator API to translate the utterance into a supported language, submit the utterance to the LUIS endpoint, and receive the resulting scores.
Language detection: 1 –
The Language Detection feature of the Azure Text Analytics REST API evaluates text input for each document and returns language identifiers with a score that indicates the strength of the analysis.
This capability is useful for content stores that collect arbitrary text, where language is unknown. You can parse the results of this analysis to determine which language is used in the input document. The response also returns a score that reflects the confidence of the model. The score value is between 0 and 1.
The Language Detection feature can detect a wide range of languages, variants, dialects, and some regional or cultural languages. The exact list of languages for this feature isn’t published.
Reference:
https://docs.microsoft.com/en-us/azure/cognitive-services/qnamaker/overview/language-support
https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-language-support
https://docs.microsoft.com/en-us/azure/cognitive-services/text-analytics/how-tos/text-analytics-how-to-language-detection
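The three counts follow directly from the reasoning above: QnA Maker and LUIS both need one resource (or app) per supported language, while a single Text Analytics resource can detect any language. A quick sketch (the language codes are illustrative):

```python
# Minimum Azure resources for the five-language bot, per the reasoning above.
languages = ["en", "fr", "es", "pt", "de"]  # five supported languages

qna_maker = len(languages)   # one QnA Maker resource per language
luis_apps = len(languages)   # one LUIS app per language
text_analytics = 1           # one Language Detection endpoint covers them all

print(qna_maker, luis_apps, text_analytics)  # 5 5 1
```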

Question 28

HOTSPOT -
You plan to use Azure Cognitive Services to provide the development team at your company with the ability to create intelligent apps without having direct AI or data science skills.
The company identifies the following requirements for the planned Cognitive Services deployment:
✑ Provide support for the following languages: English, Portuguese, and German.
✑ Perform text analytics to derive a sentiment score.
Which Cognitive Service service should you deploy for each requirement? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
 Image

 


Suggested Answer:
Correct Answer Image

Box 1: Text Analytics –
The Language Detection feature of the Azure Text Analytics REST API evaluates text input for each document and returns language identifiers with a score that indicates the strength of the analysis.
Box 2: Language API –
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/text-analytics/how-tos/text-analytics-how-to-language-detection
https://docs.microsoft.com/en-us/azure/azure-databricks/databricks-sentiment-analysis-cognitive-services

Question 29

You plan to deploy two AI applications named AI1 and AI2. The data for the applications will be stored in a relational database.
You need to ensure that the users of AI1 and AI2 can see only data in each user's respective geographic region. The solution must be enforced at the database level by using row-level security.
Which database solution should you use to store the application data?

A. Microsoft SQL Server on a Microsoft Azure virtual machine

B. Microsoft Azure Database for MySQL

C. Microsoft Azure Data Lake Store

D. Microsoft Azure Cosmos DB

 


Suggested Answer: A

Row-level security is supported by SQL Server, Azure SQL Database, and Azure SQL Data Warehouse.
References:
https://docs.microsoft.com/en-us/sql/relational-databases/security/row-level-security?view=sql-server-2017
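In SQL Server, row-level security binds a security predicate (an inline table-valued function) to a table with CREATE SECURITY POLICY, so the engine filters rows by the caller's context. The Python sketch below only simulates that idea; the orders table, regions, and user-to-region mapping are invented for illustration:

```python
# Simulation of a row-level security predicate: each user sees only rows
# whose region matches their own. In SQL Server this filtering would be
# enforced by the database engine itself, not application code.
orders = [
    {"id": 1, "region": "EU", "amount": 100},
    {"id": 2, "region": "US", "amount": 250},
    {"id": 3, "region": "EU", "amount": 75},
]

user_region = {"alice": "EU", "bob": "US"}  # hypothetical user-to-region mapping

def visible_rows(user, rows):
    """Return only the rows the user's region entitles them to see."""
    region = user_region[user]
    return [row for row in rows if row["region"] == region]

print([row["id"] for row in visible_rows("alice", orders)])  # [1, 3]
```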

Question 30

DRAG DROP -
You use an Azure key vault to store credentials for several Azure Machine Learning applications.
You need to configure the key vault to meet the following requirements:
✑ Ensure that the IT security team can add new passwords and periodically change the passwords.
✑ Ensure that the applications can securely retrieve the passwords for the applications.
✑ Use the principle of least privilege.
Which permissions should you grant? To answer, drag the appropriate permissions to the correct targets. Each permission may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place:
 Image

 


Suggested Answer:
Correct Answer Image

Incorrect Answers:
Not Keys as they are used for encryption only.
References:
https://docs.microsoft.com/en-us/azure/key-vault/key-vault-secure-your-key-vault
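The least-privilege split can be sketched as follows; the principal names are invented, and the sets model the secrets permissions you would grant in the vault's access policies (set/list/get for the IT security team, get only for the applications):

```python
# Least-privilege sketch for the key vault scenario (principal names are
# invented): the IT security team needs to create and rotate secrets; the
# applications only need to read them. No keys or certificates permissions
# are required for either principal.
access_policies = {
    "it-security-team": {"secrets": {"set", "list", "get"}},
    "ml-applications": {"secrets": {"get"}},
}

def can(principal, permission):
    """Check whether a principal holds a given secrets permission."""
    return permission in access_policies.get(principal, {}).get("secrets", set())

assert can("it-security-team", "set")      # add and rotate passwords
assert can("ml-applications", "get")       # retrieve passwords
assert not can("ml-applications", "set")   # apps cannot change secrets
```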

Question 31

You are developing an AI application for your company. The application will use Microsoft Azure Stream Analytics.
You save the outputs from the Stream Analytics workflows to the cloud.
Which of the following actions should you take?

A. Make use of a Hive table in Azure HDInsight

B. Make use of Azure Cosmos DB

C. Make use of Azure File storage

D. Make use of Azure Table storage

 


Suggested Answer: B

Azure File storage is not among the supported Stream Analytics outputs. Supported outputs include Azure Event Hubs, Azure SQL Database, Azure Blob storage, Azure Table storage, Azure Service Bus, Azure Functions, Power BI, and Azure Cosmos DB, so of the options listed, Azure Cosmos DB (option B) and Azure Table storage (option D) can store the workflow outputs in the cloud; a Hive table and Azure File storage cannot be written to directly.
Reference:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-define-outputs

Question 32

You are developing an app that will analyze sensitive data from global users.
Your app must adhere to the following compliance policies:
The app must not store data in the cloud.
The app must not use services in the cloud to process the data.
Which of the following actions should you take?

A. Make use of Azure Machine Learning Studio

B. Make use of Docker containers for the Text Analytics

C. Make use of a Text Analytics container deployed to Azure Kubernetes Service

D. Make use of Microsoft Machine Learning (MML) for Apache Spark

 


Suggested Answer: D

The Microsoft Machine Learning Library for Apache Spark (MMLSpark) assists in provisioning scalable machine learning models for large datasets, especially for building deep learning problems. MMLSpark works with SparkML pipelines, including Microsoft CNTK and the OpenCV library, which provide end-to-end support for the ingress and processing of image input data, categorization of images, and text analytics using pre-trained deep learning algorithms.
Reference:
https://subscription.packtpub.com/book/big_data_and_business_intelligence/9781789131956/10/ch10lvl1sec61/an-overview-of-the-microsoft-machine-learning-library-for-apache-spark-mmlspark

Question 33

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You create several AI models in Azure Machine Learning Studio.
You deploy the models to a production environment.
You need to monitor the compute performance of the models.
Solution: You enable Model data collection.
Does this meet the goal?

A. Yes

B. No

 


Suggested Answer: A

You need to enable Model data collection.
References:
https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-enable-data-collection

Question 34

You are developing a Microsoft Bot Framework application. The application consumes structured NoSQL data that must be stored in the cloud.
You implement Azure Blob storage for the application. You want access to the blob store to be controlled by using a role.
You implement Shared Key authorization on the storage account.
Does this action accomplish your objective?

A. Yes, it does

B. No, it does not

 


Suggested Answer: B

A client using Shared Key passes a header with every request that is signed using the storage account access key rather than a role.
Use Azure Active Directory (Azure AD) instead.
Reference:
https://docs.microsoft.com/en-us/azure/storage/common/storage-auth
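The reason Shared Key is not role-based is visible in how the header is produced: the client HMAC-SHA256-signs a canonical request string with the account key, so possession of the key grants full access regardless of any role. The sketch below shows only the signing step, with an invented account key and a heavily simplified string-to-sign:

```python
import base64
import hashlib
import hmac

# Simplified Shared Key signing: the client HMAC-SHA256-signs a canonical
# "string to sign" with the storage account key and sends the result in the
# Authorization header. The account name and key below are invented, and the
# string-to-sign is much simpler than the real canonical format.
account_name = "examplestorage"
account_key = base64.b64encode(b"not-a-real-key").decode()

string_to_sign = "GET\n\n\nx-ms-date:Mon, 01 Jan 2024 00:00:00 GMT\n/examplestorage/container"

signature = base64.b64encode(
    hmac.new(base64.b64decode(account_key), string_to_sign.encode("utf-8"), hashlib.sha256).digest()
).decode()

authorization = f"SharedKey {account_name}:{signature}"
print(authorization.startswith("SharedKey examplestorage:"))  # True
```

With Azure AD authorization, by contrast, the request carries an OAuth token and access is evaluated against role assignments such as Storage Blob Data Reader.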

Question 35

A data scientist deploys a deep learning model on an Fsv2 virtual machine.
Data analysis is slow.
You need to recommend which virtual machine series the data scientist must use to ensure that data analysis occurs as quickly as possible.
Which series should you recommend?

A. ND

B. B

C. DC

D. Ev3

 


Suggested Answer: A

The N-series is a family of Azure Virtual Machines with GPU capabilities. GPUs are ideal for compute and graphics-intensive workloads, helping customers to fuel innovation through scenarios like high-end remote visualisation, deep learning and predictive analytics.
The ND-series is focused on training and inference scenarios for deep learning. It uses the NVIDIA Tesla P40 GPUs. The latest version, NDv2, features the NVIDIA Tesla V100 GPUs.
References:
https://azure.microsoft.com/en-in/pricing/details/virtual-machines/series/

Question 36

HOTSPOT -
Your company is building a cinema chatbot by using the Bot Framework and Language Understanding (LUIS).
You are designing the intents and the entities for LUIS.
The following are utterances that customers might provide:
✑ Which movies are playing on December 8?
✑ What time is the performance of Movie1?
✑ I would like to purchase two adult tickets in the balcony section for Movie2.
You need to identify which entity types to use. The solution must minimize development effort.
Which entity type should you use for each entity? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
 Image

 


Suggested Answer:
Correct Answer Image

Box 1: Prebuilt –
Datetime is prebuilt.
Language Understanding (LUIS) provides prebuilt entities. When a prebuilt entity is included in your application, LUIS includes the corresponding entity prediction in the endpoint response.
Box 2: Simple –
Box 3: Composite –
A composite entity is made up of other entities, such as prebuilt entities, simple, regular expression, and list entities. The separate entities form a whole entity.
Reference:
https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-reference-prebuilt-entities
https://docs.microsoft.com/en-us/azure/cognitive-services/luis/reference-entity-composite
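A hypothetical endpoint response illustrates how the prebuilt datetime entity arrives already typed, which is why it minimizes development effort. The values below are invented, and the shape loosely follows the v2 prediction response:

```python
# Hypothetical LUIS endpoint response (values invented; the shape loosely
# follows the v2 prediction response). Prebuilt entities arrive with a
# "builtin." type prefix, while simple and composite entities use the
# names defined in the LUIS app.
response = {
    "query": "Which movies are playing on December 8?",
    "topScoringIntent": {"intent": "FindMovies", "score": 0.97},
    "entities": [
        {"entity": "december 8", "type": "builtin.datetimeV2.date", "startIndex": 28, "endIndex": 37},
    ],
}

# Separate the prebuilt entities from app-defined ones by their type prefix.
prebuilt = [e for e in response["entities"] if e["type"].startswith("builtin.")]
print([e["entity"] for e in prebuilt])  # ['december 8']
```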

Question 37

You design an AI workflow that combines data from multiple data sources for analysis. The data sources are composed of:
✑ JSON files uploaded to an Azure Storage account
✑ On-premises Oracle databases
✑ Azure SQL databases
Which service should you use to ingest the data?

A. Azure Data Factory

B. Azure SQL Data Warehouse

C. Azure Data Lake Storage

D. Azure Databricks

 


Suggested Answer: A

References:
https://docs.microsoft.com/en-us/azure/data-factory/introduction

Question 38

You plan to design a bot that will be hosted by using the Azure Bot Service.
Your company identifies the following compliance requirements for the bot:
✑ Payment Card Industry Data Security Standards (PCI DSS)
✑ General Data Protection Regulation (GDPR)
✑ ISO 27001
You need to identify which compliance requirements are met by hosting the bot in the bot service.
What should you identify?

A. PCI DSS only

B. PCI DSS, ISO 27001, and GDPR

C. ISO 27001 only

D. GDPR only

 


Suggested Answer: B

Azure Bot Service is compliant with ISO 27001:2013, ISO 27019:2014, SOC 1 and 2, Payment Card Industry Data Security Standard (PCI DSS), and Health Insurance Portability and Accountability Act Business Associate Agreement (HIPAA BAA).
Microsoft products and services, including Azure Bot Service, are available today to help you meet the GDPR requirements.
References:
https://docs.microsoft.com/en-us/azure/bot-service/bot-service-compliance
https://blog.botframework.com/2018/04/23/general-data-protection-regulation-gdpr/

Question 39

Your company is developing an AI solution that will identify inappropriate text in multiple languages.
You need to implement a Cognitive Services API that meets this requirement.
You use Language Understanding (LUIS) to identify inappropriate text.
Does this action accomplish your objective?

A. Yes, it does

B. No, it does not

 


Suggested Answer: B

Language Understanding (LUIS) is designed to identify valuable information in conversations, LUIS interprets user goals (intents) and distills valuable information from sentences (entities), for a high quality, nuanced language model. LUIS integrates seamlessly with the Azure Bot Service, making it easy to create a sophisticated bot.
Use Content Moderation instead.
Reference:
https://www.luis.ai/home

https://docs.microsoft.com/bs-latn-ba/azure/cognitive-services/content-moderator/overview

Question 40

Your company has a data team of Scala and R experts.
You plan to ingest data from multiple Apache Kafka streams.
You need to recommend a processing technology to broker messages at scale from Kafka streams to Azure Storage.
What should you recommend?

A. Azure Databricks

B. Azure Functions

C. Azure HDInsight with Apache Storm

D. Azure HDInsight with Microsoft Machine Learning Server

 


Suggested Answer: C

Reference:
https://docs.microsoft.com/en-us/azure/hdinsight/hdinsight-streaming-at-scale-overview

Question 41

HOTSPOT -
You need to build a sentiment analysis solution that will use input data from JSON documents and PDF documents. The JSON documents must be processed in batches and aggregated.
Which storage type should you use for each file type? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
 Image

 


Suggested Answer:
Correct Answer Image

Box 1: Azure Blob Storage –
The following technologies are recommended choices for batch processing solutions in Azure.
Data storage –
✑ Azure Storage Blob Containers. Many existing Azure business processes already use Azure blob storage, making this a good choice for a big data store.
✑ Azure Data Lake Store. Azure Data Lake Store offers virtually unlimited storage for any size of file, and extensive security options, making it a good choice for extremely large-scale big data solutions that require a centralized store for data in heterogeneous formats.
Box 2: Azure Blob Storage –
References:
https://docs.microsoft.com/en-us/azure/architecture/data-guide/big-data/batch-processing
https://docs.microsoft.com/bs-latn-ba/azure/storage/blobs/storage-blobs-introduction

Question 42

You are designing an AI solution in Azure that will perform image classification.
You need to identify which processing platform will provide you with the ability to update the logic over time. The solution must have the lowest latency for inferencing without having to batch.
Which compute target should you identify?

A. graphics processing units (GPUs)

B. field-programmable gate arrays (FPGAs)

C. central processing units (CPUs)

D. application-specific integrated circuits (ASICs)

 


Suggested Answer: B

FPGAs, such as those available on Azure, provide performance close to ASICs. They are also flexible and reconfigurable over time, to implement new logic.
Incorrect Answers:
D: ASICs, such as Google's Tensor Processing Units (TPUs), are custom circuits that provide the highest efficiency, but they can't be reconfigured as your needs change.
References:
https://docs.microsoft.com/en-us/azure/machine-learning/service/concept-accelerate-with-fpgas

Question 43

After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have Azure IoT Edge devices that generate streaming data.
On the devices, you need to detect anomalies in the data by using Azure Machine Learning models. Once an anomaly is detected, the devices must add information about the anomaly to the Azure IoT Hub stream.
Solution: You deploy an Azure Machine Learning model as an IoT Edge module.
Does this meet the goal?

A. Yes

B. No

 


Suggested Answer: A

You can use IoT Edge modules to deploy code that implements your business logic directly to your IoT Edge devices. For example, you can deploy an Azure Machine Learning module that predicts when a device fails based on simulated machine temperature data.
References:
https://docs.microsoft.com/bs-latn-ba/azure/iot-edge/tutorial-deploy-machine-learning
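An IoT Edge module of this kind scores each telemetry message locally and enriches it with anomaly information before forwarding it to the IoT Hub stream. The sketch below shows only that scoring-and-annotation logic; the threshold model is a stand-in for a real Azure Machine Learning model, and the field names and limit are invented:

```python
# Sketch of the per-message logic an IoT Edge module might run: score the
# reading with a (stand-in) model and, on an anomaly, enrich the message
# before forwarding it upstream to IoT Hub. A real module would wrap this
# in the IoT Edge module client's message handler.
ANOMALY_THRESHOLD = 100.0  # invented limit

def score(reading):
    """Stand-in for an ML model: flag readings above the threshold."""
    return reading["temperature"] > ANOMALY_THRESHOLD

def process_message(message):
    if score(message):
        message = dict(message, anomaly=True, reason="temperature above threshold")
    return message

out = process_message({"device": "d1", "temperature": 103.2})
print(out["anomaly"])  # True
```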

Question 44

You have a database that contains sales data.
You plan to process the sales data by using two data streams named Stream1 and Stream2. Stream1 will be used for purchase order data. Stream2 will be used for reference data.
The reference data is stored in CSV files.
You need to recommend an ingestion solution for each data stream.
Which two solutions should you recommend? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.

A. an Azure event hub for Stream1 and Azure Blob storage for Stream2

B. Azure Blob storage for Stream1 and Stream2

C. an Azure event hub for Stream1 and Stream2

D. Azure Blob storage for Stream1 and Azure Cosmos DB for Stream2

E. Azure Cosmos DB for Stream1 and an Azure event hub for Stream2

 


Suggested Answer: AB

Solution A: an Azure event hub for Stream1 and Azure Blob storage for Stream2.
Azure Event Hubs is a highly scalable data streaming platform and event ingestion service, capable of receiving and processing millions of events per second. Event Hubs can process and store events, data, or telemetry produced by distributed software and devices. Data sent to an event hub can be transformed and stored using any real-time analytics provider or batching/storage adapters. Event Hubs provides publish-subscribe capabilities with low latency at massive scale, which makes it appropriate for the purchase order stream, while the CSV reference data fits Blob storage.
Solution B: Azure Blob storage for both Stream1 and Stream2.
Stream Analytics has first-class integration with Azure data streams as inputs from three kinds of resources:
Azure Event Hubs –
Azure IoT Hub –
Azure Blob storage –
These input resources can live in the same Azure subscription as your Stream Analytics job or a different subscription.
References:
https://docs.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/real-time-ingestion

Question 45

Your company has 1,000 AI developers who are responsible for provisioning environments in Azure.
You need to control the type, size, and location of the resources that the developers can provision.
What should you use?

A. Azure Key Vault

B. Azure service principals

C. Azure managed identities

D. Azure Security Center

E. Azure Policy

 


Suggested Answer: E

Azure Policy lets you create, assign, and manage policy definitions that enforce rules over resources at deployment time. Built-in definitions such as allowed locations, allowed resource types, and allowed virtual machine SKUs control exactly the type, size, and location of the resources that the developers can provision. Service principals and managed identities govern which identities can act, not which resource types, sizes, or locations are permitted.
References:
https://docs.microsoft.com/en-us/azure/governance/policy/overview
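Restrictions on type, size, and location are expressed declaratively as policy rules that Azure evaluates whenever a resource is deployed. The sketch below uses an invented rule set and a simplified evaluator to illustrate the allow/deny check that definitions such as "allowed locations" and "allowed virtual machine SKUs" perform:

```python
# Minimal sketch of how "allowed locations" / "allowed SKUs" style rules
# constrain deployments. The rule values and the evaluator are invented
# for illustration; real Azure Policy definitions use a JSON if/then form
# evaluated by the platform, not application code.
policy = {
    "allowed_locations": {"eastus", "westeurope"},
    "allowed_vm_sizes": {"Standard_D2s_v3", "Standard_D4s_v3"},
}

def evaluate(resource):
    """Return 'deny' when the requested resource violates the policy."""
    if resource["location"] not in policy["allowed_locations"]:
        return "deny"
    if resource.get("vm_size") and resource["vm_size"] not in policy["allowed_vm_sizes"]:
        return "deny"
    return "allow"

print(evaluate({"location": "eastus", "vm_size": "Standard_D2s_v3"}))      # allow
print(evaluate({"location": "brazilsouth", "vm_size": "Standard_D2s_v3"}))  # deny
```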

Question 46

You have thousands of images that contain text.
You need to process the text from the images to a machine-readable character stream.
Which Azure Cognitive Services service should you use?

A. Image Moderation API

B. Text Analytics

C. Translator Text

D. Computer Vision

 


Suggested Answer: D

With Computer Vision you can detect text in an image using optical character recognition (OCR) and extract the recognized words into a machine-readable character stream.
Incorrect Answers:
A: Use Content Moderator’s machine-assisted image moderation and human-in-the-loop Review tool to moderate images for adult and racy content. Scan images for text content and extract that text, and detect faces. You can match images against custom lists, and take further action.
Reference:
https://azure.microsoft.com/en-us/services/cognitive-services/computer-vision/
https://docs.microsoft.com/en-us/azure/cognitive-services/content-moderator/image-moderation-api
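The OCR result nests regions, lines, and words; flattening it yields the machine-readable character stream the question asks for. The response below is hypothetical (invented words, simplified fields) but follows that nested shape:

```python
# Hypothetical Computer Vision OCR result (values invented; the nested
# regions -> lines -> words structure follows the documented OCR response).
ocr_result = {
    "language": "en",
    "regions": [
        {"lines": [
            {"words": [{"text": "Hello"}, {"text": "world"}]},
            {"words": [{"text": "from"}, {"text": "OCR"}]},
        ]}
    ],
}

def to_text(result):
    """Flatten the recognized words into a machine-readable character stream."""
    lines = []
    for region in result["regions"]:
        for line in region["lines"]:
            lines.append(" ".join(word["text"] for word in line["words"]))
    return "\n".join(lines)

print(to_text(ocr_result))  # prints two lines: "Hello world" / "from OCR"
```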

Question 47

You design an AI solution that uses an Azure Stream Analytics job to process data from an Azure IoT hub. The IoT hub receives time series data from thousands of IoT devices at a factory.
The job outputs millions of messages per second. Different applications consume the messages as they are available. The messages must be purged.
You need to choose an output type for the job.
What is the best output type to achieve the goal? More than one answer choice may achieve the goal.

A. Azure Event Hubs

B. Azure SQL Database

C. Azure Blob storage

D. Azure Cosmos DB

 


Suggested Answer: D

Stream Analytics can target Azure Cosmos DB for JSON output, enabling data archiving and low-latency queries on unstructured JSON data.
References:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-documentdb-output
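Cosmos DB can satisfy the purge requirement with time-to-live (TTL): items expire a configured number of seconds after their last write and are removed automatically. The sketch below simulates that expiry rule with invented timestamps and TTL values:

```python
# Sketch of Cosmos DB-style time-to-live purging: each item carries a ttl
# (seconds) and is considered expired -- and eligible for automatic removal --
# once that many seconds have passed since its last write timestamp (_ts).
# The timestamps and ttl values here are invented.
def live_items(items, now):
    return [item for item in items if now - item["_ts"] < item["ttl"]]

now = 1_700_000_000
items = [
    {"id": "a", "_ts": now - 30, "ttl": 60},   # still live
    {"id": "b", "_ts": now - 120, "ttl": 60},  # expired, purged
]
print([item["id"] for item in live_items(items, now)])  # ['a']
```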

Question 48

You need to design the Butler chatbot solution to meet the technical requirements.
What is the best channel and pricing tier to use? More than one answer choice may achieve the goal. Select the BEST answer.

A. Standard channels that use the S1 pricing tier

B. Standard channels that use the Free pricing tier

C. Premium channels that use the Free pricing tier

D. Premium channels that use the S1 pricing tier

 


Suggested Answer: D

References:
https://azure.microsoft.com/en-in/pricing/details/bot-service/

Question 49

You have Azure IoT Edge devices that generate measurement data from temperature sensors. The data changes very slowly.
You need to analyze the data in a temporal two-minute window. If the temperature rises five degrees above a limit, an alert must be raised. The solution must minimize the development of custom code.
What should you use?

A. A Machine Learning model as a web service

B. an Azure Machine Learning model as an IoT Edge module

C. Azure Stream Analytics as an IoT Edge module

D. Azure Functions as an IoT Edge module

 


Suggested Answer: C

References:
https://docs.microsoft.com/en-us/azure/iot-edge/tutorial-deploy-stream-analytics
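Stream Analytics would express this check declaratively with a GROUP BY TumblingWindow(minute, 2) query rather than custom code, which is what minimizes development effort. The Python sketch below reproduces the equivalent logic, with an invented temperature limit and readings:

```python
from collections import defaultdict

# Equivalent of a two-minute tumbling-window check: bucket readings into
# non-overlapping 120-second windows and alert when the maximum temperature
# in a window exceeds the limit by five degrees. The limit and the sample
# (timestamp-seconds, temperature) readings are invented.
LIMIT = 70.0
WINDOW_SECONDS = 120

def alerts(readings):
    windows = defaultdict(list)
    for ts, temp in readings:
        windows[ts // WINDOW_SECONDS].append(temp)
    return [w for w, temps in sorted(windows.items()) if max(temps) > LIMIT + 5]

readings = [(0, 69.0), (60, 71.0), (130, 76.5), (200, 74.0), (250, 70.0)]
print(alerts(readings))  # [1] -- the window covering seconds 120-239 breached LIMIT + 5
```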

Question 50

DRAG DROP -
You need to create a bot to meet the following requirements:
✑ The bot must support multiple bot channels including Direct Line.
✑ Users must be able to sign in to the bot by using a Gmail user account and save activities and preferences.
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
NOTE: More than one order of answer choices is correct. You will receive credit for any of the correct orders you select.
Select and Place:
 Image

 


Suggested Answer:
Correct Answer Image

Step 1: From the Azure portal, configure an identity provider.
The Azure Bot Service and the v4 SDK include new bot authentication capabilities, providing features to make it easier to develop a bot that authenticates users to various identity providers, such as Azure AD (Azure Active Directory), GitHub, Uber, and so on.
Step 2: From the Azure portal, create an Azure Active Directory (Azure AD) B2C service.
Azure Active Directory B2C provides business-to-customer identity as a service. Your customers use their preferred social, enterprise, or local account identities to get single sign-on access to your applications and APIs.
Step 3: From the Azure portal, create a client application
You can enable communication between your bot and your own client application by using the Direct Line API.
Step 4: From the bot code, add the connection settings and OAuthPrompt
Use an OAuth prompt to sign the user in and get a token.
Azure AD B2C uses standards-based authentication protocols including OpenID Connect, OAuth 2.0, and SAML.
References:
https://docs.microsoft.com/en-us/azure/bot-service/bot-builder-authentication?view=azure-bot-service-4.0

Free Access Full AI-100 Practice Test Free Questions

If you’re looking for more AI-100 practice test free questions, click here to access the full AI-100 practice test.

We regularly update this page with new practice questions, so be sure to check back frequently.

Good luck with your AI-100 certification journey!
