Natural Language Processing
Transformer (machine learning model)
What is the purpose of the BERT and GPT models?
A) To replace RNN models such as LSTMs
B) To process sequential input data
C) To provide pretrained systems that can be fine-tuned for specific tasks
Transformer (machine learning model)
What is the purpose of the attention mechanism in transformers?
A) To process tokens sequentially
B) To process the entire input all at once
C) To provide relevant information about far-away tokens
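The question above concerns the attention mechanism. As a minimal sketch (pure Python, toy vectors, illustrative only), scaled dot-product attention lets each query token take a weighted sum over all value vectors, so information about far-away tokens is always one step away:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def attention(queries, keys, values):
    """Scaled dot-product attention over toy vectors.

    Each query scores every key at once (no sequential scan),
    then mixes the values by those softmax weights.
    """
    d = len(keys[0])
    out = []
    for q in queries:
        scores = [dot(q, k) / math.sqrt(d) for k in keys]
        weights = softmax(scores)
        out.append([
            sum(w * v[i] for w, v in zip(weights, values))
            for i in range(len(values[0]))
        ])
    return out

# Toy example: 3 tokens with 2-dimensional embeddings,
# using the same vectors as queries, keys, and values.
q = k = v = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
result = attention(q, k, v)
```

Because the weights sum to one, each output row is a convex combination of the value vectors, so every component stays within the range of the inputs.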
Transformer (machine learning model)
What is the main difference between transformers and RNNs?
A) Transformers use attention mechanisms, while RNNs use feedforward neural networks
B) Transformers process tokens sequentially, while RNNs process the entire input all at once
C) Transformers process the entire input all at once, while RNNs process tokens sequentially
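The sequential-versus-parallel contrast in the question above can be sketched in a few lines of toy Python (the functions and numbers here are illustrative, not a real model): an RNN must thread a hidden state through the tokens one at a time, while a transformer-style layer computes every position from the whole input at once.

```python
def rnn_scan(tokens, step, h0=0.0):
    # Sequential: each hidden state depends on the previous one,
    # so token t cannot be processed before tokens 0..t-1.
    h = h0
    states = []
    for x in tokens:
        h = step(h, x)
        states.append(h)
    return states

def transformer_like(tokens, mix):
    # Parallel: every output position sees the entire input at once
    # and does not depend on any other position's output.
    return [mix(i, tokens) for i in range(len(tokens))]

tokens = [1.0, 2.0, 3.0]
# Toy recurrence: decay the state and add the new token.
rnn_out = rnn_scan(tokens, step=lambda h, x: 0.5 * h + x)
# Toy "attention": each position mixes in the mean of all tokens.
par_out = transformer_like(tokens, mix=lambda i, ts: ts[i] + sum(ts) / len(ts))
```

The loop in `rnn_scan` is inherently serial, whereas every element of `par_out` could be computed simultaneously, which is what makes transformers easy to parallelize on modern hardware.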
Prompt engineering
What is chain-of-thought prompting?
A) A method to generate images from text prompts
B) A way to improve computer security
C) A technique to improve machine learning models
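Chain-of-thought prompting improves a language model's answers by asking it to show intermediate reasoning steps before the final answer. A minimal sketch of how such a prompt might be assembled (the worked example and wording are illustrative, not a prescribed format):

```python
def chain_of_thought_prompt(question: str) -> str:
    """Wrap a question so the model is nudged to reason step by step.

    The one-shot worked example demonstrates the desired format:
    intermediate steps first, final answer last.
    """
    example = (
        "Q: Roger has 5 balls and buys 2 cans of 3 balls each. "
        "How many balls does he have?\n"
        "A: Roger starts with 5 balls. 2 cans of 3 balls is 6 balls. "
        "5 + 6 = 11. The answer is 11.\n"
    )
    return f"{example}Q: {question}\nA: Let's think step by step."

prompt = chain_of_thought_prompt(
    "If a train travels 60 km in 1.5 hours, what is its average speed?"
)
```

The trailing "Let's think step by step." cue and the worked example both push the model toward emitting its reasoning chain rather than jumping straight to an answer.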
Prompt engineering
What is prompt engineering in artificial intelligence?
A) A technique to improve image synthesis
B) A method to train language models
C) A way to improve computer security
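Prompt engineering is the practice of structuring and wording a model's input to get better output, without touching the model's weights. A tiny template sketch (the field names and wording are illustrative assumptions, not a standard):

```python
def build_prompt(role: str, task: str, constraints: list[str]) -> str:
    # A simple structured prompt: role, task, then explicit constraints.
    # Prompt engineering iterates on this structure and wording rather
    # than retraining the model.
    lines = [f"You are {role}.", f"Task: {task}"]
    if constraints:
        lines.append("Constraints:")
        lines.extend(f"- {c}" for c in constraints)
    return "\n".join(lines)

prompt = build_prompt(
    role="a concise technical editor",
    task="Summarize the text in two sentences.",
    constraints=["Plain English", "No jargon"],
)
```

Swapping the role, reordering the sections, or tightening the constraints are all typical prompt-engineering moves: the code stays the same, only the rendered text changes.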