
Questions tagged [sequence]

The tag has no usage guidance.

0 votes · 0 answers · 6 views

What's the right machine learning approach to mark rubrics based on sequences of data?

I'm a teacher and I'm working on a pet project to help streamline some of my assessment workflows for my students. One of those workflows is gathering data on student progress in the form of a rubric ...
asked by Kevin
0 votes · 0 answers · 20 views

Calculating prediction confidence from a sequence of token-level confidences

I am working with OCSR (optical chemical structure recognition) models, and they output a sequence of token-level confidences. I am looking for a method of summarising these token-level confidences ...
asked by finlay morrison
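
A common starting point for this kind of aggregation (my own sketch, not from the question) is the geometric mean of the token confidences, which equals the exponentiated average log-likelihood per token and is less biased against long sequences than a raw product. The function name is hypothetical; confidences are assumed to be probabilities in (0, 1]:

```python
import math

def sequence_confidence(token_confs):
    """Aggregate token-level confidences into one sequence-level score.

    Geometric mean = exp(mean of log-confidences); unlike the raw
    product, it does not shrink toward 0 just because the sequence
    is long. min() is a common pessimistic alternative where the
    weakest token dominates.
    """
    assert token_confs and all(0 < c <= 1 for c in token_confs)
    return math.exp(sum(math.log(c) for c in token_confs) / len(token_confs))

confs = [0.99, 0.95, 0.60, 0.98]
print(sequence_confidence(confs))  # geometric mean of the four confidences
print(min(confs))                  # pessimistic alternative
```

Which aggregate is appropriate depends on whether one weak token should sink the whole prediction (use `min`) or be averaged out (use the geometric mean).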
0 votes · 0 answers · 31 views

Predict best chess move using RNNs

I am trying to do an AI project in which, at any given moment of a chess game, I can predict the best possible move using an RNN trained on a Kaggle dataset. I am having ...
asked by user3253067
0 votes · 0 answers · 9 views

Are the filtering problem and decoding problem the same thing?

Is there any distinction between the filtering problem and the decoding problem? Wikipedia's definition of the filtering problem is: the problem of estimating the states, or ideally the posterior distribution ...
asked by Tommaso Bendinelli
0 votes · 0 answers · 44 views

Improving Wake-Word Detection Model Performance: Seeking Advice and Suggestions

I was assigned a task to train a wake-word detection model. Basically, it's a binary sequence classification model on audio samples where it should output 1 if it recognizes the wake word being said (e.g. ...
asked by Ícaro Lorran
0 votes · 0 answers · 18 views

Conditional density estimation for sequences using conditional random fields

I am looking to estimate the conditional distribution of the next observation $x_{t+1} \in \mathbb{R}_+$ of a discrete-time process, given the current observation and $l$ previous observations. I am ...
asked by Jonas
0 votes · 0 answers · 30 views

Why use sliding window input features in deep learning?

I was reading through the DNABERT paper and found that their input features were k-mers. This is equivalent to using rolling/sliding-window features in the other common family of sequential problems, ...
asked by Avatrin
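
For intuition, the k-mer tokenisation used by DNABERT is exactly a sliding window of width k and stride 1 over the sequence, so each position contributes local context to several overlapping tokens. A minimal sketch (the function name is my own):

```python
def kmers(seq, k=3):
    """Slide a window of width k (stride 1) over seq; each window is one token."""
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

print(kmers("ATGGCT", 3))  # → ['ATG', 'TGG', 'GGC', 'GCT']
```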
1 vote · 1 answer · 109 views

Which should we choose: sequence model vs n-gram model, and why does it depend on the ratio of samples to words per sample?

This ML tutorial from Google analyzes the IMDB reviews dataset to predict a positive or negative tag. When choosing a model, it says to calculate the number of samples / number of words per sample ratio. If ...
asked by Nate Anderson
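
The ratio itself is cheap to compute. The sketch below is my own; I am assuming (as I recall the tutorial doing) that "words per sample" means the median word count, and the 1500 threshold below which the tutorial recommends an n-gram model is quoted from memory, so treat both as assumptions:

```python
def samples_per_words_ratio(texts):
    """number of samples / median number of words per sample."""
    words_per_sample = sorted(len(t.split()) for t in texts)
    median = words_per_sample[len(texts) // 2]  # simple median, exact for odd n
    return len(texts) / median

# toy corpus: 3 samples with 4, 5, and 2 words -> median 4 words per sample
texts = ["a b c d", "e f g h i", "j k"]
ratio = samples_per_words_ratio(texts)

# heuristic: small ratio -> n-gram model, large ratio -> sequence model
model_family = "n-gram" if ratio < 1500 else "sequence"
```

Intuitively, many short samples (large ratio) give a sequence model enough data to learn ordering; few long samples favor bag-of-n-gram features.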
0 votes · 1 answer · 59 views

Sequence prediction in Parent-Child dataset

We have a large collection of documents (D), each accompanied by a set of metadata (M). Within this collection, some documents act as parent documents and have multiple child documents. Both parent ...
asked by 6nagi9
0 votes · 2 answers · 272 views

How is PCA applied to (one-hot encoded) DNA sequence data?

I realize some questions have been asked already about one-hot encoding for PCA. The answer seems to be along the lines of 'The PCA will run, but does not necessarily make sense.' However, I have a ...
asked by Chris_abc
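
Mechanically, the usual recipe is: one-hot encode each position, flatten each sequence into a row of a matrix, centre the columns, and run PCA. A minimal NumPy sketch under those assumptions (toy sequences, equal lengths assumed; whether the components are biologically meaningful is exactly what the question is about):

```python
import numpy as np

BASES = "ACGT"

def one_hot(seq):
    """Flatten a DNA string into a length 4*len(seq) one-hot vector."""
    vec = np.zeros((len(seq), 4))
    for i, base in enumerate(seq):
        vec[i, BASES.index(base)] = 1.0
    return vec.ravel()

# rows = sequences (equal length), columns = (position, base) indicators
X = np.stack([one_hot(s) for s in ["ACGT", "ACGA", "TCGT", "ACGT"]])

# PCA via SVD of the centred matrix
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * S                    # projection of each sequence onto the PCs
explained = S**2 / (S**2).sum()   # fraction of variance per component
```

Note that each one-hot group of four columns is linearly dependent (they sum to 1), so some components capture pure encoding redundancy rather than sequence variation.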
0 votes · 1 answer · 50 views

Classification of sequential data

I'm currently trying to classify discrete sequential data into five classes with machine learning. The setup is the following: the actual object is filled with various properties, but to separate the ...
asked by bitfish31
1 vote · 1 answer · 247 views

Predictions based on irregular repeated measures?

I need to build a model that predicts certain medical outcomes based on the answers to health-related questionnaires. Providers have patients fill out these questionnaires more than once, at irregular ...
asked by Glenn Wright
0 votes · 1 answer · 373 views

Can MLPs model sequential data?

When modeling sequential data, RNNs are introduced as an improvement over MLPs because they can model the time dependency between inputs. It is said that feeding the last N data points in the sequence to ...
asked by Lukas Petersson
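
Feeding the last N points to an MLP means recasting the series as (window, next-value) supervised pairs, a fixed-size input the MLP can consume. A minimal sketch of that windowing, framework-agnostic and with names of my own choosing:

```python
import numpy as np

def make_windows(series, n):
    """Turn a 1-D series into (last-n-points, next-point) training pairs."""
    X = np.array([series[i:i + n] for i in range(len(series) - n)])
    y = np.array(series[n:])
    return X, y

series = [1, 2, 3, 4, 5, 6]
X, y = make_windows(series, n=3)
# X[0] = [1, 2, 3] is paired with target y[0] = 4, and so on
```

The limitation the question hints at: unlike an RNN, the MLP sees nothing older than N steps and learns no weight sharing across time positions.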
1 vote · 1 answer · 2k views

Loss on whole sequences in Causal Language Model

From an implementation point of view, I'd like to know whether, when training a causal Transformer such as GPT-2, making predictions on the whole sequence at once and computing the loss on the whole sequence is ...
asked by Valentin Macé
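
The standard setup does score the whole sequence in one pass: the logits at position t are compared against the token at t+1, with the causal mask preventing each position from attending to the future. A NumPy sketch of the shift-and-average step only (the model producing the logits is assumed, and the function name is my own):

```python
import numpy as np

def causal_lm_loss(logits, tokens):
    """Mean cross-entropy over a whole sequence in one pass.

    logits: (T, V) array, one predictive distribution per position.
    tokens: (T,) array of input token ids.
    Position t's logits predict token t+1, so labels are shifted
    left by one and the loss averages over the T-1 predictions.
    """
    # numerically stable log-softmax
    z = logits - logits.max(axis=-1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=-1, keepdims=True))
    targets = tokens[1:]                                   # shift labels
    picked = log_probs[np.arange(len(targets)), targets]   # log p(next token)
    return -picked.mean()

T, V = 5, 10
rng = np.random.default_rng(0)
loss = causal_lm_loss(rng.normal(size=(T, V)), rng.integers(0, V, size=T))
```

Because of the mask, this single pass is equivalent to running T-1 separate next-token predictions, just vastly cheaper.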
1 vote · 0 answers · 29 views

Techniques for early binary classification from sequences revealed over time in a low-data environment?

We have data with objects, each of which has a series of events. A series is 1-50 events, revealed over time (a few months). These objects have events come in at different times during a season, so ...
asked by dfrankow
