Category: model

I have a dataset with different data types and used LabelEncoder to encode the non-numeric values. Then I split the data into an 80-20 train-test split of the input/output data and ran it through multiple models: Cartesian, LogisticRegression, LinearDiscriminantAnalysis, KNeighborsClassifier, DecisionTreeClassifier. Then I found the best accuracy score out of all the ..
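A minimal sketch of the workflow described above, assuming scikit-learn; the toy dataset and its column names here are hypothetical stand-ins for the asker's data, and only the models named in the question that map directly to scikit-learn classes are included:

```python
import pandas as pd
from sklearn.preprocessing import LabelEncoder
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Toy mixed-type dataset (hypothetical columns)
df = pd.DataFrame({
    "color": ["red", "blue", "red", "green", "blue", "green", "red", "blue"] * 5,
    "size": [1.0, 2.0, 1.5, 3.0, 2.5, 1.0, 2.0, 3.5] * 5,
    "label": [0, 1, 0, 1, 1, 0, 0, 1] * 5,
})

# Encode the non-numeric column
df["color"] = LabelEncoder().fit_transform(df["color"])

X = df[["color", "size"]]
y = df["label"]

# 80-20 train/test split
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Fit each model and score it on the held-out 20%
models = {
    "LogisticRegression": LogisticRegression(),
    "LinearDiscriminantAnalysis": LinearDiscriminantAnalysis(),
    "KNeighborsClassifier": KNeighborsClassifier(),
    "DecisionTreeClassifier": DecisionTreeClassifier(),
}
scores = {name: accuracy_score(y_test, m.fit(X_train, y_train).predict(X_test))
          for name, m in models.items()}

# Pick the model with the best accuracy
best = max(scores, key=scores.get)
print(best, scores[best])
```

One caveat with this setup: LabelEncoder imposes an arbitrary ordering on categories, so for nominal features a one-hot encoding is often the safer choice with distance- or linear-model-based classifiers.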


import numpy as np

def frame(x, frame_len, hop_len):
    assert x.shape == (len(x), 3)
    assert x.shape[0] >= frame_len
    assert hop_len >= 1
    n_frames = 1 + (x.shape[0] - frame_len) // hop_len
    shape = (n_frames, frame_len, x.shape[1])
    strides = (hop_len * x.strides[0],) + x.strides
    return np.lib.stride_tricks.as_strided(x, shape=shape, strides=strides)

x_frames = []
y_frames = []
for i in range(x_recordings.shape[0]):
    # frames ..
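As a quick sanity check of the framing logic above (independent of `x_recordings`, which is not shown in the snippet), the stride-tricks window can be verified on a small 3-channel array:

```python
import numpy as np

def frame(x, frame_len, hop_len):
    # Slide a window of frame_len samples over x with step hop_len,
    # returning a (n_frames, frame_len, n_channels) view without copying.
    n_frames = 1 + (x.shape[0] - frame_len) // hop_len
    shape = (n_frames, frame_len, x.shape[1])
    strides = (hop_len * x.strides[0],) + x.strides
    return np.lib.stride_tricks.as_strided(x, shape=shape, strides=strides)

x = np.arange(30, dtype=np.float64).reshape(10, 3)  # 10 samples, 3 channels
frames = frame(x, frame_len=4, hop_len=2)
print(frames.shape)  # (4, 4, 3): frames start at samples 0, 2, 4, 6
```

Because `as_strided` returns a view that shares memory with `x`, the frames should be treated as read-only (or copied) before any in-place modification.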


I am studying TensorFlow classification from a textbook and compiled the following code:

import tensorflow as tf  # TF 1.x API
from sklearn.metrics import classification_report, confusion_matrix

def train_model(X_train, y_train, X_test, y_test, learning_rate, max_epochs, batch_size):
    in_X_tensors_batch = tf.placeholder(tf.float32, shape=(None, RESIZED_IMAGE[0], RESIZED_IMAGE[0], 1))
    in_y_tensors_batch = tf.placeholder(tf.float32, shape=(None, N_CLASSES))
    is_training = tf.placeholder(tf.bool)
    logits = model(in_X_tensors_batch, is_training)
    out_y_pred = tf.nn.softmax(logits)
    loss_score = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=in_y_tensors_batch)
    loss = tf.reduce_mean(loss_score)
    optimizer = tf.train.AdamOptimizer(learning_rate).minimize(loss)
    with tf.Session() as session:
        session.run(tf.global_variables_initializer())
        for epoch in range(max_epochs):
            print("Epoch=", epoch)
            ..
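For reference, the loss computed in the snippet — the mean softmax cross-entropy between the logits and the one-hot labels — can be reproduced in plain NumPy; the function names and sample values below are purely illustrative:

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # subtract row max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def mean_softmax_cross_entropy(logits, one_hot_labels):
    # Mirrors tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(...)):
    # per-row cross-entropy against the softmax probabilities, then the mean.
    probs = softmax(logits)
    return float(np.mean(-np.sum(one_hot_labels * np.log(probs), axis=1)))

logits = np.array([[2.0, 0.5, 0.1], [0.2, 1.5, 0.3]])
labels = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
loss = mean_softmax_cross_entropy(logits, labels)
print(loss)
```

Note that `tf.nn.softmax_cross_entropy_with_logits` expects raw logits, not softmax outputs — applying it to `out_y_pred` instead of `logits` would double-apply the softmax.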
