How to create a tensor without knowing shape upfront in Keras/TF?
I am currently stuck on a problem. I need to create a temporary variable (a tensor) to hold values while my custom layer processes its input. The problem is that the layer receives its input in batches, and the batch size is variable. In addition, one other dimension of the input is variable and is only known when the model is run.
So, how can I create a tensor of variable dimensions for each new input? Surely there is a way to do this, because built-in layers also take batches of variable size, and the shape of their output tensor varies as they output the whole processed batch. But I don't know how to do it.
Here's the piece of my code:
similarity_matrix = K.zeros(
    shape=(int_shape(context_vectors)[0],
           int_shape(context_vectors)[1],
           int_shape(query_vectors)[1]),
    dtype='float32')
for i, context_vector in enumerate(modified_context_vectors):
    for j, query_vector in enumerate(modified_query_vectors):
        concatenated_vector = concatenate_and_multiply(context_vector, query_vector)
        result = dot(concatenated_vector, self.kernel)
        similarity_matrix[:, i, j] = result
return similarity_matrix
Here, context_vectors and query_vectors are of shape (None, None, 600). The first dimension is the batch size, which is the same for both. The second dimension is the number of words in the context and in the query respectively, and hence differs between the two.
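For reference, the same all-pairs (context, query) similarity can be computed without preallocating a tensor of unknown shape, by broadcasting the two inputs against each other. Below is a NumPy sketch of that pattern; concatenate_and_multiply and self.kernel from the snippet are approximated here by the [c; q; c*q] feature and a hypothetical weight vector, so the names and sizes are assumptions, not the original layer's API. The equivalent expand/tile/concat ops exist in the Keras backend and work on tensors whose leading dimensions are None.

```python
import numpy as np

# Stand-in data; in the real layer, batch, T and J are only known at runtime.
batch, T, J, d = 2, 5, 7, 600
context_vectors = np.random.randn(batch, T, d).astype('float32')  # (batch, T, d)
query_vectors = np.random.randn(batch, J, d).astype('float32')    # (batch, J, d)
kernel = np.random.randn(3 * d).astype('float32')  # hypothetical trainable weight

# Insert singleton axes so every (context word, query word) pair lines up.
c = context_vectors[:, :, None, :]                 # (batch, T, 1, d)
q = query_vectors[:, None, :, :]                   # (batch, 1, J, d)
c_b = np.broadcast_to(c, (batch, T, J, d))
q_b = np.broadcast_to(q, (batch, T, J, d))

# Build [c; q; c*q] features and apply the kernel -- no preallocation,
# no Python loops, and no item assignment into a tensor.
features = np.concatenate([c_b, q_b, c_b * q_b], axis=-1)  # (batch, T, J, 3d)
similarity_matrix = features @ kernel                      # (batch, T, J)
print(similarity_matrix.shape)  # (2, 5, 7)
```

Because every shape here is read off the inputs rather than fixed up front, the same construction works when the batch and word dimensions vary from call to call.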
numpy tensorflow keras neural-network tensor
You can only have one variable size, the batch size. The rest has to be fixed. The other dimensions indicate the number of connections to other nodes; if they are unknown, how can you train them properly? Answer: you can't.
– Matthieu Brucher
Nov 22 at 14:54
You are wrong, brother. We can have a variable input size to a model, and I have already built one such model. Please reconsider the question and don't downvote it unnecessarily. Rather, follow it up if you don't have an answer.
– Kadam Parikh
Nov 22 at 14:59
BTW, the question does not depend on my code snippet. I want to know how to create a tensor of variable shape (take the example of shape (batch_size, 600)). Can you answer this? Don't downvote. Because of you and other such people, we don't get answers here.
– Kadam Parikh
Nov 22 at 15:01
I'm not your brother, and you cannot have a variable size for the model; the kernel sizes are fixed. If you have double the batch_size, that's different: you are still feeding the same stuff into a fixed model.
– Matthieu Brucher
Nov 22 at 16:08
Use something like batch_size = Y.get_shape()[0], where Y is a tensor whose batch_size is None. Then the dynamic shape seems to be properly set up in the tensors that use this for their sizes (at least this worked for my GANs).
– Matthieu Brucher
Nov 22 at 16:15
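To illustrate the comment above: get_shape() returns the static shape, which can contain None, while tf.shape() returns the runtime shape as a tensor, which is what you need to build buffers whose size is only known when the model runs. A minimal sketch, assuming TF 2.x; the function name make_buffer and the 600-wide signature are illustrative only, not from the question:

```python
import tensorflow as tf

# Trace with an unknown batch dimension, as a custom layer would see it.
@tf.function(input_signature=[tf.TensorSpec(shape=(None, 600), dtype=tf.float32)])
def make_buffer(y):
    # y.shape[0] is None here (static shape), but tf.shape(y)[0]
    # is a runtime tensor holding the actual batch size.
    batch_size = tf.shape(y)[0]
    return tf.zeros((batch_size, 600), dtype=tf.float32)

out = make_buffer(tf.ones((3, 600)))
print(out.shape)  # (3, 600)
```

The same call works unchanged for any batch size, because the zeros tensor is sized from the runtime shape rather than from a fixed constant.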
asked Nov 22 at 14:51 by Kadam Parikh