I have a use-case where I have an existing hash:
response = { aa: 'aaa', bb: 'bbb' }
I need to add id as one of the keys.
When I use response.merge(id: 'some_id') and then convert it to JSON, I get id as the last key, which I don't want.
I want to insert id: 'some_id' at the beginning of response.
I have tried the following, but it doesn't feel good to have to iterate over the whole hash:
new_response = { id: 'some_id' }
response.keys.each { |key| new_response[key] = response[key] }
Basically, I need something like Ruby Array's unshift, but for a Hash.
irb(main):042:0> arr = [1, 2, 3]
=> [1, 2, 3]
irb(main):043:0> arr.unshift(5)
=> [5, 1, 2, 3]
4 Answers
response = {aa: 'aaa', bb: 'bbb'}
new_response = {new: 'new_value'}.merge(response)
# => {:new=>"new_value", :aa=>"aaa", :bb=>"bbb"}
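To confirm this solves the original JSON-ordering concern, here is a quick check (the require and the to_json call are my additions, not part of the original answer):
require 'json'
response = { aa: 'aaa', bb: 'bbb' }
# Merging the existing hash into a new hash that already contains :id
# puts :id first, and to_json preserves that insertion order.
{ id: 'some_id' }.merge(response).to_json
# => "{\"id\":\"some_id\",\"aa\":\"aaa\",\"bb\":\"bbb\"}"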
4 Comments
response.unshift(cc: 'ccc') (if there were such a method)? If the expected return value should be { cc: 'ccc', aa: 'aaa', bb: 'bbb' }, then what would be the answer?
Try converting it to an array and back:
Hash[hash.to_a.unshift([k, v])]
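With the hash from the question, that looks like this (k and v stand for the new key and value; the concrete values below are mine):
response = { aa: 'aaa', bb: 'bbb' }
# Turn the hash into an array of [key, value] pairs, prepend the new
# pair with Array#unshift, then rebuild the hash.
Hash[response.to_a.unshift([:id, 'some_id'])]
# => {:id=>"some_id", :aa=>"aaa", :bb=>"bbb"}
On Ruby 2.1+ the same thing can be written as response.to_a.unshift([:id, 'some_id']).to_h.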
I think you can do this:
response.inject({ new: 'new_value' }) { |h, (k, v)| h[k] = v; h } # => {:new=>"new_value", :aa=>"aaa", :bb=>"bbb"}
Or do it like @sawa's answer above.
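For what it's worth, the same fold reads slightly more idiomatically with each_with_object; this variant is my rewrite, not part of the original answer:
response = { aa: 'aaa', bb: 'bbb' }
# The memo hash starts with the new key, so it ends up first.
response.each_with_object({ new: 'new_value' }) { |(k, v), h| h[k] = v }
# => {:new=>"new_value", :aa=>"aaa", :bb=>"bbb"}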
This is useful for me because I want "timestamp" to be within the first 128 characters of the JSON so that Splunk can read it, which means the hash keys have to be reordered so that timestamp always comes first when converting with to_json. Since log lines are printed very often, I wanted the fastest approach, so I compared the different combinations.
I used the following code to figure out which solution is fastest:
require 'securerandom'
require 'active_support/core_ext/hash/reverse_merge'
require 'fruity'
SMALL_HASH = { aa: 'aaa', bb: 'bbb' }.freeze
BIG_HASH = 10_000.times.each_with_object({}) do |_, obj|
obj[SecureRandom.alphanumeric(6).to_sym] = SecureRandom.alphanumeric(128)
end; nil
puts "Running on #{RUBY_VERSION}"
# Running on 2.6.9
compare do
_small_merge { { new: 'new_value' }.merge(SMALL_HASH) }
_small_reverse_merge { SMALL_HASH.reverse_merge(new: 'new_value') }
_small_unshift { Hash[SMALL_HASH.to_a.unshift([:new, 'new_value'])] }
end
# Running each test 32768 times. Test will take about 1 second.
# _small_merge is faster than _small_reverse_merge by 19.999999999999996% ± 10.0%
# _small_reverse_merge is faster than _small_unshift by 1.9x ± 0.1
compare(magnify: 1_000) do
_big_merge { { new: 'new_value' }.merge(BIG_HASH) }
_big_reverse_merge { BIG_HASH.reverse_merge(new: 'new_value') }
_big_unshift { Hash[BIG_HASH.to_a.unshift([:new, 'new_value'])] }
end
# Running each test 1000 times.
# _big_reverse_merge is similar to _big_merge
# _big_merge is faster than _big_unshift by 2x ± 0.1
So I would recommend the following strategy:
{ new: 'new_value' }.merge(aa: 'aaa', bb: 'bbb')
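Applied to the Splunk scenario described above, a minimal sketch (the field names and the ISO 8601 timestamp format are my assumptions):
require 'json'
require 'time'
log_line = { level: 'info', message: 'user signed in' }
# Merging into a fresh hash that already holds :timestamp keeps it
# first, so it lands within the first 128 characters of the JSON.
{ timestamp: Time.now.utc.iso8601 }.merge(log_line).to_json
# e.g. {"timestamp":"2024-01-01T00:00:00Z","level":"info","message":"user signed in"}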
Note that Ruby hashes are "ordered" only in the sense of insertion order:
{'b' => 1, 'a' => 2} # => {"b"=>1, "a"=>2}
If it were "ordered" in any other sense, the key/value pairs would change positions.