Bulk Update

Similarly, there is also an update_all method that can update multiple records in one query, assuming the underlying database supports this.

awesome_users = User.where(payment_status: "on_time")

# Instead of updating each record individually
awesome_users.each do |user|
  user.update(awesome: true)
end

# Just do one bulk update
User.where(payment_status: "on_time").update_all(awesome: true)
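
Note that update_all issues a single SQL UPDATE and bypasses Active Record validations, callbacks, and the updated_at timestamp. It also accepts several columns at once; here is a small sketch (the rewarded_at column is made up for illustration):

# One UPDATE statement; validations and callbacks are skipped
# (rewarded_at is a hypothetical column)
User.where(payment_status: "on_time").update_all(awesome: true, rewarded_at: Time.current)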


Scheduling a CSV Import to the DB

First, let's start by writing a scheduled task, and then add a rake task to import the CSV.

1. Add the whenever gem to your Gemfile:

gem 'whenever', require: false
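
After bundling, run the gem's wheneverize command once from the project root to generate the schedule file used in step 5 (a minimal sketch, assuming a standard setup):

bundle install
wheneverize .   # creates config/schedule.rb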

2. Have a table, say products, with some fields such as name, size, capacity, and so on (see the generator sketch below).
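
If the table does not exist yet, it can be created with something like the following (string columns are an assumption; pick types that match your data):

rails generate model Product name:string size:string capacity:string
rake db:migrate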

3. Have a CSV file with the headers matching the table fields:

"name","size","capacity"
"abcd1","100","22"
"abcd2","200","22"
"abcd3","300","32"

4. So we have a table in our app and a CSV file. Next is to write a scheduler that imports the CSV file every hour.

5. With whenever set up (see the wheneverize command under step 1), we have a schedule file at:

config/schedule.rb

I have set my output to a cron log as follows:

set :output, "/home/workspace/sample2/cron_log.log"

Then here is the scheduler:

every :hour do
  rake "data:import"
end
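
To preview the crontab entries whenever will generate from schedule.rb without installing them, run the whenever command with no arguments from the project root:

whenever   # prints the generated cron syntax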

6. Now let's create the rake task.

Create a file at lib/tasks/data.rake. This is how my data.rake file looks:

require 'csv'

namespace :data do
  desc "Import products from a CSV file"
  task :import, [:filename] => :environment do |_task, args|
    # Fall back to sample2.csv when no filename argument is given
    filename = args[:filename] || 'sample2.csv'

    CSV.foreach(filename, headers: true) do |row|
      Product.create!(row.to_hash)
    end
  end
end
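
Since the task accepts an optional filename argument, it can also be run by hand against a specific file; quoting keeps the shell from interpreting the brackets:

bundle exec rake "data:import[sample2.csv]"
bundle exec rake data:import   # falls back to sample2.csv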

Done!!!!

Want to know how it works? Run the lines below to update the cron jobs after changing the schedule.rb file, and to trigger the import manually:

# whenever --update-crontab --set 'environment=development'
# RAILS_ENV=development bundle exec rake data:import --silent

Links for detailed references: Automated Tasks with Cron and Rake, the Whenever Gem, and Railscasts.com also recently published some very interesting casts about this; they cover many cases.