
CSV

This class provides a complete interface to CSV files and data. It offers tools to enable you to read and write to and from Strings or IO objects, as needed.

Reading

From a File

A Line at a Time

CSV.foreach("path/to/file.csv") do |row|
 # use row here...
end

All at Once

arr_of_arrs = CSV.read("path/to/file.csv")

From a String

A Line at a Time

CSV.parse("CSV,data,String") do |row|
 # use row here...
end

All at Once

arr_of_arrs = CSV.parse("CSV,data,String")

Writing

To a File

CSV.open("path/to/file.csv", "wb") do |csv|
 csv << ["row", "of", "CSV", "data"]
 csv << ["another", "row"]
 # ...
end

To a String

csv_string = CSV.generate do |csv|
 csv << ["row", "of", "CSV", "data"]
 csv << ["another", "row"]
 # ...
end

Convert a Single Line

csv_string = ["CSV", "data"].to_csv # to CSV
csv_array = "CSV,String".parse_csv # from CSV

Shortcut Interface

CSV { |csv_out| csv_out << %w{my data here} } # to $stdout
CSV(csv = "") { |csv_str| csv_str << %w{my data here} } # to a String
CSV($stderr) { |csv_err| csv_err << %w{my data here} } # to $stderr

CSV and Character Encodings (M17n or Multilingualization)

This new CSV parser is m17n savvy. The parser works in the Encoding of the IO or String object being read from or written to. Your data is never transcoded (unless you ask Ruby to transcode it for you) and will literally be parsed in the Encoding it is in. Thus CSV will return Arrays or Rows of Strings in the Encoding of your data. This is accomplished by transcoding the parser itself into your Encoding.

Some transcoding must take place, of course, to accomplish this multiencoding support. For example, :col_sep, :row_sep, and :quote_char must be transcoded to match your data. Hopefully this makes the entire process feel transparent, since CSV's defaults should just magically work for your data. However, you can set these values manually in the target Encoding to avoid the translation.

It's also important to note that while all of CSV's core parser is now Encoding agnostic, some features are not. For example, the built-in converters will try to transcode data to UTF-8 before making conversions. Again, you can provide custom converters that are aware of your Encodings to avoid this translation. It's just too hard for me to support native conversions in all of Ruby's Encodings.

Anyway, the practical side of this is simple: make sure IO and String objects passed into CSV have the proper Encoding set and everything should just work. CSV methods that allow you to open IO objects (CSV::foreach(), ::open, ::read, and ::readlines) do allow you to specify the Encoding.
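
For example, a sketch along these lines (the file name and encodings are purely illustrative) reads Latin-1 data, either as-is or transcoded on the way in:

CSV.foreach("legacy.csv", encoding: "ISO-8859-1") do |row|
  # row contains ISO-8859-1 Strings here
end

CSV.foreach("legacy.csv", encoding: "ISO-8859-1:UTF-8") do |row|
  # the data is transcoded as it is read, so row contains UTF-8 Strings here
end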

One minor exception comes when generating CSV into a String with an Encoding that is not ASCII compatible. There's no existing data for CSV to use to prepare itself and thus you will probably need to manually specify the desired Encoding for most of those cases. It will try to guess using the fields in a row of output though, when using ::generate_line or Array#to_csv.

I try to point out any other Encoding issues in the documentation of methods as they come up.

This has been tested to the best of my ability with all non-"dummy" Encodings Ruby ships with. However, it is brave new code and may have some bugs. Please feel free to report any issues you find with it.

Constants

ConverterEncoding

The encoding used by all converters.

Converters

This Hash holds the built-in converters of CSV that can be accessed by name. You can select Converters with #convert or through the options Hash passed to ::new.

:integer

Converts any field Integer() accepts.

:float

Converts any field Float() accepts.

:numeric

A combination of :integer and :float.

:date

Converts any field Date::parse() accepts.

:date_time

Converts any field DateTime::parse() accepts.

:all

All built-in converters. A combination of :date_time and :numeric.

All built-in converters transcode field data to UTF-8 before attempting a conversion. If your data cannot be transcoded to UTF-8 the conversion will fail and the field will remain unchanged.

This Hash is intentionally left unfrozen and users should feel free to add values to it that can be accessed by all CSV objects.

To add a combo field, the value should be an Array of names. Combo fields can be nested with other combo fields.
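
For example, a small sketch of registering a custom converter and a combo entry (the :upcase and :custom names are made up for illustration):

CSV::Converters[:upcase] = lambda { |field| field.upcase rescue field }
CSV::Converters[:custom] = [:numeric, :upcase]  # a combo field

CSV.parse("1,two,3.5", converters: :custom)
# should yield [[1, "TWO", 3.5]] -- numeric fields convert first and skip later converters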

DEFAULT_OPTIONS

The options used when no overrides are given by calling code. They are:

:col_sep

","

:row_sep

:auto

:quote_char

'"'

:field_size_limit

nil

:converters

nil

:unconverted_fields

nil

:headers

false

:return_headers

false

:header_converters

nil

:skip_blanks

false

:force_quotes

false

DateMatcher

A Regexp used to find and convert some common Date formats.

DateTimeMatcher

A Regexp used to find and convert some common DateTime formats.

FieldInfo

A FieldInfo Struct contains details about a field's position in the data source it was read from. CSV will pass this Struct to some blocks that make decisions based on field structure. See CSV.convert_fields() for an example.

index

The zero-based index of the field in its row.

line

The line of the data source this row is from.

header

The header for the column, when available.

HeaderConverters

This Hash holds the built-in header converters of CSV that can be accessed by name. You can select HeaderConverters with #header_convert or through the options Hash passed to ::new.

:downcase

Calls downcase() on the header String.

:symbol

The header String is downcased, spaces are replaced with underscores, non-word characters are dropped, and finally to_sym() is called.

All built-in header converters transcode header data to UTF-8 before attempting a conversion. If your data cannot be transcoded to UTF-8 the conversion will fail and the header will remain unchanged.

This Hash is intentionally left unfrozen and users should feel free to add values to it that can be accessed by all CSV objects.

To add a combo field, the value should be an Array of names. Combo fields can be nested with other combo fields.
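
For example, a sketch of registering a custom header converter (the :prefixed name is made up for illustration):

CSV::HeaderConverters[:prefixed] = lambda { |header| "col_" + header.downcase }

table = CSV.parse("Name,Age\nAlice,30", headers: true, header_converters: :prefixed)
table.headers  # should be ["col_name", "col_age"]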

VERSION

The version of the installed library.

Attributes

col_sep[R]

The encoded :col_sep used in parsing and writing. See ::new for details.

encoding[R]

The Encoding CSV is parsing or writing in. This will be the Encoding you receive parsed data in and/or the Encoding data will be written in.

field_size_limit[R]

The limit for field size, if any. See ::new for details.

lineno[R]

The line number of the last row read from this file. Fields with nested line-end characters will not affect this count.

quote_char[R]

The encoded :quote_char used in parsing and writing. See ::new for details.

row_sep[R]

The encoded :row_sep used in parsing and writing. See ::new for details.

Public Class Methods

dump(ary_of_objs, io = "", options = Hash.new)

This method allows you to serialize an Array of Ruby objects to a String or File of CSV data. This is not as powerful as Marshal or YAML, but perhaps useful for spreadsheet and database interaction.

Out of the box, this method is intended to work with simple data objects or Structs. It will serialize a list of instance variables and/or Struct.members().

If you need more complicated serialization, you can control the process by adding methods to the class to be serialized.

A class method csv_meta() is responsible for returning the first row of the document (as an Array). This row is considered to be a Hash of the form key_1,value_1,key_2,value_2,... ::load expects to find a class key with a value of the stringified class name and ::dump will create this, if you do not define this method. This method is only called on the first object of the Array.

The next method you can provide is an instance method called csv_headers(). This method is expected to return the second line of the document (again as an Array), which is to be used to give each column a header. By default, ::load will set an instance variable if the field header starts with an @ character or call send() passing the header as the method name and the field value as an argument. This method is only called on the first object of the Array.

Finally, you can provide an instance method called csv_dump(), which will be passed the headers. This should return an Array of fields that can be serialized for this object. This method is called once for every object in the Array.

The io parameter can be used to serialize to a File, and options can be anything ::new accepts.
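
As a rough sketch of the default behavior (the Person Struct is purely illustrative), dumping a list of Structs to a String produces a metadata row, a header row, and then one row per object:

Person = Struct.new(:name, :age)
people = [Person.new("Alice", 30), Person.new("Bob", 25)]

csv_string = CSV.dump(people)
# first row:  class metadata (e.g. class,Person)
# second row: headers built from the Struct members (e.g. age=,name=)
# then:       one row of field values per object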

 
 # File csv.rb, line 1039
def self.dump(ary_of_objs, io = "", options = Hash.new)
  obj_template = ary_of_objs.first
  csv = new(io, options)
  # write meta information
  begin
    csv << obj_template.class.csv_meta
  rescue NoMethodError
    csv << [:class, obj_template.class]
  end
  # write headers
  begin
    headers = obj_template.csv_headers
  rescue NoMethodError
    headers = obj_template.instance_variables.sort
    if obj_template.class.ancestors.find { |cls| cls.to_s =~ /\AStruct\b/ }
      headers += obj_template.members.map { |mem| "#{mem}=" }.sort
    end
  end
  csv << headers
  # serialize each object
  ary_of_objs.each do |obj|
    begin
      csv << obj.csv_dump(headers)
    rescue NoMethodError
      csv << headers.map do |var|
        if var[0] == ?@
          obj.instance_variable_get(var)
        else
          obj[var[0..-2]]
        end
      end
    end
  end
  if io.is_a? String
    csv.string
  else
    csv.close
  end
end
 
filter( options = Hash.new ) { |row| ... }
filter( input, options = Hash.new ) { |row| ... }
filter( input, output, options = Hash.new ) { |row| ... }

This method is a convenience for building Unix-like filters for CSV data. Each row is yielded to the provided block which can alter it as needed. After the block returns, the row is appended to output altered or not.

The input and output arguments can be anything ::new accepts (generally String or IO objects). If not given, they default to ARGF and $stdout.

The options parameter is also filtered down to ::new after some clever key parsing. Any key beginning with :in_ or :input_ will have that leading identifier stripped and will only be used in the options Hash for the input object. Keys starting with :out_ or :output_ affect only output. All other keys are assigned to both objects.

The :output_row_sep option defaults to $INPUT_RECORD_SEPARATOR ($/).
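
A sketch of filtering from one String to another (the data and the expected output are only illustrative):

out = ""
CSV.filter("john,33\njane,28\n", out, out_col_sep: "|") do |row|
  row[0] = row[0].upcase
end
out  # should now hold something like "JOHN|33\nJANE|28\n"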

 
 # File csv.rb, line 1155
def self.filter(*args)
  # parse options for input, output, or both
  in_options, out_options = Hash.new, {row_sep: $INPUT_RECORD_SEPARATOR}
  if args.last.is_a? Hash
    args.pop.each do |key, value|
      case key.to_s
      when /\Ain(?:put)?_(.+)\Z/
        in_options[1ドル.to_sym] = value
      when /\Aout(?:put)?_(.+)\Z/
        out_options[1ドル.to_sym] = value
      else
        in_options[key]  = value
        out_options[key] = value
      end
    end
  end
  # build input and output wrappers
  input  = new(args.shift || ARGF,    in_options)
  output = new(args.shift || $stdout, out_options)
  # read, yield, write
  input.each do |row|
    yield row
    output << row
  end
end
 
foreach(path, options = Hash.new, &block)

This method is intended as the primary interface for reading CSV files. You pass a path and any options you wish to set for the read. Each row of the file will be passed to the provided block in turn.

The options parameter can be anything ::new understands. This method also understands an additional :encoding parameter that you can use to specify the Encoding of the data in the file to be read. You must provide this unless your data is in Encoding::default_external(). CSV will use this to determine how to parse the data. You may provide a second Encoding to have the data transcoded as it is read. For example, encoding: "UTF-32BE:UTF-8" would read UTF-32BE data from the file but transcode it to UTF-8 before CSV parses it.
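
For example, a small sketch that passes a ::new option through (the path and the "name" header are illustrative):

CSV.foreach("path/to/file.csv", headers: true) do |row|
  # row is a CSV::Row here, so fields can be fetched by header
  name = row["name"]
end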

 
 # File csv.rb, line 1196
def self.foreach(path, options = Hash.new, &block)
  encoding = options.delete(:encoding)
  mode     = "rb"
  mode << ":#{encoding}" if encoding
  open(path, mode, options) do |csv|
    csv.each(&block)
  end
end
 
generate( str, options = Hash.new ) { |csv| ... }
generate( options = Hash.new ) { |csv| ... }

This method wraps a String you provide, or an empty default String, in a CSV object which is passed to the provided block. You can use the block to append CSV rows to the String and when the block exits, the final String will be returned.

Note that a passed String is modified by this method. Call dup() before passing if you need a new String.

The options parameter can be anything ::new understands. This method understands an additional :encoding parameter when not passed a String to set the base Encoding for the output. CSV needs this hint if you plan to output non-ASCII compatible data.
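
Both forms, sketched:

csv_string = CSV.generate do |csv|
  csv << %w{a b c}
end
# => "a,b,c\n"

existing = "already,here\n"
CSV.generate(existing) { |csv| csv << %w{appended row} }
# existing is now "already,here\nappended,row\n"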

 
 # File csv.rb, line 1223
def self.generate(*args)
  # add a default empty String, if none was given
  if args.first.is_a? String
    io = StringIO.new(args.shift)
    io.seek(0, IO::SEEK_END)
    args.unshift(io)
  else
    encoding = args.last.is_a?(Hash) ? args.last.delete(:encoding) : nil
    str = ""
    str.encode!(encoding) if encoding
    args.unshift(str)
  end
  csv = new(*args)  # wrap
  yield csv         # yield for appending
  csv.string        # return final String
end
 
generate_line(row, options = Hash.new)

This method is a shortcut for converting a single row (Array) into a CSV String.

The options parameter can be anything ::new understands. This method understands an additional :encoding parameter to set the base Encoding for the output. This method will try to guess your Encoding from the first non-nil field in row, if possible, but you may need to use this parameter as a backup plan.

The :row_sep option defaults to $INPUT_RECORD_SEPARATOR ($/) when calling this method.
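
For example:

CSV.generate_line([1, "two", nil, "fo,ur"])
# => "1,two,,\"fo,ur\"\n"

CSV.generate_line(%w{a b c}, col_sep: ";")
# => "a;b;c\n"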

 
 # File csv.rb, line 1253
def self.generate_line(row, options = Hash.new)
  options  = {row_sep: $INPUT_RECORD_SEPARATOR}.merge(options)
  encoding = options.delete(:encoding)
  str      = ""
  if encoding
    str.force_encoding(encoding)
  elsif field = row.find { |f| not f.nil? }
    str.force_encoding(String(field).encoding)
  end
  (new(str, options) << row).string
end
 
instance(data = $stdout, options = Hash.new)

This method will return a CSV instance, just like ::new, but the instance will be cached and returned for all future calls to this method for the same data object (tested by Object#object_id()) with the same options.

If a block is given, the instance is passed to the block and the return value becomes the return value of the block.
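
A short sketch of the caching behavior:

io = StringIO.new
a  = CSV.instance(io)
b  = CSV.instance(io)
a.equal?(b)  # => true, the same wrapped instance is reused

CSV.instance(io) { |csv| csv << %w{logged row} }  # the block's return value is returned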

 
 # File csv.rb, line 988
def self.instance(data = $stdout, options = Hash.new)
  # create a _signature_ for this method call, data object and options
  sig = [data.object_id] +
        options.values_at(*DEFAULT_OPTIONS.keys.sort_by { |sym| sym.to_s })
  # fetch or create the instance for this signature
  @@instances ||= Hash.new
  instance = (@@instances[sig] ||= new(data, options))
  if block_given?
    yield instance  # run block, if given, returning result
  else
    instance        # or return the instance
  end
end
 
load(io_or_str, options = Hash.new)

This method is the reading counterpart to ::dump. See that method for a detailed description of the process.

You can customize loading by adding a class method called csv_load() which will be passed a Hash of meta information, an Array of headers, and an Array of fields for the object the method is expected to return.

Remember that all fields will be Strings after this load. If you need something else, use options to setup converters or provide a custom csv_load() implementation.
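
A hedged round trip with ::dump (the Person Struct is illustrative):

Person     = Struct.new(:name, :age)
csv_string = CSV.dump([Person.new("Alice", 30)])
people     = CSV.load(csv_string)
# people should be an Array of Person objects, with String fields (age == "30")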

 
 # File csv.rb, line 1096
def self.load(io_or_str, options = Hash.new)
  csv = new(io_or_str, options)
  # load meta information
  meta = Hash[*csv.shift]
  cls  = meta["class".encode(csv.encoding)].split("::".encode(csv.encoding)).
         inject(Object) do |c, const|
    c.const_get(const)
  end
  # load headers
  headers = csv.shift
  # unserialize each object stored in the file
  results = csv.inject(Array.new) do |all, row|
    begin
      obj = cls.csv_load(meta, headers, row)
    rescue NoMethodError
      obj = cls.allocate
      headers.zip(row) do |name, value|
        if name[0] == ?@
          obj.instance_variable_set(name, value)
        else
          obj.send(name, value)
        end
      end
    end
    all << obj
  end
  csv.close unless io_or_str.is_a? String
  results
end
 
new(data, options = Hash.new)

This constructor will wrap either a String or IO object passed in data for reading and/or writing. In addition to the CSV instance methods, several IO methods are delegated. (See ::open for a complete list.) If you pass a String for data, you can later retrieve it (after writing to it, for example) with CSV.string().

Note that a wrapped String will be positioned at the beginning (for reading). If you want it at the end (for writing), use ::generate. If you want any other positioning, pass a preset StringIO object instead.

You may set any reading and/or writing preferences in the options Hash. Available options are:

:col_sep

The String placed between each field. This String will be transcoded into the data's Encoding before parsing.

:row_sep

The String appended to the end of each row. This can be set to the special :auto setting, which requests that CSV automatically discover this from the data. Auto-discovery reads ahead in the data looking for the next "\r\n", "\n", or "\r" sequence. A sequence will be selected even if it occurs in a quoted field, assuming that you would have the same line endings there. If none of those sequences is found, if data is ARGF, STDIN, STDOUT, or STDERR, or if the stream is only available for output, the default $INPUT_RECORD_SEPARATOR ($/) is used. Obviously, discovery takes a little time. Set manually if speed is important. Also note that IO objects should be opened in binary mode on Windows if this feature will be used as the line-ending translation can cause problems with resetting the document position to where it was before the read ahead. This String will be transcoded into the data's Encoding before parsing.

:quote_char

The character used to quote fields. This has to be a single character String. This is useful for applications that incorrectly use ' as the quote character instead of the correct ". CSV will always consider a double sequence of this character to be an escaped quote. This String will be transcoded into the data's Encoding before parsing.

:field_size_limit

This is a maximum size CSV will read ahead looking for the closing quote for a field. (In truth, it reads to the first line ending beyond this size.) If a quote cannot be found within the limit CSV will raise a MalformedCSVError, assuming the data is faulty. You can use this limit to prevent what are effectively DoS attacks on the parser. However, this limit can cause a legitimate parse to fail and thus is set to nil, or off, by default.

:converters

An Array of names from the Converters Hash and/or lambdas that handle custom conversion. A single converter doesn't have to be in an Array. All built-in converters try to transcode fields to UTF-8 before converting. The conversion will fail if the data cannot be transcoded, leaving the field unchanged.

:unconverted_fields

If set to true, an unconverted_fields() method will be added to all returned rows (Array or CSV::Row) that will return the fields as they were before conversion. Note that :headers supplied by Array or String were not fields of the document and thus will have an empty Array attached.

:headers

If set to :first_row or true, the initial row of the CSV file will be treated as a row of headers. If set to an Array, the contents will be used as the headers. If set to a String, the String is run through a call of ::parse_line with the same :col_sep, :row_sep, and :quote_char as this instance to produce an Array of headers. This setting causes #shift to return rows as CSV::Row objects instead of Arrays and #read to return CSV::Table objects instead of an Array of Arrays.

:return_headers

When false, header rows are silently swallowed. If set to true, header rows are returned in a CSV::Row object with identical headers and fields (save that the fields do not go through the converters).

:write_headers

When true and :headers is set, a header row will be added to the output.

:header_converters

Identical in functionality to :converters save that the conversions are only made to header rows. All built-in converters try to transcode headers to UTF-8 before converting. The conversion will fail if the data cannot be transcoded, leaving the header unchanged.

:skip_blanks

When set to a true value, CSV will skip over any rows with no content.

:force_quotes

When set to a true value, CSV will quote all CSV fields it creates.

See CSV::DEFAULT_OPTIONS for the default settings.

Options cannot be overridden in the instance methods for performance reasons, so be sure to set what you want here.
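
A brief sketch of wrapping a String with a few of these options:

csv = CSV.new("name;age\nAlice;30\n", col_sep:    ";",
                                      headers:    true,
                                      converters: :numeric)
row = csv.shift  # a CSV::Row, because headers are enabled
row["age"]       # => 30, thanks to the :numeric converter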

 
 # File csv.rb, line 1551
def initialize(data, options = Hash.new)
  # build the options for this read/write
  options = DEFAULT_OPTIONS.merge(options)
  # create the IO object we will read from
  @io = if data.is_a? String then StringIO.new(data) else data end
  # honor the IO encoding if we can, otherwise default to ASCII-8BIT
  @encoding = raw_encoding || Encoding.default_internal || Encoding.default_external
  #
  # prepare for building safe regular expressions in the target encoding,
  # if we can transcode the needed characters
  #
  @re_esc   = "\\".encode(@encoding) rescue ""
  @re_chars = %w[ \\ . [ ] - ^ $ ?
                  * + { } ( ) | #
                  \ \r \n \t \f \v ].
              map { |s| s.encode(@encoding) rescue nil }.compact
  init_separators(options)
  init_parsers(options)
  init_converters(options)
  init_headers(options)
  unless options.empty?
    raise ArgumentError, "Unknown options: #{options.keys.join(', ')}."
  end
  # track our own lineno since IO gets confused about line-ends in CSV fields
  @lineno = 0
end
 
open( filename, mode = "rb", options = Hash.new ) { |faster_csv| ... }
open( filename, options = Hash.new ) { |faster_csv| ... }
open( filename, mode = "rb", options = Hash.new )
open( filename, options = Hash.new )

This method opens an IO object, and wraps that with CSV. This is intended as the primary interface for writing a CSV file.

You must pass a filename and may optionally add a mode for Ruby's open(). You may also pass an optional Hash containing any options ::new understands as the final argument.

This method works like Ruby's open() call, in that it will pass a CSV object to a provided block and close it when the block terminates, or it will return the CSV object when no block is provided. (Note: This is different from the Ruby 1.8 CSV library which passed rows to the block. Use ::foreach for that behavior.)

You must provide a mode with an embedded Encoding designator unless your data is in Encoding::default_external(). CSV will check the Encoding of the underlying IO object (set by the mode you pass) to determine how to parse the data. You may provide a second Encoding to have the data transcoded as it is read just as you can with a normal call to IO::open(). For example, "rb:UTF-32BE:UTF-8" would read UTF-32BE data from the file but transcode it to UTF-8 before CSV parses it.
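
For example (the paths and encodings are illustrative):

# write CSV data in the default external Encoding
CSV.open("path/to/out.csv", "wb") do |csv|
  csv << %w{one two three}
end

# read UTF-32BE data, transcoding it to UTF-8 before parsing
CSV.open("path/to/legacy.csv", "rb:UTF-32BE:UTF-8") do |csv|
  csv.each { |row| p row }  # rows arrive as UTF-8 Strings
end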

An opened CSV object will delegate to many IO methods for convenience. You may call:

  • binmode()

  • binmode?()

  • close()

  • close_read()

  • close_write()

  • closed?()

  • eof()

  • eof?()

  • external_encoding()

  • fcntl()

  • fileno()

  • flock()

  • flush()

  • fsync()

  • internal_encoding()

  • ioctl()

  • isatty()

  • path()

  • pid()

  • pos()

  • pos=()

  • reopen()

  • seek()

  • stat()

  • sync()

  • sync=()

  • tell()

  • to_i()

  • to_io()

  • truncate()

  • tty?()

 
 # File csv.rb, line 1328
def self.open(*args)
  # find the +options+ Hash
  options = if args.last.is_a? Hash then args.pop else Hash.new end
  # default to a binary open mode
  args << "rb" if args.size == 1
  # wrap a File opened with the remaining +args+
  csv = new(File.open(*args), options)
  # handle blocks like Ruby's open(), not like the CSV library
  if block_given?
    begin
      yield csv
    ensure
      csv.close
    end
  else
    csv
  end
end
 
parse( str, options = Hash.new ) { |row| ... }
parse( str, options = Hash.new )

This method can be used to easily parse CSV out of a String. You may either provide a block which will be called with each row of the String in turn, or just use the returned Array of Arrays (when no block is given).

You pass your str to read from, and an optional options Hash containing anything ::new understands.
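
Both forms, sketched:

CSV.parse("1,2,3\n4,5,6\n")                  # => [["1", "2", "3"], ["4", "5", "6"]]

CSV.parse("1,2,3\n4,5,6\n") { |row| p row }  # prints each row in turn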

 
 # File csv.rb, line 1360
def self.parse(*args, &block)
  csv = new(*args)
  if block.nil?  # slurp contents, if no block is given
    begin
      csv.read
    ensure
      csv.close
    end
  else  # or pass each row to a provided block
    csv.each(&block)
  end
end
 
parse_line(line, options = Hash.new)

This method is a shortcut for converting a single line of a CSV String into an Array. Note that if line contains multiple rows, anything beyond the first row is ignored.

The options parameter can be anything ::new understands.
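
For example:

CSV.parse_line("1,2,3")                # => ["1", "2", "3"]
CSV.parse_line("a;b;c", col_sep: ";")  # => ["a", "b", "c"]
CSV.parse_line("1,2\n3,4")             # => ["1", "2"] -- the second row is ignored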

 
 # File csv.rb, line 1380
def self.parse_line(line, options = Hash.new)
  new(line, options).shift
end
 
read(path, options = Hash.new)

Use to slurp a CSV file into an Array of Arrays. Pass the path to the file and any options ::new understands. This method also understands an additional :encoding parameter that you can use to specify the Encoding of the data in the file to be read. You must provide this unless your data is in Encoding::default_external(). CSV will use this to determine how to parse the data. You may provide a second Encoding to have the data transcoded as it is read. For example, encoding: "UTF-32BE:UTF-8" would read UTF-32BE data from the file but transcode it to UTF-8 before CSV parses it.

 
 # File csv.rb, line 1395
def self.read(path, options = Hash.new)
  encoding = options.delete(:encoding)
  mode     = "rb"
  mode << ":#{encoding}" if encoding
  open(path, mode, options) { |csv| csv.read }
end
 
readlines(*args)

Alias for ::read.

 
 # File csv.rb, line 1403
def self.readlines(*args)
  read(*args)
end
 
table(path, options = Hash.new)

A shortcut for:

CSV.read( path, { headers:           true,
                  converters:        :numeric,
                  header_converters: :symbol }.merge(options) )
 
 # File csv.rb, line 1414
def self.table(path, options = Hash.new)
  read( path, { headers:           true,
                converters:        :numeric,
                header_converters: :symbol }.merge(options) )
end
 

Public Instance Methods

<<(row)

The primary write method for wrapped Strings and IOs, row (an Array or CSV::Row) is converted to CSV and appended to the data source. When a CSV::Row is passed, only the row's fields() are appended to the output.

The data source must be open for writing.
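
A sketch of appending an Array and a CSV::Row while generating a String:

csv_string = CSV.generate do |csv|
  csv << ["plain", "Array", "row"]
  csv << CSV::Row.new(%w{a b c}, [1, 2, 3])  # only the Row's fields are written
end
# csv_string should be "plain,Array,row\n1,2,3\n"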

 
 # File csv.rb, line 1688
def <<(row)
  # make sure headers have been assigned
  if header_row? and [Array, String].include? @use_headers.class
    parse_headers  # won't read data for Array or String
    self << @headers if @write_headers
  end
  # handle CSV::Row objects and Hashes
  row = case row
        when self.class::Row then row.fields
        when Hash            then @headers.map { |header| row[header] }
        else                      row
        end
  @headers = row if header_row?
  @lineno += 1
  output = row.map(&@quote).join(@col_sep) + @row_sep  # quote and separate
  if @io.is_a?(StringIO)             and
     output.encoding != raw_encoding and
     (compatible_encoding = Encoding.compatible?(@io.string, output))
    @io = StringIO.new(@io.string.force_encoding(compatible_encoding))
    @io.seek(0, IO::SEEK_END)
  end
  @io << output
  self  # for chaining
end
 
Also aliased as: add_row, puts
add_row(row)
Alias for: <<
convert( name )
convert { |field| ... }
convert { |field, field_info| ... }

You can use this method to install a CSV::Converters built-in, or provide a block that handles a custom conversion.

If you provide a block that takes one argument, it will be passed the field and is expected to return the converted value or the field itself. If your block takes two arguments, it will also be passed a CSV::FieldInfo Struct, containing details about the field. Again, the block should return a converted field or the field itself.
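
A sketch combining a built-in converter with a custom block that uses the FieldInfo Struct (the data is illustrative):

csv = CSV.new("Alice,18,2014年07月01日\nBob,21,2014年08月15日")
csv.convert(:integer)
csv.convert do |field, info|
  info.index == 2 ? Date.parse(field) : field  # only touch the third column
end
row = csv.shift
# row should be ["Alice", 18, #<Date ...>] -- a String, an Integer, and a Date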

 
 # File csv.rb, line 1734
def convert(name = nil, &converter)
  add_converter(:converters, self.class::Converters, name, &converter)
end
 
converters()

Returns the current list of converters in effect. See ::new for details. Built-in converters will be returned by name, while others will be returned as is.

 
 # File csv.rb, line 1604
def converters
  @converters.map do |converter|
    name = Converters.rassoc(converter)
    name ? name.first : converter
  end
end
 
each()

Yields each row of the data source in turn.

Support for Enumerable.

The data source must be open for reading.

 
 # File csv.rb, line 1765
def each
  while row = shift
    yield row
  end
end
 
force_quotes?()

Returns true if all output fields are quoted. See ::new for details.

 
 # File csv.rb, line 1647
def force_quotes?() @force_quotes end
 
gets()
Alias for: shift
header_convert( name )
header_convert { |field| ... }
header_convert { |field, field_info| ... }

Identical to #convert, but for header rows.

Note that this method must be called before header rows are read to have any effect.

 
 # File csv.rb, line 1749
def header_convert(name = nil, &converter)
  add_converter( :header_converters,
                 self.class::HeaderConverters,
                 name,
                 &converter )
end
 
header_converters()

Returns the current list of converters in effect for headers. See ::new for details. Built-in converters will be returned by name, while others will be returned as is.

 
 # File csv.rb, line 1635
def header_converters
  @header_converters.map do |converter|
    name = HeaderConverters.rassoc(converter)
    name ? name.first : converter
  end
end
 
header_row?()

Returns true if the next row read will be a header row.

 
 # File csv.rb, line 1787
def header_row?
  @use_headers and @headers.nil?
end
 
headers()

Returns nil if headers will not be used, true if they will but have not yet been read, or the actual headers after they have been read. See ::new for details.

 
 # File csv.rb, line 1620
def headers
  @headers || true if @use_headers
end
 
inspect()

Returns a simplified description of the key CSV attributes in an ASCII compatible String.

 
 # File csv.rb, line 1943
def inspect
  str = ["<#", self.class.to_s, " io_type:"]
  # show type of wrapped IO
  if    @io == $stdout then str << "$stdout"
  elsif @io == $stdin  then str << "$stdin"
  elsif @io == $stderr then str << "$stderr"
  else                      str << @io.class.to_s
  end
  # show IO.path(), if available
  if @io.respond_to?(:path) and (p = @io.path)
    str << " io_path:" << p.inspect
  end
  # show encoding
  str << " encoding:" << @encoding.name
  # show other attributes
  %w[ lineno col_sep row_sep
      quote_char skip_blanks ].each do |attr_name|
    if a = instance_variable_get("@#{attr_name}")
      str << " " << attr_name << ":" << a.inspect
    end
  end
  if @use_headers
    str << " headers:" << headers.inspect
  end
  str << ">"
  begin
    str.join
  rescue  # any encoding error
    str.map do |s|
      e = Encoding::Converter.asciicompat_encoding(s.encoding)
      e ? s.encode(e) : s.force_encoding("ASCII-8BIT")
    end.join
  end
end
 
puts(row)
Alias for: <<
read()

Slurps the remaining rows and returns an Array of Arrays.

The data source must be open for reading.

 
 # File csv.rb, line 1776
def read
  rows = to_a
  if @use_headers
    Table.new(rows)
  else
    rows
  end
end
 
Also aliased as: readlines
readline()
Alias for: shift
readlines()
Alias for: read
return_headers?()

Returns true if headers will be returned as a row of results. See ::new for details.

 
 # File csv.rb, line 1627
def return_headers?() @return_headers end
 
rewind()

Rewinds the underlying IO object and resets CSV's lineno() counter.

 
 # File csv.rb, line 1672
def rewind
  @headers = nil
  @lineno  = 0
  @io.rewind
end
 
shift()

The primary read method for wrapped Strings and IOs, a single row is pulled from the data source, parsed and returned as an Array of fields (if header rows are not used) or a CSV::Row (when header rows are used).

The data source must be open for reading.

 
 # File csv.rb, line 1798
def shift
  #########################################################################
  ### This method is purposefully kept a bit long as simple conditional ###
  ### checks are faster than numerous (expensive) method calls.         ###
  #########################################################################
  # handle headers not based on document content
  if header_row? and @return_headers and
     [Array, String].include? @use_headers.class
    if @unconverted_fields
      return add_unconverted_fields(parse_headers, Array.new)
    else
      return parse_headers
    end
  end
  # begin with a blank line, so we can always add to it
  line = ""
  #
  # it can take multiple calls to <tt>@io.gets()</tt> to get a full line,
  # because of \r and/or \n characters embedded in quoted fields
  #
  in_extended_col = false
  csv             = Array.new
  loop do
    # add another read to the line
    unless parse = @io.gets(@row_sep)
      return nil
    end
    parse.sub!(@parsers[:line_end], "")
    if csv.empty?
      #
      # I believe a blank line should be an <tt>Array.new</tt>, not Ruby 1.8
      # CSV's <tt>[nil]</tt>
      #
      if parse.empty?
        @lineno += 1
        if @skip_blanks
          next
        elsif @unconverted_fields
          return add_unconverted_fields(Array.new, Array.new)
        elsif @use_headers
          return self.class::Row.new(Array.new, Array.new)
        else
          return Array.new
        end
      end
    end
    parts = parse.split(@col_sep, -1)
    if parts.empty?
      if in_extended_col
        csv[-1] << @col_sep  # will be replaced with a @row_sep after the parts.each loop
      else
        csv << nil
      end
    end
    # This loop is the hot path of csv parsing. Some things may be non-dry
    # for a reason. Make sure to benchmark when refactoring.
    parts.each do |part|
      if in_extended_col
        # If we are continuing a previous column
        if part[-1] == @quote_char && part.count(@quote_char) % 2 != 0
          # extended column ends
          csv.last << part[0..-2]
          raise MalformedCSVError if csv.last =~ @parsers[:stray_quote]
          csv.last.gsub!(@quote_char * 2, @quote_char)
          in_extended_col = false
        else
          csv.last << part
          csv.last << @col_sep
        end
      elsif part[0] == @quote_char
        # If we are starting a new quoted column
        if part[-1] != @quote_char || part.count(@quote_char) % 2 != 0
          # start an extended column
          csv << part[1..-1]
          csv.last << @col_sep
          in_extended_col = true
        else
          # regular quoted column
          csv << part[1..-2]
          raise MalformedCSVError if csv.last =~ @parsers[:stray_quote]
          csv.last.gsub!(@quote_char * 2, @quote_char)
        end
      elsif part =~ @parsers[:quote_or_nl]
        # Unquoted field with bad characters.
        if part =~ @parsers[:nl_or_lf]
          raise MalformedCSVError, "Unquoted fields do not allow " +
                                   "\\r or \\n (line #{lineno + 1})."
        else
          raise MalformedCSVError, "Illegal quoting on line #{lineno + 1}."
        end
      else
        # Regular ole unquoted field.
        csv << (part.empty? ? nil : part)
      end
    end
    # Replace tacked on @col_sep with @row_sep if we are still in an extended
    # column.
    csv[-1][-1] = @row_sep if in_extended_col
    if in_extended_col
      # if we're at eof?(), a quoted field wasn't closed...
      if @io.eof?
        raise MalformedCSVError,
              "Unclosed quoted field on line #{lineno + 1}."
      elsif @field_size_limit and csv.last.size >= @field_size_limit
        raise MalformedCSVError, "Field size exceeded on line #{lineno + 1}."
      end
      # otherwise, we need to loop and pull some more data to complete the row
    else
      @lineno += 1
      # save unconverted fields, if needed...
      unconverted = csv.dup if @unconverted_fields
      # convert fields, if needed...
      csv = convert_fields(csv) unless @use_headers or @converters.empty?
      # parse out header rows and handle CSV::Row conversions...
      csv = parse_headers(csv) if @use_headers
      # inject unconverted fields and accessor, if requested...
      if @unconverted_fields and not csv.respond_to? :unconverted_fields
        add_unconverted_fields(csv, unconverted)
      end
      # return the results
      break csv
    end
  end
end
 
Also aliased as: gets, readline
skip_blanks?()

Returns true if blank lines are skipped by the parser. See ::new for details.

 
 # File csv.rb, line 1645
def skip_blanks?() @skip_blanks end
 
unconverted_fields?()

Returns true if unconverted_fields() will be added to parsed results. See ::new for details.

 
 # File csv.rb, line 1614
def unconverted_fields?() @unconverted_fields end
 
write_headers?()

Returns true if headers are written in output. See ::new for details.

 
 # File csv.rb, line 1629
def write_headers?() @write_headers end
 
