`csvtojson` has released version `2.0.0`.
* To upgrade to v2, please follow the [upgrading guide](https://github.com/Keyang/node-csvtojson/blob/master/docs/csvtojson-v2.md)
* If you are looking for documentation for `v1`, open [this page](https://github.com/Keyang/node-csvtojson/blob/master/docs/readme.v1.md)
It is still possible to use v1 with `csvtojson@2.0.0`.
# API
* [Parameters](#parameters)
* [Asynchronous Result Process](#asynchronous-result-process)
* [Events](#events)
* [Hook / Transform](#hook--transform)
* [Nested JSON Structure](#nested-json-structure)
* [Header Row](#header-row)
* [Column Parser](#column-parser)
## Parameters
`require('csvtojson')` returns a constructor function which takes 2 arguments:
1. Parser parameters
2. Stream options
The following parameters are supported:
* **output**: The format to be converted to. "json" (default) -- convert csv to json. "csv" -- convert csv to csv row array. "line" -- convert csv to csv line string.
* **delimiter**: Delimiter used for separating columns. Use "auto" if the delimiter is unknown in advance; in this case the delimiter will be auto-detected (by best attempt). Use an array to give a list of potential delimiters, e.g. [",","|","$"]. Default: ","
* **quote**: If a column contains the delimiter, the column content can be surrounded with a quote character, e.g. "hello, world" won't be split into two columns while parsing. Set to "off" to ignore all quotes. Default: " (double quote)
* **trim**: Indicates whether the parser trims spaces surrounding column content, e.g. " content " will be trimmed to "content". Default: true
* **checkType**: Turns field type checking on or off. Default is false. (The default is `true` if version < 1.1.4.)
* **ignoreEmpty**: Ignore empty values in CSV columns. If a column value is not given, set this to true to skip it. Default: false.
* **fork (experimental)**: Fork another process to parse the CSV stream. Effective when there are many concurrent parsing sessions for large csv files. Default: false
* **noheader**: Indicates that the csv data has no header row and the first row is a data row. Default is false. See [header row](#header-row)
* **headers**: An array to specify the headers of CSV data. If --noheader is false, this value will override the CSV header row. Default: null. Example: ["my field","name"]. See [header row](#header-row)
* **flatKeys**: Don't interpret dots (.) and square brackets in header fields as nested object or array identifiers at all (treat them like regular characters for JSON field identifiers). Default: false.
* **maxRowLength**: The maximum number of characters a csv row may have. 0 means infinite. If the maximum is exceeded, the parser will emit an "error" of "row_exceed". If possibly corrupted csv data is provided, give it a number like 65535 so the parser won't consume memory. Default: 0
* **checkColumn**: Whether to check that the column number of a row matches the number of headers. If they mismatch, an error of "mismatched_column" will be emitted. Default: false
* **eol**: End of line character. If omitted, the parser will attempt to retrieve it from the first chunks of CSV data.
* **escape**: Escape character used in quoted columns. Default is double quote (") according to RFC 4180. Change to backslash (\\) or other characters for your own case.
* **needEmitAll**: The parser will build the JSON result if `.then` is called (or await is used). If this is not desired, set this to false. Default is true.
All parameters can also be used in the Command Line tool.
## Asynchronous Result Process
Since `v2.0.0`, asynchronous processing has been fully supported.
e.g. Process each JSON result asynchronously.
```js
csv().fromFile(csvFile)
.subscribe((json)=>{
	return new Promise((resolve,reject)=>{
		// Async operation on the json
		// don't forget to call resolve and reject
	})
})
```
### data
`data` event is emitted for each parsed CSV line. It passes a buffer of stringified JSON in [ndjson format](http://ndjson.org/) unless `objectMode` is set to true in the stream options.
```js
const csv=require('csvtojson')
csv()
.on('data',(data)=>{
	// data is a buffer of stringified JSON for one parsed CSV line
	const jsonStr=data.toString('utf8')
})
```
### error
`error` event is emitted if any error happens during parsing.
```js
const csv=require('csvtojson')
csv()
.on('error',(err)=>{
	console.log(err)
})
```
### Raw CSV Data Hook

```js
// synchronous: transform the raw CSV data and return it
csv()
.preRawData((csvRawData)=>{
	const newData=csvRawData.replace('some value','another value')
	return newData;
})

// asynchronous
csv()
.preRawData((csvRawData)=>{
	return new Promise((resolve,reject)=>{
		// transform asynchronously, then resolve with the new raw CSV data
		resolve(csvRawData.replace('some value','another value'))
	})
})
```
### CSV File Line Hook
The function is called each time a file line has been parsed in the csv stream. The `lineIdx` is the line number in the file, starting with 0.
### Header Row

1. First row of the csv source: use the first row of the csv source as the header row. This is the default.
2. If the first row of the csv source is a header row but it is incorrect and needs to be replaced, use the `headers:[]` and `noheader:false` parameters.
3. If the original csv source has no header row but the header definition can be defined, use the `headers:[]` and `noheader:true` parameters.
4. If the original csv source has no header row and the header definition is unknown, use `noheader:true`. This will automatically add a `fieldN` header to the csv cells.
### Example
The above example will convert the `birthday` column into a js `Date` object.
The returned value will be used in the result JSON object. Returning `undefined` will not change the result JSON object.
### Flat key column
1. Fork the repo to your github account
2. Check out the code from your github repo to your local machine.
3. Make code changes and don't forget to add related tests.
4. Run `npm test` locally before pushing code back.
5. Create a [Pull Request](https://help.github.com/articles/creating-a-pull-request/) on github.
6. Code review and merge