
Commit bf20812

Merge branch 'master' of github.com:Keyang/node-csvtojson

2 parents: 306c123 + 98fe0f3

6 files changed: 47 additions & 26 deletions

docs/csvtojson-v2.md

Lines changed: 2 additions & 2 deletions

@@ -222,15 +222,15 @@ const jsonArr=await csv().fromFile(myFile);
 ```js
 csv({
-	ignoreColumn:["gender","age"]
+	ignoreColumns:["gender","age"]
 })
 ```

 **Now**

 ```js
 csv({
-	ignoreColumn: /gender|age/
+	ignoreColumns: /gender|age/
 })
 ```
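The rename above (`ignoreColumn` becomes `ignoreColumns`, now given as a RegExp) can be illustrated without the library. The header names below are hypothetical, and this sketch only mimics how a RegExp filter would select columns; it is not csvtojson's internal code:

```javascript
// Hypothetical header row; csvtojson v2 tests each header name against
// the `ignoreColumns` RegExp rather than looking it up in an array.
const ignoreColumns = /gender|age/;
const headers = ["name", "gender", "age", "email"];

// Keep only the columns whose names do not match the pattern.
const kept = headers.filter((h) => !ignoreColumns.test(h));
console.log(kept); // → [ 'name', 'email' ]
```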

readme.md

Lines changed: 20 additions & 21 deletions

@@ -26,7 +26,7 @@
 `csvtojson` has released version `2.0.0`.
 * To upgrade to v2, please follow [upgrading guide](https://github.com/Keyang/node-csvtojson/blob/master/docs/csvtojson-v2.md)
-* If you are looking for documentation for `v1`, open [this page](https://github.com/Keyang/node-csvtojson/blob/master/docs/readme.v1.md).
+* If you are looking for documentation for `v1`, open [this page](https://github.com/Keyang/node-csvtojson/blob/master/docs/readme.v1.md)

 It is still able to use v1 with `csvtojson@2.0.0`

@@ -195,20 +195,19 @@ $ csvtojson
 # API

 * [Parameters](#parameters)
-* [Asynchronouse Result Process](#asynchronouse-result-process)
+* [Asynchronous Result Process](#asynchronous-result-process)
 * [Events](#events)
 * [Hook / Transform](#hook--transform)
 * [Nested JSON Structure](#nested-json-structure)
 * [Header Row](#header-row)
-* [Multi CPU Core Support(experimental) ](#multi-cpu-core-support)
 * [Column Parser](#column-parser)


 ## Parameters

 `require('csvtojson')` returns a constructor function which takes 2 arguments:

-1. parser parameters
+1. Parser parameters
 2. Stream options

 ```js
@@ -230,16 +229,16 @@ const converter=csv({
 Following parameters are supported:

 * **output**: The format to be converted to. "json" (default) -- convert csv to json. "csv" -- convert csv to csv row array. "line" -- convert csv to csv line string
-* **delimiter**: delimiter used for seperating columns. Use "auto" if delimiter is unknown in advance, in this case, delimiter will be auto-detected (by best attempt). Use an array to give a list of potential delimiters e.g. [",","|","$"]. default: ","
-* **quote**: If a column contains delimiter, it is able to use quote character to surround the column content. e.g. "hello, world" wont be split into two columns while parsing. Set to "off" will ignore all quotes. default: " (double quote)
+* **delimiter**: delimiter used for separating columns. Use "auto" if delimiter is unknown in advance, in this case, delimiter will be auto-detected (by best attempt). Use an array to give a list of potential delimiters e.g. [",","|","$"]. default: ","
+* **quote**: If a column contains delimiter, it is able to use quote character to surround the column content. e.g. "hello, world" won't be split into two columns while parsing. Set to "off" will ignore all quotes. default: " (double quote)
 * **trim**: Indicate if parser trim off spaces surrounding column content. e.g. " content " will be trimmed to "content". Default: true
 * **checkType**: This parameter turns on and off whether check field type. Default is false. (The default is `true` if version < 1.1.4)
 * **ignoreEmpty**: Ignore the empty value in CSV columns. If a column value is not given, set this to true to skip them. Default: false.
 * **fork (experimental)**: Fork another process to parse the CSV stream. It is effective if many concurrent parsing sessions for large csv files. Default: false
 * **noheader**:Indicating csv data has no header row and first row is data row. Default is false. See [header row](#header-row)
 * **headers**: An array to specify the headers of CSV data. If --noheader is false, this value will override CSV header row. Default: null. Example: ["my field","name"]. See [header row](#header-row)
 * **flatKeys**: Don't interpret dots (.) and square brackets in header fields as nested object or array identifiers at all (treat them like regular characters for JSON field identifiers). Default: false.
-* **maxRowLength**: the max character a csv row could have. 0 means infinite. If max number exceeded, parser will emit "error" of "row_exceed". if a possibly corrupted csv data provided, give it a number like 65535 so the parser wont consume memory. default: 0
+* **maxRowLength**: the max character a csv row could have. 0 means infinite. If max number exceeded, parser will emit "error" of "row_exceed". if a possibly corrupted csv data provided, give it a number like 65535 so the parser won't consume memory. default: 0
 * **checkColumn**: whether check column number of a row is the same as headers. If column number mismatched headers number, an error of "mismatched_column" will be emitted.. default: false
 * **eol**: End of line character. If omitted, parser will attempt to retrieve it from the first chunks of CSV data.
 * **escape**: escape character used in quoted column. Default is double quote (") according to RFC4108. Change to back slash (\\) or other chars for your own case.
@@ -252,18 +251,18 @@ Following parameters are supported:
 * **needEmitAll**: Parser will build JSON result is `.then` is called (or await is used). If this is not desired, set this to false. Default is true.
 All parameters can be used in Command Line tool.

-## Asynchronouse Result Process
+## Asynchronous Result Process

-Since `v2.0.0`, asynchronouse processing has been fully supported.
+Since `v2.0.0`, asynchronous processing has been fully supported.

-e.g. Process each JSON result asynchronousely.
+e.g. Process each JSON result asynchronously.

 ```js
 csv().fromFile(csvFile)
 .subscribe((json)=>{
 	return new Promise((resolve,reject)=>{
 		// Async operation on the json
-		// dont forget to call resolve and reject
+		// don't forget to call resolve and reject
 	})
 })
 ```
@@ -294,7 +293,7 @@ csv()

 ### data

-`data` event is emitted for each parsed CSV line. It passes buffer of strigified JSON in [ndjson format](http://ndjson.org/) unless `objectMode` is set true in stream option.
+`data` event is emitted for each parsed CSV line. It passes buffer of stringified JSON in [ndjson format](http://ndjson.org/) unless `objectMode` is set true in stream option.

 ```js
 const csv=require('csvtojson')
@@ -306,7 +305,7 @@ csv()
 ```

 ### error
-`error` event is emitted if there is any errors happened during parsing.
+`error` event is emitted if any errors happened during parsing.

 ```js
 const csv=require('csvtojson')
@@ -350,7 +349,7 @@ csv()
 	return newData;
 })

-// asynchronouse
+// asynchronous
 csv()
 .preRawData((csvRawData)=>{
 	return new Promise((resolve,reject)=>{
@@ -363,7 +362,7 @@ csv()

 ### CSV File Line Hook

-the function is called each time a file line has been parsed in csv stream. the `lineIdx` is the file line number in the file starting with 0.
+The function is called each time a file line has been parsed in csv stream. The `lineIdx` is the file line number in the file starting with 0.

 ```js
 const csv=require('csvtojson')
@@ -376,7 +375,7 @@ csv()
 	return fileLineString
 })

-// asynchronouse
+// asynchronous
 csv()
 .preFileLine((fileLineString, lineIdx)=>{
 	return new Promise((resolve,reject)=>{
@@ -398,7 +397,7 @@ const csv=require('csvtojson')
 csv()
 .subscribe((jsonObj,index)=>{
 	jsonObj.myNewKey='some value'
-	// OR asynchronousely
+	// OR asynchronously
 	return new Promise((resolve,reject)=>{
 		jsonObj.myNewKey='some value';
 		resolve();
@@ -489,7 +488,7 @@ csv({flatKeys:true})
 1. First row of csv source. Use first row of csv source as header row. This is default.
 2. If first row of csv source is header row but it is incorrect and need to be replaced. Use `headers:[]` and `noheader:false` parameters.
 3. If original csv source has no header row but the header definition can be defined. Use `headers:[]` and `noheader:true` parameters.
-4. If original csv source has no header row and the header definition is unknow. Use `noheader:true`. This will automatically add `fieldN` header to csv cells
+4. If original csv source has no header row and the header definition is unknown. Use `noheader:true`. This will automatically add `fieldN` header to csv cells


 ### Example
@@ -581,7 +580,7 @@ csv({

 Above example will convert `birthday` column into a js `Date` object.

-the returned value will be used in result JSON object. returning `undefined` will not change result JSON object.
+The returned value will be used in result JSON object. Returning `undefined` will not change result JSON object.

 ### Flat key column

@@ -617,7 +616,7 @@ Very much appreciate any types of donation and support.

 1. Fork the repo to your github account
 2. Checkout code from your github repo to your local machine.
-3. Make code changes and dont forget add related tests.
+3. Make code changes and don't forget add related tests.
 4. Run `npm test` locally before pushing code back.
 5. Create a [Pull Request](https://help.github.com/articles/creating-a-pull-request/) on github.
 6. Code review and merge
@@ -675,7 +674,7 @@ If a module packager is preferred, just simply `require("csvtojson")`:
 var csv=require("csvtojson");

 // or with import
-import * as csv from "csvtojson");
+import * as csv from "csvtojson";

 //then use csv as normal
 ```

src/rowSplit.ts

Lines changed: 7 additions & 1 deletion

@@ -75,9 +75,15 @@ export class RowSplit {
       continue;
     } else if (e.indexOf(quote) !== -1) {
       let count = 0;
+      let prev = "";
       for (const c of e) {
-        if (c === quote) {
+        // count quotes only if previous character is not escape char
+        if (c === quote && prev !== this.escape) {
           count++;
+          prev = "";
+        } else {
+          // save previous char to temp variable
+          prev = c;
         }
       }
       if (count % 2 === 1) {
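The change above can be sketched as a standalone function (a simplified stand-in for RowSplit's loop, not the library's actual API): an odd count of unescaped quotes means the quoted column is still open.

```javascript
// Count quote characters, skipping any quote preceded by the escape
// character, mirroring the prev-tracking loop in the diff above.
function countUnescapedQuotes(str, quote, escape) {
  let count = 0;
  let prev = "";
  for (const c of str) {
    // count quotes only if the previous character is not the escape char
    if (c === quote && prev !== escape) {
      count++;
      prev = "";
    } else {
      prev = c;
    }
  }
  return count;
}

// The fixture row from this commit: the two real quotes are counted,
// the three escaped ones are not, so the column is balanced (even count).
console.log(countUnescapedQuotes('0,"\\"hello,\\"world\\""', '"', "\\")); // → 2
```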
test/data/dataWithSlashEscapeAndDelimiterBetweenQuotes (new file)

Lines changed: 2 additions & 0 deletions

@@ -0,0 +1,2 @@
+id,raw
+0,"\"hello,\"world\""

test/testCSVConverter2.ts

Lines changed: 16 additions & 1 deletion

@@ -294,7 +294,22 @@ describe("testCSVConverter2", function () {
   rs.pipe(test_converter);
 });

-it("should output ndjson format", function (done) {
+it("should process escape chars when delimiter is between escaped quotes", function(done) {
+  var test_converter = new Converter({
+    escape: "\\"
+  });
+
+  var testData =
+    __dirname + "/data/dataWithSlashEscapeAndDelimiterBetweenQuotes";
+  var rs = fs.createReadStream(testData);
+  test_converter.then(function(res) {
+    assert.equal(res[0].raw, '"hello,"world"');
+    done();
+  });
+  rs.pipe(test_converter);
+});
+
+it("should output ndjson format", function(done) {
   var conv = new Converter();
   conv.fromString("a,b,c\n1,2,3\n4,5,6")
   .on("data", function (d) {
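The expected value in the new test (`'"hello,"world"'`) follows from stripping the surrounding quotes and collapsing each escape-plus-quote pair. A minimal sketch of that unescaping step (not csvtojson's internal code):

```javascript
// Hypothetical unescape helper: given the quoted column from the new
// fixture row 0,"\"hello,\"world\"", drop the surrounding quotes and
// replace each escape+quote pair with a bare quote.
function unescapeQuoted(col, quote, escape) {
  const inner = col.slice(1, -1); // strip the surrounding quotes
  return inner.split(escape + quote).join(quote);
}

console.log(unescapeQuoted('"\\"hello,\\"world\\""', '"', "\\")); // → "hello,"world"
```

This reproduces exactly the string the test asserts for `res[0].raw`.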

v1/core/workerMgr.js

Lines changed: 0 additions & 1 deletion

@@ -4,7 +4,6 @@ var eom1 = "\x0e";
 var eom2 = "\x0f";
 var CSVError = require('./CSVError');
 function workerMgr() {
-  var spawn = require("child_process").spawn;
   var exports = {
     initWorker: initWorker,
     sendWorker: sendWorker,
