Commit 7c3f557

knoll3 authored and Keyang committed
Fix typos in Readme (#299)
1 parent 92dc7ac commit 7c3f557

1 file changed

Lines changed: 18 additions & 18 deletions

File tree

readme.md

@@ -195,7 +195,7 @@ $ csvtojson
 # API
 
 * [Parameters](#parameters)
-* [Asynchronouse Result Process](#asynchronouse-result-process)
+* [Asynchronous Result Process](#asynchronous-result-process)
 * [Events](#events)
 * [Hook / Transform](#hook--transform)
 * [Nested JSON Structure](#nested-json-structure)
@@ -208,7 +208,7 @@ $ csvtojson
 
 `require('csvtojson')` returns a constructor function which takes 2 arguments:
 
-1. parser parameters
+1. Parser parameters
 2. Stream options
 
 ```js
@@ -230,16 +230,16 @@ const converter=csv({
 Following parameters are supported:
 
 * **output**: The format to be converted to. "json" (default) -- convert csv to json. "csv" -- convert csv to csv row array. "line" -- convert csv to csv line string
-* **delimiter**: delimiter used for seperating columns. Use "auto" if delimiter is unknown in advance, in this case, delimiter will be auto-detected (by best attempt). Use an array to give a list of potential delimiters e.g. [",","|","$"]. default: ","
-* **quote**: If a column contains delimiter, it is able to use quote character to surround the column content. e.g. "hello, world" wont be split into two columns while parsing. Set to "off" will ignore all quotes. default: " (double quote)
+* **delimiter**: delimiter used for separating columns. Use "auto" if delimiter is unknown in advance, in this case, delimiter will be auto-detected (by best attempt). Use an array to give a list of potential delimiters e.g. [",","|","$"]. default: ","
+* **quote**: If a column contains delimiter, it is able to use quote character to surround the column content. e.g. "hello, world" won't be split into two columns while parsing. Set to "off" will ignore all quotes. default: " (double quote)
 * **trim**: Indicate if parser trim off spaces surrounding column content. e.g. " content " will be trimmed to "content". Default: true
 * **checkType**: This parameter turns on and off whether check field type. Default is false. (The default is `true` if version < 1.1.4)
 * **ignoreEmpty**: Ignore the empty value in CSV columns. If a column value is not given, set this to true to skip them. Default: false.
 * **fork (experimental)**: Fork another process to parse the CSV stream. It is effective if many concurrent parsing sessions for large csv files. Default: false
 * **noheader**:Indicating csv data has no header row and first row is data row. Default is false. See [header row](#header-row)
 * **headers**: An array to specify the headers of CSV data. If --noheader is false, this value will override CSV header row. Default: null. Example: ["my field","name"]. See [header row](#header-row)
 * **flatKeys**: Don't interpret dots (.) and square brackets in header fields as nested object or array identifiers at all (treat them like regular characters for JSON field identifiers). Default: false.
-* **maxRowLength**: the max character a csv row could have. 0 means infinite. If max number exceeded, parser will emit "error" of "row_exceed". if a possibly corrupted csv data provided, give it a number like 65535 so the parser wont consume memory. default: 0
+* **maxRowLength**: the max character a csv row could have. 0 means infinite. If max number exceeded, parser will emit "error" of "row_exceed". if a possibly corrupted csv data provided, give it a number like 65535 so the parser won't consume memory. default: 0
 * **checkColumn**: whether check column number of a row is the same as headers. If column number mismatched headers number, an error of "mismatched_column" will be emitted.. default: false
 * **eol**: End of line character. If omitted, parser will attempt to retrieve it from the first chunks of CSV data.
 * **escape**: escape character used in quoted column. Default is double quote (") according to RFC4108. Change to back slash (\\) or other chars for your own case.
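
The `delimiter: "auto"` behaviour in the hunk above detects the separator "by best attempt". As an illustration only (this is NOT csvtojson's actual detection algorithm), a naive auto-detector could pick the candidate that splits a few sample lines into the same, largest column count:

```javascript
// Naive delimiter auto-detection sketch (not csvtojson's real implementation):
// choose the candidate that yields more than one column and a consistent
// column count across every sample line.
function guessDelimiter(sampleLines, candidates = [",", "|", "\t", ";"]) {
  let best = ",";
  let bestCols = 1;
  for (const d of candidates) {
    const counts = sampleLines.map((line) => line.split(d).length);
    const consistent = counts.every((c) => c === counts[0]);
    if (consistent && counts[0] > bestCols) {
      bestCols = counts[0];
      best = d;
    }
  }
  return best;
}

console.log(guessDelimiter(["a|b|c", "1|2|3", "4|5|6"])); // "|"
```

A real detector also has to ignore delimiters inside quoted fields, which is why the sketch above is only a rough approximation of what "auto" does.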
@@ -250,18 +250,18 @@ Following parameters are supported:
 
 All parameters can be used in Command Line tool.
 
-## Asynchronouse Result Process
+## Asynchronous Result Process
 
-Since `v2.0.0`, asynchronouse processing has been fully supported.
+Since `v2.0.0`, asynchronous processing has been fully supported.
 
-e.g. Process each JSON result asynchronousely.
+e.g. Process each JSON result asynchronously.
 
 ```js
 csv().fromFile(csvFile)
 .subscribe((json)=>{
 return new Promise((resolve,reject)=>{
 // Async operation on the json
-// dont forget to call resolve and reject
+// don't forget to call resolve and reject
 })
 })
 ```
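
The Promise-returning `.subscribe` in the hunk above implies a sequential contract: the next row is not handled until the previous row's Promise resolves. A minimal stand-alone sketch of that pattern (plain JavaScript, no csvtojson required; the helper name is ours, not the library's):

```javascript
// Sketch of sequential async row processing: awaiting each handler's
// Promise before moving on provides back-pressure, one row at a time.
async function processSequentially(rows, handler) {
  const results = [];
  for (const row of rows) {
    // the await is what serializes the async work per row
    results.push(await handler(row));
  }
  return results;
}

processSequentially([{ name: "a" }, { name: "b" }], async (json) => {
  // async operation on the json; resolve with the transformed row
  return { ...json, seen: true };
}).then((out) => console.log(out));
```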
@@ -292,7 +292,7 @@ csv()
 
 ### data
 
-`data` event is emitted for each parsed CSV line. It passes buffer of strigified JSON in [ndjson format](http://ndjson.org/) unless `objectMode` is set true in stream option.
+`data` event is emitted for each parsed CSV line. It passes buffer of stringified JSON in [ndjson format](http://ndjson.org/) unless `objectMode` is set true in stream option.
 
 ```js
 const csv=require('csvtojson')
@@ -304,7 +304,7 @@ csv()
 ```
 
 ### error
-`error` event is emitted if there is any errors happened during parsing.
+`error` event is emitted if any errors happened during parsing.
 
 ```js
 const csv=require('csvtojson')
@@ -348,7 +348,7 @@ csv()
 return newData;
 })
 
-// asynchronouse
+// asynchronous
 csv()
 .preRawData((csvRawData)=>{
 return new Promise((resolve,reject)=>{
@@ -361,7 +361,7 @@ csv()
 
 ### CSV File Line Hook
 
-the function is called each time a file line has been parsed in csv stream. the `lineIdx` is the file line number in the file starting with 0.
+The function is called each time a file line has been parsed in csv stream. The `lineIdx` is the file line number in the file starting with 0.
 
 ```js
 const csv=require('csvtojson')
@@ -374,7 +374,7 @@ csv()
 return fileLineString
 })
 
-// asynchronouse
+// asynchronous
 csv()
 .preFileLine((fileLineString, lineIdx)=>{
 return new Promise((resolve,reject)=>{
@@ -396,7 +396,7 @@ const csv=require('csvtojson')
 csv()
 .subscribe((jsonObj,index)=>{
 jsonObj.myNewKey='some value'
-// OR asynchronousely
+// OR asynchronously
 return new Promise((resolve,reject)=>{
 jsonObj.myNewKey='some value';
 resolve();
@@ -487,7 +487,7 @@ csv({flatKeys:true})
 1. First row of csv source. Use first row of csv source as header row. This is default.
 2. If first row of csv source is header row but it is incorrect and need to be replaced. Use `headers:[]` and `noheader:false` parameters.
 3. If original csv source has no header row but the header definition can be defined. Use `headers:[]` and `noheader:true` parameters.
-4. If original csv source has no header row and the header definition is unknow. Use `noheader:true`. This will automatically add `fieldN` header to csv cells
+4. If original csv source has no header row and the header definition is unknown. Use `noheader:true`. This will automatically add `fieldN` header to csv cells
 
 
 ### Example
@@ -579,7 +579,7 @@ csv({
 
 Above example will convert `birthday` column into a js `Date` object.
 
-the returned value will be used in result JSON object. returning `undefined` will not change result JSON object.
+The returned value will be used in result JSON object. Returning `undefined` will not change result JSON object.
 
 ### Flat key column
 
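
The column-parser contract described in the hunk above (the return value replaces the cell; `undefined` leaves it unchanged) can be sketched in plain JavaScript. This is an illustration of the contract only, with a hypothetical helper name; csvtojson's real `colParser` accepts more arguments and options:

```javascript
// Sketch of the custom column parser contract: each parser's return value
// replaces the cell, while a return of `undefined` keeps the original value.
function applyColParsers(row, colParsers) {
  const out = { ...row };
  for (const [field, parse] of Object.entries(colParsers)) {
    if (field in out) {
      const parsed = parse(out[field]);
      if (parsed !== undefined) out[field] = parsed; // undefined: no change
    }
  }
  return out;
}

const result = applyColParsers({ name: "Ada", birthday: "1815-12-10" }, {
  birthday: (item) => new Date(item), // replace string with a Date object
  name: () => undefined,              // undefined: original value kept
});
console.log(result.birthday instanceof Date, result.name); // true Ada
```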
@@ -615,7 +615,7 @@ Very much appreciate any types of donation and support.
 
 1. Fork the repo to your github account
 2. Checkout code from your github repo to your local machine.
-3. Make code changes and dont forget add related tests.
+3. Make code changes and don't forget add related tests.
 4. Run `npm test` locally before pushing code back.
 5. Create a [Pull Request](https://help.github.com/articles/creating-a-pull-request/) on github.
 6. Code review and merge
