On Unix/Linux, which command is designed to extract specific columns or fields from each line of a file or standard input?

Difficulty: Easy

Correct Answer: cut

Explanation:


Introduction / Context:
Processing delimited text (CSV, TSV, logs) often requires selecting particular fields. Unix pipelines favor small utilities that do one job well. The command that slices columns or character ranges directly from lines is concise and efficient for this purpose.


Given Data / Assumptions:

  • Input lines are structured (for example, comma- or tab-delimited).
  • You know the field positions and delimiter.
  • You want a pure extraction without pattern filtering or merging.


Concept / Approach:
The cut command extracts selected fields (-f) or character positions (-c) using a specified delimiter (-d). It is ideal for simple, position-based selection. For more complex parsing (quotes, variable delimiters), consider awk, but for straightforward column extraction, cut is the canonical tool.
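A minimal sketch of those three flags, using an invented file users.csv:

```shell
# Create a small comma-delimited sample (hypothetical data)
printf 'alice,30,admin\nbob,25,dev\n' > users.csv

# -d sets the delimiter, -f picks fields: here fields 1 and 3
cut -d ',' -f 1,3 users.csv
# alice,admin
# bob,dev

# -c picks character positions instead of delimited fields
cut -c 1-5 users.csv
# alice
# bob,2
```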


Step-by-Step Solution:

  • Select columns 1 and 3 from comma-separated data: cut -d ',' -f 1,3 data.csv
  • Extract characters 1-10 from each line: cut -c 1-10 file.txt
  • Combine with pipes: cat access.log | cut -d ' ' -f 1
  • Redirect results to a new file if needed: ... > subset.txt
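The pipeline and redirection steps can be run end to end; the two-line access.log below is a made-up stand-in for a real web server log:

```shell
# Fabricate a tiny space-delimited log (stand-in data)
printf '10.0.0.1 - - "GET /"\n10.0.0.2 - - "GET /a"\n' > access.log

# Field 1 of a space-delimited log line is the client address;
# redirect the extracted column into a new file
cut -d ' ' -f 1 access.log > subset.txt
cat subset.txt
# 10.0.0.1
# 10.0.0.2
```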


Verification / Alternative check:
Use head (or sed -n) to preview a few lines, then run cut and manually verify that fields align with the delimiter you chose. If fields contain the delimiter inside quotes, switch to awk for robustness.
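The quoted-CSV limitation is easy to reproduce; the sample names below are invented:

```shell
# Invented sample containing a comma inside quoted fields
printf '"Smith, John",42\n"Lee, Ann",7\n' > people.csv

# Preview a few lines first to confirm the delimiter in use
head -n 2 people.csv

# cut splits on every comma, including those inside quotes
cut -d ',' -f 1 people.csv
# "Smith
# "Lee       <- truncated mid-quote; prefer awk or a CSV parser here
```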


Why Other Options Are Wrong:

  • grep: Finds lines matching patterns; does not extract fixed fields.
  • paste: Merges lines/columns from files; the inverse of cut in many workflows.
  • cat: Displays contents; no field selection.
  • None of the above: Incorrect because cut is purpose-built for column extraction.
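The cut/paste contrast noted above can be shown directly (sample data invented); paste -s merges a column of lines into one delimited line, and cut recovers a field from it:

```shell
# paste -s joins input lines into a single comma-delimited line
printf 'a\nb\nc\n' | paste -s -d ','
# a,b,c

# cut then extracts an individual field from that merged line
printf 'a\nb\nc\n' | paste -s -d ',' | cut -d ',' -f 2
# b
```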


Common Pitfalls:
Forgetting to set the delimiter with -d (cut assumes tabs by default, so comma-separated input is not split), or expecting cut to handle quoted-CSV intricacies.
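The default-delimiter pitfall in one short session:

```shell
# cut defaults to TAB; a comma-separated line contains no tab,
# so it is passed through unchanged rather than split
printf 'a,b,c\n' | cut -f 2
# a,b,c

# With -d ',' the field selection works as intended
printf 'a,b,c\n' | cut -d ',' -f 2
# b

# -s suppresses lines that lack the delimiter entirely
printf 'a,b,c\n' | cut -f 2 -s
# (no output)
```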


Final Answer:
cut
