Test 1

System Level Programming

Due July 2, 2020

Coding (60 points)

You will verify this with screenshots of successful commands as well as a submission of the final result.

PART 1 – 28 points

1. (2 points) Create a directory called "Test1" in the Submissions folder in your home directory. If you do not currently have a directory titled Submissions in your home directory (i.e., ~/Submissions), create one.

2. (2 points) Use the `find` command to locate the "data.tar" file in my directory.

3. (2 points) Copy the file called "data.tar" from my directory into your Test1 directory inside of your Submissions folder.

4. (2 points) Extract the contents of "data.tar" in your Submissions folder.

5. (3 points) Of the extracted files, how many lines contain the phrase "computer science" (ignoring case)?

6. (3 points) Create a file named "computer-science" that contains all of the lines, including the line numbers, from all the files that contain the phrase "computer science", ignoring case.

7. (3 points) Of the extracted files, how many lines contain the words "data" or "structures" (exact casing)?

8. (3 points) Create a file named "data-structures" that contains all of the lines, including the line numbers, from all the files that contain the words "data" or "structures" (exact casing).

9. (3 points) Of the extracted files, how many lines contain any links to websites?

10. (3 points) Create a file named "websites" that contains all of the lines, including the line numbers, from all the files that contain any links to websites.

11. (2 points) Combine the results from "websites", "data-structures", and "computer-science" using the cat command and store the result in a file called "1.part". (A sketch of possible commands for Part 1 follows this list.)
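
A minimal command sketch for items 1–11 above, not a required solution: the instructor's path (/home/instructor) and the name of the directory that data.tar extracts into (data/) are placeholder assumptions, and the grep flags shown are only one possible approach.

# Hedged sketch; /home/instructor and data/ are placeholder names.
mkdir -p ~/Submissions/Test1                               # item 1: Test1 inside Submissions in the home directory
find /home/instructor -name "data.tar" 2>/dev/null         # item 2: locate data.tar
cp /home/instructor/data.tar ~/Submissions/Test1/          # item 3: copy it into Test1
tar -xvf ~/Submissions/Test1/data.tar -C ~/Submissions     # item 4: extract in the Submissions folder
cd ~/Submissions
grep -ri "computer science" data/ | wc -l                  # item 5: lines with the phrase, ignoring case
grep -rin "computer science" data/ > computer-science      # item 6: same lines, with line numbers
grep -rEw 'data|structures' data/ | wc -l                  # item 7: lines with the words data or structures
grep -rEwn 'data|structures' data/ > data-structures       # item 8: same lines, with line numbers
grep -rE 'https?://' data/ | wc -l                         # item 9: lines containing website links
grep -rEn 'https?://' data/ > websites                     # item 10: same lines, with line numbers
cat websites data-structures computer-science > 1.part     # item 11: combine the three result files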

PART 2 (Hint: AWK is used a lot) – 32 points

1. (4 points) Use a UNIX/Linux command to download the file at the following URL to your Submissions directory:

https://data.ct.gov/api/views/rybz-nyjw/rows.csv

2. (4 points) Read and format the previous document:

HINT: It is a comma-separated file that contains some strings denoted by quotation marks and contains the following columns:

ID, Date, DateType, Age, Sex, Race, ResidenceCity, ResidenceCounty, ResidenceState, DeathCity, DeathCounty, Location, LocationifOther, DescriptionofInjury, InjuryPlace, InjuryCity, InjuryCounty, InjuryState, COD, OtherSignifican, Heroin, Cocaine, Fentanyl, FentanylAnalogue, Oxycodone, Oxymorphone, Ethanol, Hydrocodone, Benzodiazepine, Methadone, Amphet, Tramad, Morphine NotHeroin, Hydromorphone, Other, OpiateNOS, AnyOpioid, MannerofDeath, DeathCityGeo, ResidenceCityGeo, InjuryCityGeo

3. (4 points) Parse the previous document and remove all rows that don't have a race or sex. Save the output as "2.parse".

4. (4 points) Create a new document that only contains age, sex, and race. Save the output as "2.asr".

5. (2 points) Find the total number of Male and Female participants, separately.

6. (2 points) Find the average age of Male and Female participants, separately.

7. (2 points) Find the list of the unique races that are in the study.

8. (8 points) Find the following age statistics along racial and sex lines (i.e., White Male, White Female, Black Male, Black Female):

– Total number

– Average Age

– Min Age

– Max Age

9. (2 points) Save all files you used to create both parts of this test, compress them all together into a tar archive, and submit it to iCollege. (A sketch of possible commands for Part 2 follows this list.)
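
A hedged AWK sketch for items 1 and 3–9 of Part 2, assuming the CSV keeps its header row, that Age, Sex, and Race are columns 4, 5, and 6 as listed in the hint, and that a plain comma split is good enough (quoted fields containing commas would need a CSV-aware parser); test1.tar is a placeholder archive name.

# Hedged sketch; simple comma splitting and column positions (4=Age, 5=Sex, 6=Race) are assumptions.
cd ~/Submissions
wget -O rows.csv 'https://data.ct.gov/api/views/rybz-nyjw/rows.csv'        # item 1: download the CSV
awk -F, 'NR > 1 && $5 != "" && $6 != ""' rows.csv > 2.parse                # item 3: drop rows missing sex or race
awk -F, 'NR > 1 {print $4 "," $5 "," $6}' rows.csv > 2.asr                 # item 4: keep only age, sex, race
awk -F, '{n[$2]++} END {for (s in n) print s, n[s]}' 2.asr                 # item 5: total per sex
awk -F, '$1 != "" {sum[$2] += $1; n[$2]++}
         END {for (s in n) print s, sum[s]/n[s]}' 2.asr                    # item 6: average age per sex
awk -F, '{print $3}' 2.asr | sort -u                                       # item 7: unique races
# item 8: count, average, min, and max age for each race/sex pair
awk -F, '$1 != "" {k = $3 " " $2; n[k]++; sum[k] += $1
                   if (!(k in min) || $1 < min[k]) min[k] = $1
                   if ($1 > max[k]) max[k] = $1}
         END {for (k in n) print k, n[k], sum[k]/n[k], min[k], max[k]}' 2.asr
tar -cvf test1.tar Test1 rows.csv 2.parse 2.asr                            # item 9: archive everything for iCollege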

Comprehension – 20 points

1. (5 points) What is the following bash code doing?

curl 'http://domain.com/id/[1-151468]' -o '#1.html'
grep -oh 'http://pics.domain.com/pics/original/.*jpg' *.html > urls.txt
sort -u urls.txt | wget -i-

If you would like to access it on the server, find "zuckerburg.sh".

2. (15 points) Detail what the code is supposed to do in each of the 5 blocks (3 points each) in the "vimeoscript.sh" file located in my directory.

Research – 20 points

In this last part of the test you will research and detail how you would use UNIX/Linux to create the following tools: 1) web scrapers that pull information from websites, 2) email spam software that pulls email addresses from websites and sends them spam emails, and 3) bots that grab information from live sites to make business decisions. MAKE SURE YOU CITE YOUR RESEARCH (use IEEE citation format):

https://pitt.libguides.com/citationhelp/ieee
