For lesson 7 I used the following steps:

1) I went to the page from which I needed to download the links. I right-clicked on the page and used "View page source". Then I selected all of it so that I could copy the whole page source into a text editor.

2) I pasted it into my editor (Sublime Text 3).

3) Then I modified the links with regular expressions so that each one became a proper, complete URL. This is an important step, which has to be done before wget can download the links.
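The same kind of find-and-replace can be sketched on the command line. This is only a hypothetical example: the exact pattern depends on the page's markup, and the sample links and the activehistory.ca site root are assumptions, not the actual data from the lesson.

```shell
# Stand-in for the copied page source: two relative links of an assumed shape.
printf '%s\n' '<a href="/papers/paper-1.html">' \
              '<a href="/papers/paper-2.html">' > page.txt

# Pull out each href value and prepend the (assumed) site root,
# producing one complete URL per line in source4.txt.
grep -o 'href="[^"]*"' page.txt \
  | sed -e 's/^href="/http:\/\/activehistory.ca/' -e 's/"$//' > source4.txt

cat source4.txt
```

In Sublime Text 3 the equivalent is a regex Find & Replace (Ctrl+H with the regex toggle enabled) over the pasted source.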

4) I put all 1349 links into one file in .txt format. The name of the file was source4.txt.
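A quick way to confirm that the file really holds all the links is to count its lines, assuming one URL per line. The two sample URLs below are stand-ins; the real source4.txt held 1349 lines.

```shell
# Stand-in source4.txt with two assumed URLs (the real file had 1349).
printf '%s\n' 'http://activehistory.ca/papers/paper-1.html' \
              'http://activehistory.ca/papers/paper-2.html' > source4.txt

# With one URL per line, the line count equals the link count.
wc -l < source4.txt
```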

5) Then I opened the Command Prompt (Eingabeaufforderung), because Windows PowerShell was not working with this command on my computer (I do not know the reason).

6) I created a new folder, wget-activehistory (my download folder), with the command mkdir wget-activehistory. I wanted to download all the links into this folder.

7) I downloaded all the links with the following command: wget -i source4.txt -P ./wget-activehistory/ -nc. This worked for me, and within a few minutes all the files were collected in the wget-activehistory subfolder.
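The steps above can be sketched as a short command sequence. The wget call is shown as a comment, since running it needs network access and the real source4.txt; the flag explanations come from wget's standard options.

```shell
# Step 6: create the download folder (-p avoids an error if it already exists).
mkdir -p wget-activehistory

# Step 7: the actual download command (commented out here):
# wget -i source4.txt -P ./wget-activehistory/ -nc
#
#   -i source4.txt           read the URLs to fetch from this file, one per line
#   -P ./wget-activehistory/ save every downloaded file into this folder
#   -nc (--no-clobber)       skip files that have already been downloaded,
#                            so the command can safely be re-run after an interruption
```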