<!DOCTYPE html>
<html>
<head>
<title> CS 499 Group 7: Supreme Court </title>
<meta charset="UTF-8">
<!-- Put JS headers here -->
<!-- CSS Stylesheets -->
<link rel="stylesheet" href="main.css" type="text/css">
</head>
<body>
<div id="john-tompkins-updates-page" class="container">
<div class="header">
<h1>John Tompkins Weekly Updates</h1>
</div>
<br><br><br>
<div class="content">
<div class="links">
<a href="index.html">Home</a>
<br><br>
<a href="introduction.html">Introduction</a>
<br><br>
<a href="requirements.html">Requirements</a>
<br><br>
<a href="updates.html">Updates</a>
<br><br>
<a href="schedule.html">Schedule</a>
<br><br>
<a href="design.html">Design</a>
<br><br>
<a href="testing.html">Testing</a>
<br><br>
<a href="useCases.html">Use Cases</a>
<br><br>
<a href="designAndImplementation.html">Design Considerations/Implementation Issues</a>
<br><br>
<a href="enhancementsMaintenance.html">Future Enhancements/Maintenance</a>
<br><br>
<a href="conclusions.html">Conclusions</a>
<br><br>
<a href="installation.html">Installation</a>
<br><br>
<a href="references.html">References</a>
<br><br>
</div>
<div class="text">
<h2> Feb. 4 - Feb. 11 </h2>
<p>This week started with a team meeting where we discussed project specifics, such as which libraries and technologies we will use. I then set up two GitHub repositories: one for changes to the project webpage, and the other for the project itself. We chose GitHub for its universality and ease of collaboration. I then corresponded with our customers to arrange a meeting for Friday, at which we agreed on project requirements, and I updated the requirements on the webpage to reflect these agreements.</p>
<h2> Feb. 12 - Feb. 18 </h2>
<p>This week, we met as a team and worked on deciding implementation specifics for the project. I did a lot of research on different APIs that we could use to access news articles online. We narrowed it down to NEWS API, which lets users search articles by keyword across over 500 sources. This will be very helpful for the project. I also set up the calendar for the project website and added some specifics to the requirements page.</p>
<h2> Feb. 19 - Feb. 25 </h2>
<p>This week, we met with our customers to update them on the status of the project. The meeting went very well, and we are going to update them so that they know when we present our project for the midterm presentations. I investigated the NEWS API more and got some simple examples working in Python to request news stories involving the Supreme Court. I also scheduled the practice with Presentation U.</p>
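<p>A first pass at the kind of Python query described above might look like the following sketch. It uses only the standard library and NEWS API's documented /v2/everything endpoint; the exact query parameters our collector uses may differ, and actually fetching results requires a valid API key.</p>

```python
# Hedged sketch: keyword search against NewsAPI's /v2/everything endpoint.
# Uses only the Python standard library; a live run needs a real API key.
import json
import urllib.parse
import urllib.request

NEWSAPI_ENDPOINT = "https://newsapi.org/v2/everything"

def build_request_url(query, api_key, page_size=20):
    """Build the request URL for a keyword search."""
    params = {
        "q": query,                 # keyword, e.g. "Supreme Court"
        "pageSize": page_size,      # articles per page
        "sortBy": "publishedAt",    # newest first
        "apiKey": api_key,          # issued by newsapi.org
    }
    return NEWSAPI_ENDPOINT + "?" + urllib.parse.urlencode(params)

def fetch_articles(query, api_key):
    """Request matching articles and return the parsed JSON payload."""
    with urllib.request.urlopen(build_request_url(query, api_key)) as resp:
        return json.load(resp)

# Example usage (requires a real key):
# payload = fetch_articles("Supreme Court", api_key)
# print(payload["totalResults"])
```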
<h2>Feb. 26 - Mar. 4</h2>
<p>This week, most of the work on the project went toward preparing for the presentation this Monday. I scheduled the Presentation U appointment for our team. We met and put together the presentation early in the week so that we would be ready for our practice. The practice went well, and we got good, actionable advice that we used to tweak the presentation.</p>
<h2>Mar. 5 - Mar. 11</h2>
<p>This week consisted primarily of putting the finishing touches on the presentation that we gave on Monday. Aside from that, a lot of effort went toward the design. I was in charge of designing the modules and showing the data flow between them: I had to define what the modules were, how they operated, and what was exchanged between them. This was a very interesting problem to tackle. I also worked on defining a schema for the database that will hold the article collection. Aside from this, I did some research into how to train the agent for article classification.</p>
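<p>To illustrate the shape of the article-collection schema discussed above: the column names below are illustrative rather than our final design, and the sketch uses SQLite for self-containment even though the deployed database is MySQL.</p>

```python
# Hedged sketch of an article-collection table; columns are illustrative,
# not the team's actual schema.
import sqlite3

SCHEMA = """
CREATE TABLE IF NOT EXISTS articles (
    id        INTEGER PRIMARY KEY,
    source    TEXT NOT NULL,        -- e.g. 'CNN' or 'NEWS API'
    url       TEXT NOT NULL UNIQUE, -- de-duplicates repeated scrapes
    title     TEXT,
    published TEXT,                 -- ISO-8601 date string
    body      TEXT                  -- full article text, if captured
)
"""

def init_db(path=":memory:"):
    """Open the database and make sure the articles table exists."""
    conn = sqlite3.connect(path)
    conn.execute(SCHEMA)
    return conn
```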
<h2>Mar. 12 - Mar. 18</h2>
<p>Since this week was spring break and I was somewhere without internet access, I took a break from senior design work for most of the week. However, when I returned, I began work on the scraper that pulls articles from sections of websites. I was able to successfully scrape the CNN (Cable News Network) website's Supreme Court coverage, and I have a plan to adapt this method to the other sites where we need this functionality.</p>
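<p>The section scraper can be sketched roughly as follows. This version uses only Python's built-in html.parser (the project may use a different library), and the "/politics/" link filter is a hypothetical stand-in for whatever URL pattern marks Supreme Court articles on a given site.</p>

```python
# Hedged sketch of a section scraper: collect article links from a
# section page's HTML. The "/politics/" pattern is a hypothetical
# placeholder for a site-specific URL filter.
from html.parser import HTMLParser

class ArticleLinkParser(HTMLParser):
    """Collect hrefs of <a> tags that match a section URL pattern."""
    def __init__(self, pattern="/politics/"):
        super().__init__()
        self.pattern = pattern
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and self.pattern in href:
                self.links.append(href)

def extract_article_links(page_html, pattern="/politics/"):
    """Return all matching article links found in a section page."""
    parser = ArticleLinkParser(pattern)
    parser.feed(page_html)
    return parser.links
```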
<h2>Mar. 19 - Mar. 25</h2>
<p>This week consisted of a lot of setup tasks. We set up the server on which we will develop and host our application. We decided to switch to an EC2 instance hosted on Amazon Web Services, which was very straightforward to set up. I handled setting up various accounts on the server and installed Apache, MySQL, and PHP on the machine. I also worked on the various scraper functions for different news sites. I communicated with the customer to determine the sites that they wanted scraped; some of these sites did not have Supreme Court sections, so they could not be properly scraped. Our group got together on Sunday and had a work day where we set up an initial prototype of our program. I was very impressed with the amount of work that was accomplished.</p>
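<p>On an Ubuntu-based EC2 instance, the Apache/MySQL/PHP setup described above amounts to a few package installs. This is a sketch assuming Ubuntu's package names; the exact packages depend on the distribution and PHP version in use.</p>

```shell
# Hedged sketch: install Apache, MySQL, and PHP on an Ubuntu EC2 instance.
sudo apt-get update
sudo apt-get install -y apache2 mysql-server php libapache2-mod-php php-mysql
# Start the services and have them come back after a reboot.
sudo systemctl enable --now apache2 mysql
```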
<h2>Mar. 25 - Apr. 1</h2>
<p>We met with the customers this week. This allowed us to update them on the status of the project and clarify what we set out to accomplish. I felt the meeting went very well; our customers felt a lot more confident about the project after meeting with us. After that, we spent a lot of time working on the testing plan for the rest of the semester. I worked mostly on the unit testing and system testing sections of the plan. We have been in contact with the UK IT department to try to come up with a solution for hosting the service after we finish the project. Working on AWS ensures that the same setup can be reproduced because of its uniformity.</p>
<h2>Apr. 2 - Apr. 8</h2>
<p>This week, I modified some settings on the development server to get the scraper functions and the NEWS API capability functioning on the collector. The collector now appears to be adding articles from the selected news sources and from NEWS API. I also added a fair amount of documentation explaining how some of the code I wrote works, and made sure our repo was properly documented.</p>
<h2>Apr. 9 - Apr. 15</h2>
<p>We had a meeting with the customers this week. We were able to demonstrate the working prototype of our project, and the customers were glad to see the progress made. The major obstacles facing the project now have to do with paywall issues on certain websites. We plan to add statistics so that we can identify which websites are causing the paywall problems. The paywall problem is an interesting one because, like the section layouts of different news websites, every paywall is implemented in a different way, with some being easier to get around than others. We also met as a group to work on the presentation and compiled a slide deck so that we will be ready for our practice presentation on Monday.</p>
<h2>Apr. 16 - Apr. 22</h2>
<p>This week, we practiced our presentation. We worked out a lot of the kinks, and I think we will be ready to present next week. We deliver the project to our customers on Monday. This past week, I also looked into creating a user interface for adjusting how often the article collection runs.</p>
</div>
</div>
</div>
</body>
</html>