## Provide a method that can calculate their winning percentage.

It can be a GUI, but I need it to be in JavaFX, please. It should contain four classes:

1. The "Team" class: each object of this class represents one team and records its name and win-loss record. A method should be provided that calculates the team's winning percentage; the winning percentage should display as 0.000 when the team has played zero games. The team's name usually doesn't need to change, but a set of methods should be provided to update the team's record.
2. The "Game" class: each object of this class represents a single game. The object should record the teams that participated and the final score. When the score is assigned, a winner should be determined, and the record of each participant should be updated appropriately. This implies that the Game object needs access to the Team objects representing the participants, not just their names.
3. The "League" class: the league object (notice that it's singular) contains the list of teams and a list of all games. The list of teams will not change once the league is configured, so a simple array should suffice. However, the list of games will grow and change frequently, so an ArrayList is a better choice here. Most of the functionality of the program listed above will have a corresponding method here.
4. The "App" class: this is the class containing the main method. All user input and output must be done through this class.
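The four-class structure above can be sketched as follows. This is a minimal console sketch, not the requested JavaFX GUI (a JavaFX version would move the `App` I/O into an `Application` subclass); all method names beyond those stated in the prompt (`addWin`, `setScore`, `recordGame`, the sample team names) are assumptions.

```java
import java.util.ArrayList;
import java.util.List;

// One team with a name and a win-loss record.
class Team {
    private final String name;
    private int wins;
    private int losses;

    Team(String name) { this.name = name; }

    String getName() { return name; }
    void addWin()  { wins++; }    // record-update methods
    void addLoss() { losses++; }

    // Winning percentage; defined as 0.000 when zero games played.
    double winningPercentage() {
        int games = wins + losses;
        return games == 0 ? 0.0 : (double) wins / games;
    }
}

// One game; holds the Team objects themselves, not just their names,
// so that assigning the score can update both records.
class Game {
    private final Team home;
    private final Team away;

    Game(Team home, Team away) { this.home = home; this.away = away; }

    void setScore(int homeScore, int awayScore) {
        if (homeScore > awayScore)      { home.addWin();  away.addLoss(); }
        else if (awayScore > homeScore) { away.addWin();  home.addLoss(); }
        // Ties are left unhandled; the prompt assumes a winner exists.
    }
}

// The league: a fixed array of teams, a growing ArrayList of games.
class League {
    private final Team[] teams;
    private final List<Game> games = new ArrayList<>();

    League(Team[] teams) { this.teams = teams; }

    void recordGame(Team home, Team away, int homeScore, int awayScore) {
        Game g = new Game(home, away);
        g.setScore(homeScore, awayScore);
        games.add(g);
    }

    List<Game> getGames() { return games; }
}

// All user input/output goes through this class.
public class App {
    public static void main(String[] args) {
        Team a = new Team("Hawks");   // sample names, not from the prompt
        Team b = new Team("Owls");
        League league = new League(new Team[] { a, b });
        league.recordGame(a, b, 3, 1);
        System.out.printf("%s: %.3f%n", a.getName(), a.winningPercentage());
        System.out.printf("%s: %.3f%n", b.getName(), b.winningPercentage());
    }
}
```

The `%.3f` format gives the required three-decimal display, and the zero-games guard in `winningPercentage` avoids a division by zero.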

### Prove that the hypothesis class of all conjunctions over d variables is PAC learnable and bound its sample complexity.

1. In this question, we study the hypothesis class of Boolean conjunctions, defined as follows. The instance space is X = {0,1}^d and the label set is Y = {0,1}. A literal over the variables x_1, …, x_d is a….
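One standard route to the bound asked for in the heading is to count the class and apply the finite-class realizable PAC result; a sketch (this is the usual argument, not the full exercise solution):

```latex
% Each variable x_i appears in a conjunction positively, negatively,
% or not at all, so (up to the always-negative hypothesis)
% |\mathcal{H}_{\mathrm{con}}^d| \le 3^d + 1, a finite class.
% The realizable-case bound for finite classes then gives
\[
  m_{\mathcal{H}}(\epsilon,\delta)
  \le \left\lceil \frac{\ln|\mathcal{H}| + \ln(1/\delta)}{\epsilon} \right\rceil
  \le \left\lceil \frac{d\ln 3 + \ln(2/\delta)}{\epsilon} \right\rceil,
\]
% so conjunctions are PAC learnable with sample complexity
% linear in d and logarithmic in 1/\delta.
```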

### Show that for every probability distribution D, the Bayes optimal predictor f_D is optimal, in the sense that for every classifier g from X to {0,1}, L_D(f_D) ≤ L_D(g).

1. Let H be a hypothesis class of binary classifiers. Show that if H is agnostic PAC learnable, then H is PAC learnable as well. Furthermore, if A is a successful agnostic PAC learner for H, then A is also a….
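For the Bayes-optimality question in the heading above, the usual proof conditions on the instance and compares errors pointwise; a sketch:

```latex
% Let \eta(x) = P[Y = 1 \mid X = x]. The Bayes predictor is
% f_D(x) = \mathbb{1}[\eta(x) \ge 1/2].
% Conditioned on X = x, any classifier g errs with probability
% \eta(x) if g(x) = 0 and 1 - \eta(x) if g(x) = 1, so
\[
  P[f_D(X) \ne Y \mid X = x]
  = \min\{\eta(x),\, 1 - \eta(x)\}
  \le P[g(X) \ne Y \mid X = x].
\]
% Taking expectation over X on both sides yields
% L_D(f_D) \le L_D(g) for every classifier g.
```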

### Show that the algorithm just described satisfies the requirements for being an RP solver for ERM_H.

1. On the basis of the preceding, prove that for any k ≥ 3, the ERM_{H_k^n} problem is NP-hard.
2. In this exercise we show that hardness of solving the ERM problem is equivalent….