1. Show that the resulting learning problem is convex-Lipschitz-bounded.

2. Show that no computable algorithm can learn the problem.

3. From Bounded Expected Risk to Agnostic PAC Learning: Let $A$ be an algorithm that guarantees the following: if $m \ge m_{\mathcal{H}}(\epsilon)$, then for every distribution $\mathcal{D}$ it holds that

$$\mathbb{E}_{S \sim \mathcal{D}^m}\left[L_{\mathcal{D}}(A(S))\right] \le \min_{h \in \mathcal{H}} L_{\mathcal{D}}(h) + \epsilon.$$

• Show that for every $\delta \in (0, 1)$, if $m \ge m_{\mathcal{H}}(\epsilon\,\delta)$, then with probability of at least $1 - \delta$ it holds that $L_{\mathcal{D}}(A(S)) \le \min_{h \in \mathcal{H}} L_{\mathcal{D}}(h) + \epsilon$.
Hint: Observe that the random variable $L_{\mathcal{D}}(A(S)) - \min_{h \in \mathcal{H}} L_{\mathcal{D}}(h)$ is nonnegative and rely on Markov's inequality (a worked version of this step is sketched after the exercise).

• For every $\delta \in (0, 1)$ let

$$m_{\mathcal{H}}(\epsilon, \delta) = m_{\mathcal{H}}(\epsilon/2)\left\lceil \log_2(2/\delta) \right\rceil + \left\lceil \frac{\log(4/\delta) + \log\left(\left\lceil \log_2(2/\delta) \right\rceil\right)}{\epsilon^2} \right\rceil.$$

Suggest a procedure that agnostic PAC learns the problem with sample complexity $m_{\mathcal{H}}(\epsilon, \delta)$, assuming that the loss function is bounded by 1.
Hint: Let $k = \lceil \log_2(2/\delta) \rceil$. Divide the data into $k + 1$ chunks, where each of the first $k$ chunks is of size $m_{\mathcal{H}}(\epsilon/2)$ examples. Train $A$ on each of the first $k$ chunks. On the basis of the previous question, argue that the probability that for all of these chunks we have $L_{\mathcal{D}}(A(S)) > \min_{h \in \mathcal{H}} L_{\mathcal{D}}(h) + \epsilon$ is at most $2^{-k} \le \delta/2$. Finally, use the last chunk as a validation set (a code sketch of this procedure follows the exercise).
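A worked version of the Markov step in the first part, under the exercise's assumptions, might read as follows. Write $\theta = L_{\mathcal{D}}(A(S)) - \min_{h \in \mathcal{H}} L_{\mathcal{D}}(h)$ for the excess risk; $\theta$ is nonnegative, and if $m \ge m_{\mathcal{H}}(\epsilon\,\delta)$ the guarantee on $A$ gives $\mathbb{E}[\theta] \le \epsilon\,\delta$. Markov's inequality then yields

$$\Pr\left[\theta > \epsilon\right] \le \frac{\mathbb{E}[\theta]}{\epsilon} \le \frac{\epsilon\,\delta}{\epsilon} = \delta,$$

so with probability at least $1 - \delta$ we have $L_{\mathcal{D}}(A(S)) \le \min_{h \in \mathcal{H}} L_{\mathcal{D}}(h) + \epsilon$, as required.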
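The chunk-and-validate procedure in the second hint can also be sketched in code. This is a minimal illustration, not part of the original exercise: the learner `A`, the chunk size `m_eps_half`, and the per-example `loss` function are hypothetical stand-ins for the abstract objects above.

```python
import math

def agnostic_pac_learn(data, A, m_eps_half, loss, delta):
    """Sketch of the boost-the-confidence procedure from the hint.

    Hypothetical stand-ins (not defined in the exercise itself):
      A          -- learner satisfying the bounded-expected-risk guarantee
      m_eps_half -- the chunk size m_H(eps/2)
      loss       -- loss(h, example), assumed bounded in [0, 1]
    """
    k = math.ceil(math.log2(2 / delta))

    # Train A on each of the first k disjoint chunks. By the first part
    # (applied with delta' = 1/2), each run returns a hypothesis with
    # excess risk <= eps with probability >= 1/2, independently, so all
    # k runs fail together with probability <= 2**(-k) <= delta / 2.
    candidates = [A(data[i * m_eps_half:(i + 1) * m_eps_half])
                  for i in range(k)]

    # The last chunk serves as a validation set: return the candidate
    # with the smallest empirical risk. Since the loss is bounded by 1,
    # Hoeffding's inequality plus a union bound over the k candidates
    # controls the selection error with probability >= 1 - delta / 2.
    validation = data[k * m_eps_half:]
    return min(candidates,
               key=lambda h: sum(loss(h, z) for z in validation)
                             / len(validation))
```

The second summand in $m_{\mathcal{H}}(\epsilon, \delta)$ is what pays for the validation chunk: it is large enough for the Hoeffding/union-bound step over the $k$ candidates, and a union bound over the two failure events (all training chunks bad, or validation misleading) keeps the total failure probability at most $\delta$.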
