Hi,
I was given the following question as an assessment, but unfortunately I was unable to answer it. Can someone help?
The question is:
The function given below is supposed to retrieve the substring between and including the i-th and j-th characters of the input string s. When i or j is out of range, i.e. i < 0, j < 0, i >= s.Length or j >= s.Length, the function returns the string "<error>". The following table shows the expected and actual return values of the function for various inputs.
| Test case | s | i | j | Expected return value | Actual return value |
|---|---|---|---|---|---|
| 1 | "quick" | 0 | 2 | "qui" | "qui" |
| 2 | "brown" | 2 | 1 | "ro" | "" |
| 3 | "fox" | 1 | 9 | "<error>" | "<error>" |
| 4 | "jumped" | -1 | 4 | "<error>" | "<error>" |
| 5 | "lazy" | 3 | 0 | "lazy" | "<error>" |
| 6 | "dog" | 1 | 1 | "o" | "o" |
Unfortunately, the function returns the wrong value for the 2<sup>nd</sup> and 5<sup>th</sup> test cases.
C# Version:
string SubString(string s, int i, int j)
{
    int k;
    string result;
    try
    {
        k = s.Length;
        if (i < 0 || j < 0 || i >= k || j >= k)
            throw new Exception();
        result = s.Substring(i, j + 1 - i);
    }
    catch (Exception e)
    {
        return "<error>";
    }
    return result;
}
a) Under what general conditions does the function return the wrong value?
b) Add some code to fix the function. Can you improve it? If so, explain why your version is better.
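To make the question concrete, here is a minimal sketch of the kind of fix I was imagining for part b), assuming the intended behaviour is simply to accept the indices in either order (which would cover test cases 2 and 5). I'm not sure this is the intended answer, so feedback would be appreciated:

string SubString(string s, int i, int j)
{
    // Sketch of a possible fix: validate the bounds up front instead of
    // throwing and catching an exception.
    if (s == null || i < 0 || j < 0 || i >= s.Length || j >= s.Length)
        return "<error>";

    // Allow the indices to be given in reverse order, as in test cases 2 and 5.
    int start = Math.Min(i, j);
    int end = Math.Max(i, j);

    // Substring(startIndex, length): the slice is end - start + 1 characters long.
    return s.Substring(start, end - start + 1);
}

My guess is that dropping the try/catch and checking the bounds directly also speaks to the "can you improve it?" part, since the exception isn't doing anything the if-statement can't, but I'd welcome corrections if I've misunderstood the question.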