I created a simple class which serialises C# objects to JSON and deserialises them back. How is it looking?
JsonHandler:
class JsonHandler
{
    public T GetObjFromJson<T>(string filePath)
    {
        T retObj = default(T);
        try
        {
            string jsonStr = string.Empty;
            if (File.Exists(filePath))
            {
                byte[] bytes = File.ReadAllBytes(filePath);
                jsonStr = Encoding.Default.GetString(bytes);
                if (!string.IsNullOrWhiteSpace(jsonStr))
                    retObj = JsonConvert.DeserializeObject<T>(jsonStr);
            }
        }
        catch (Exception ex)
        {
            string message = ex.Message;
        }
        return retObj;
    }

    public bool WriteObjAsJson(string filePath, object obj)
    {
        bool successfulWrite = false;
        try
        {
            string jsonString = JsonConvert.SerializeObject(obj);
            byte[] bytes = Encoding.Default.GetBytes(jsonString);
            File.WriteAllBytes(filePath, bytes);
            successfulWrite = true;
        }
        catch (Exception ex)
        {
            string message = ex.Message;
        }
        return successfulWrite;
    }
}
The main thing I'm concerned about is reading all the bytes. I guessed that is the most bulletproof solution? Ideally I'd be using this as part of a PCL in a mobile (Xamarin) application in the near future.
class Program
{
    static void Main(string[] args)
    {
        string filePath = @"C:\Users\me\somewhere\textFile.txt";
        var d = new DummyClass("Danny");
        var handler = new JsonHandler();

        bool successfulWrite = handler.WriteObjAsJson(filePath, d);
        Console.WriteLine(successfulWrite);

        DummyClass returnObj = handler.GetObjFromJson<DummyClass>(filePath);
        if (returnObj != null)
            Console.WriteLine($"Name: {returnObj.Name}");
    }
}
class DummyClass
{
    private string _name;

    public string Name { get { return _name; } set { _name = value; } }

    public DummyClass(string name)
    {
        Name = name;
    }
}
I am only using the DummyClass for some quick testing purposes. I'm not so concerned with the above two classes; I only included them to show an example of how the JsonHandler class will be used.
2 Answers
Here are some suggestions:

- be consistent: jsonStr vs. jsonString
- be consistent: GetObjFromJson vs WriteObjAsJson (maybe Read & Write instead)
- use guard clauses instead of nesting
- avoid temporary variables like successfulWrite and retObj
Based on the suggestions, here is one alternative:
static class JsonHandler // class could be static
{
    public static T GetObjFromJson<T>(string filePath)
    {
        // use a guard clause to reduce nesting
        if (!File.Exists(filePath)) return default(T);
        try
        {
            var jsonStr = File.ReadAllText(filePath);
            // guard clause again
            if (string.IsNullOrWhiteSpace(jsonStr)) return default(T);
            // we could use the null coalescing operator ?? here
            return JsonConvert.DeserializeObject<T>(jsonStr) ?? default(T);
        }
        catch (Exception ex)
        {
            // do proper exception handling
        }
        return default(T);
    }

    public static bool WriteObjAsJson(string filePath, object obj)
    {
        try
        {
            string jsonStr = JsonConvert.SerializeObject(obj);
            File.WriteAllText(filePath, jsonStr);
        }
        catch (Exception ex)
        {
            // do proper handling
            return false;
        }
        return true;
    }
}
Also you should think about exception handling:

- if the method returns default(T), maybe reflect that in the method name
- if the method throws on failure, maybe wrap the exception into a custom one
- if the method fails, return null to signal that
- use the Try* pattern, look at bool int.TryParse(string, out int) (sketched below)
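A minimal sketch of what that Try* pattern could look like on the class above (the method name TryReadObjFromJson and the exception handling are my choices, not part of the reviewed code):

public static bool TryReadObjFromJson<T>(string filePath, out T result)
{
    result = default(T);
    if (!File.Exists(filePath)) return false;
    try
    {
        string jsonStr = File.ReadAllText(filePath);
        if (string.IsNullOrWhiteSpace(jsonStr)) return false;
        result = JsonConvert.DeserializeObject<T>(jsonStr);
        return result != null;
    }
    catch (Exception) // narrow this down to e.g. JsonException / IOException
    {
        return false;
    }
}

The caller then decides what a failure means:

DummyClass obj;
if (JsonHandler.TryReadObjFromJson<DummyClass>(filePath, out obj))
    Console.WriteLine(obj.Name);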
Also, in the DummyClass you can use an auto-implemented property: public string Name { get; set; }
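Applied to the class from the question, that would be:

class DummyClass
{
    public string Name { get; set; }

    public DummyClass(string name)
    {
        Name = name;
    }
}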
I find this is not a very useful class. There are three reasons for that:

- it tries to encapsulate JsonConvert with another class that does almost exactly the same thing without adding any value to it
- it hides all exceptions, so you won't know what went wrong, and
- it does not provide any interface to specify JsonSerializerSettings, which sooner rather than later you'll need in order to tune the de/serialization process (see the small example below)
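To give an idea of the kind of tuning meant here, a small illustrative example of JsonSerializerSettings used with plain JsonConvert (the specific settings are arbitrary):

var settings = new JsonSerializerSettings
{
    NullValueHandling = NullValueHandling.Ignore,
    Formatting = Formatting.Indented
};
string json = JsonConvert.SerializeObject(new DummyClass("Danny"), settings);
var roundTripped = JsonConvert.DeserializeObject<DummyClass>(json, settings);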
So what should you do instead? I suggest implementing each file type that you need to de/serialize separately. This means that you should create such classes as:
class MySpecialFile
{
    public static MySpecialFile Load(string path) {}
    public void Save(string path) {}
}
that may use JSON internally, where you can set up the de/serialization process as required by each file. Generic solutions almost never work in such cases.
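A rough sketch of such a class using Json.NET internally (the Name property and the chosen settings are only illustrative):

class MySpecialFile
{
    public string Name { get; set; }

    public static MySpecialFile Load(string path)
    {
        // each file type can pick the settings that suit it
        var settings = new JsonSerializerSettings { MissingMemberHandling = MissingMemberHandling.Error };
        return JsonConvert.DeserializeObject<MySpecialFile>(File.ReadAllText(path), settings);
    }

    public void Save(string path)
    {
        File.WriteAllText(path, JsonConvert.SerializeObject(this, Formatting.Indented));
    }
}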
Two comments on the answers are also worth noting:

File.ReadAllText and File.WriteAllText will help you get rid of the byte arrays.

File.ReadAllBytes and File.ReadAllText could be memory-intensive for large object graphs. Streams are your friends!
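If file sizes ever get that large, a stream-based read could look roughly like this sketch (it assumes Json.NET's JsonTextReader and needs using System.IO; and using Newtonsoft.Json;):

using (var stream = File.OpenRead(filePath))
using (var streamReader = new StreamReader(stream))
using (var jsonReader = new JsonTextReader(streamReader))
{
    var serializer = new JsonSerializer();
    // deserializes directly from the stream instead of loading the whole file into a string first
    DummyClass obj = serializer.Deserialize<DummyClass>(jsonReader);
}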