Women's Fiction

What is Women’s fiction? Women’s fiction is a general term for women-centred books that focus on women’s lives. It includes mainstream novels and women’s rights books. It is distinct from Women’s literature, which refers to literature written by women rather than marketed to them. English has no equivalent label for fiction written by men.