Facebook has said it agrees with a report that found it had failed to prevent its platform being used to "incite offline violence" in Myanmar.
The independent report, commissioned by Facebook, said the platform had created an "enabling environment" for the proliferation of human rights abuse.
It comes after widespread violence against the Rohingya minority which the UN has said may amount to genocide.
The report said Facebook would have to "get it right" before 2020 elections.
Facebook has more than 18 million users in Myanmar. For many, the social media site is their main or only way of getting and sharing news.
The network said it had made progress in tackling its problems in Myanmar but that there was "more to do".
Last year, the Myanmar military launched a violent crackdown in Rakhine state after Rohingya militants carried out deadly attacks on police posts.
Thousands of people died and more than 700,000 Rohingya fled to neighbouring Bangladesh. There are also widespread allegations of human rights abuses, including arbitrary killing, rape and burning of land.
The Rohingya are seen as illegal migrants in Myanmar (also called Burma) and have been discriminated against by the government and public for decades.
The new report was commissioned after the UN accused Facebook of being "slow and ineffective" in its response to the spread of hatred online.
The 62-page independent report from non-profit organisation Business for Social Responsibility (BSR) found that the platform "has become a means for those seeking to spread hate and cause harm" in Myanmar.
"A minority of users are seeking to exploit Facebook as a platform to undermine democracy and incite offline violence."
It said Facebook should more strictly enforce its existing policies on hate speech, introduce a "stand-alone human rights policy" and better engage with authorities in Myanmar.
The report - which referenced the Rohingya only briefly - also cautioned that the 2020 elections presented a serious risk of further human rights abuses, and urged Facebook to prepare now for "multiple eventualities".
Facebook's product policy manager, Alex Warofka, said in a statement: "We agree that we can and should do more."
"We have invested heavily... to examine and address the abuse of Facebook in Myanmar."
He said Facebook was "looking into" setting up a human rights policy and was making it easier to report and remove violent or inciting content.
Mr Warofka said the company now employs Burmese language specialists to review potentially sensitive content.
Much of Myanmar communicates online using the Zawgyi font, which is not easily translated into English, making harmful content harder to detect.
Facebook has removed Zawgyi as a language option for new users and is supporting Myanmar's transition to Unicode - the international text encoding standard.
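The detection problem arises because Zawgyi reuses the same Unicode code points as standard Myanmar text but orders some characters differently, so the two encodings cannot be told apart by code point alone. As a simplified illustration (real detectors, such as Google's open-source myanmar-tools, use trained statistical models rather than a single rule), one well-known difference is that Zawgyi types the vowel sign E (U+1031) before the consonant it modifies, while standard Unicode places it after:

```python
def looks_like_zawgyi(text: str) -> bool:
    """Toy heuristic: flag the Zawgyi-style ordering where the
    vowel sign E (U+1031) appears BEFORE a Myanmar consonant.
    In well-formed Unicode, U+1031 always follows a consonant."""
    E_VOWEL = "\u1031"                    # MYANMAR VOWEL SIGN E
    consonants = range(0x1000, 0x1022)    # Myanmar consonants KA..A
    for i, ch in enumerate(text):
        if ch == E_VOWEL:
            nxt = text[i + 1] if i + 1 < len(text) else ""
            prev = text[i - 1] if i > 0 else ""
            # Zawgyi pattern: E-vowel precedes a consonant and does not
            # itself follow one (e.g. it starts the syllable).
            if nxt and ord(nxt) in consonants and (
                not prev or ord(prev) not in consonants
            ):
                return True
    return False
```

A heuristic this crude would misclassify plenty of real text; it only shows why content written in Zawgyi can slip past moderation tools built for standard Unicode.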
Facebook has already banned several Myanmar military and government figures it said had "inflamed ethnic and religious tensions".