
The Los Angeles General Medical Center is calling for help in identifying a mystery patient.
Per LA Health Services, a 34-year-old John Doe was discovered around the Torrance area on July 31 with no formal indication of who he is.
He is said to have brown eyes and brown hair, stands 5 feet 7 inches tall and weighs approximately 166 lbs.
Because of patient confidentiality, the hospital cannot legally share details of the gentleman's condition or care plan, although a picture obtained by PEOPLE depicts him breathing through a ventilator in bed with pads covering his eyes.

Anybody who may know something about this man is urged to call Laura, the clinical social worker, between the hours of 8.30am and 5pm Monday to Friday.
Her number is 323-409-7779, and the social work department can also be reached at 323-409-5253.
This comes as New York City's Mount Sinai Hospital continues to deal with its own Jane Doe case.
On April 12, a woman was found on a bus stop bench in the Harlem neighbourhood after a concerned bystander called 911.
"The woman is known to frequent the area around 125th Street and Lenox Avenue, and hospital staff think she may go by the name 'Pam,'" a news release noted. "She usually dresses in black and hides her face."
Hospitalised for more than 100 days now, she is described as Black and probably in her 50s, standing 5 feet 8 inches tall and weighing 170 lbs, with greying black hair and dark brown eyes.

In other news, a 60-year-old man recently learned the hard way that the artificial intelligence chatbot ChatGPT is not a reliable source of sound medical advice.
In a report published in Annals of Internal Medicine: Clinical Cases, it was revealed how the chatbot's response to a man's questions about improving his diet ended up landing him in hospital.
Looking to cut table salt out of his diet - he had read about its adverse effects on human health - the man was wrongly encouraged by the technology to swap sodium chloride for sodium bromide.
Having purchased the bromide online and taken it for three straight months, he was hospitalised amid fears that his neighbour was trying to poison him, which prompted a discovery by the doctors treating him.
Within a day, his paranoia had escalated and he began complaining of both auditory and visual hallucinations.
"It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation," read a section of the ACP Journal.
"While it is a tool with much potential to provide a bridge between scientists and the nonacademic population, AI also carries the risk for promulgating decontextualized information.
"It is highly unlikely that a medical expert would have mentioned sodium bromide when faced with a patient looking for a viable substitute for sodium chloride."
After he attempted to escape the facility, doctors treated him with fluids, electrolytes and antipsychotics, and he was admitted to the inpatient psychiatry unit.
He had suffered a toxic reaction known as bromism, which is triggered by overexposure to bromide - a compound of bromine, an element commonly used in industrial cleaning.
ChatGPT developer OpenAI urges users not to turn to the tech for health diagnoses, with the company's Service Terms stating that its 'services are not intended for use in the diagnosis or treatment of any health condition'.