Acute glaucoma is one of the few emergencies in ophthalmology. It occurs when the eye's drainage pathway suddenly becomes blocked. Intraocular pressure shoots up, causing severe eye pain and blurred vision (at least in most people).
about 7 years ago
I’m a klutz. Always have been. Probably always will be. I blame my clumsiness on the fact that I didn’t crawl. Apparently I was sitting around one day and toddling on two feet the next. Whatever the cause, it’s a well-tested fact that I’m not good on icy footpaths. Various parts of my anatomy have gotten up close and personal with frozen ground on many an occasion. Not usually an issue for a born-and-bred Australian, except when said Australian goes to visit her Canadian family during the northern winter.

During one such visit, I found myself unceremoniously plopped onto slick ice while my two-year-old niece frolicked around me with sure-footed abandon. I thought, “There has to be an easier way.” As freezing water seeped through my jeans, providing a useful cold pack for my screaming coccyx, my memory was jogged. I recalled that a lateral-thinking group of New Zealand researchers had won the Ig Nobel Prize for Physics for demonstrating that wearing socks on the outside of shoes reduces the incidence of falls on icy footpaths. To the amusement of my niece, I tried out the theory for myself on the walk home. I don’t know whether I had a more secure foothold or not, but I did manage to get blisters from wearing sneakers without socks.

I love socks. They cover my large, ungainly clod-hoppers and keep my toes toasty warm almost all year round. You know the song ‘You Can Leave Your Hat On’? Well, for me it’s more a case of ‘You can leave your socks on’, especially in winter. There’s nothing unromantic about that… is there?

I’m not, however, as attached to my socks as a patient I once treated. As an intern doing a psychiatry rotation, one of my tasks was to do physical examinations on all admissions. Being a dot-the-i’s kinda girl, when an old homeless man declined to remove his socks so that I could examine his feet, I didn’t let it slide. “I haven’t taken off my socks for thirty years,” he pronounced. “It can’t be that long. Your socks aren’t thirty years old. 
In fact, they look quite new,” I countered. “When the old ones wear out, I just slip a new pair over the top.” I didn’t believe him. From his odour, I would have believed that he hadn’t showered in thirty years, but the sock story didn’t add up. He eventually agreed to let me take them off. The top two sock layers weren’t a problem, but then I ran into trouble. Black remains of what used to be socks clung firmly to his feet, and my gentle attempts at their removal resulted in screams of agony. I tried soaking his feet. Still no luck. His skin had grown up into the fibres, and it was impossible to extract the old sock remnants without ripping off skin.

In retrospect I probably should have left the old man alone, but instead I got the psych registrar to have a peek, who then involved the emergency registrar, who called the surgeon, and soon enough the patient and his socks were off to theatre. The ‘surgical removal of socks’ was not a commonly performed procedure, and it provided much staff amusement. It wasn’t so funny for Mr Sock Man, who required several skin grafts!

From my perspective here in Canada, while I thoroughly commend the Kiwis for their ground-breaking sock research, I think I’ll stick to the more traditional socks-in-shoes approach, change my socks regularly and work a bit on my coordination skills.

Reference: Parkin L, Williams S, Priest P. Preventing winter falls: a randomised controlled trial of a novel intervention. New Zealand Medical Journal 2009; 122(1298): 31-8. (This study, demonstrating that people slip and fall less often on icy footpaths if they wear socks over their shoes, won the Ig Nobel Prize for Physics.)
(This blog post has been adapted from a column first published in Australian Doctor http://www.australiandoctor.com.au/articles/58/0c06f058.asp) Dr Genevieve Yates is an Australian GP, medical educator, medico-legal presenter and writer. You can read more of her work at http://genevieveyates.com/
Dr Genevieve Yates
about 7 years ago
It was a Saturday, about tea-time, in the quaint village of Athelstaneford, East Lothian. Mrs Alexandria Agutter sat in her cottage, enjoying the delights of the late-summer evening with a glass of gin and tonic. She listlessly sipped from the rather generous pick-me-up, no doubt chewing over the happenings of the day. Blast! The taste was much too bitter for her liking. She stood up. And promptly crumpled to the floor in a dizzied heap. Not five minutes had passed when a fiery pain gripped her parched throat, and in her frenzied turn she watched the bleary room become draped in a gossamery silk. How Dame Agatha would approve.

But this is no crime novel. On that fateful day, 24th August 1994, poor Mrs Agutter immortalised herself in the history books of forensic medicine; she was the victim of a revered toxin, and a vintage one it was too. She had unwittingly imbibed a G&T laced with a classic poison of antiquity.

A clue from the 21st century: do you recall the first Hunger Games film adaptation? Those inviting purple-black berries, or as Suzanne Collins coined them, ‘nightlock’: a portmanteau of hemlock and deadly nightshade. True to the latter’s real-life appearance, those onscreen fictional fruits played a recurring cameo role.

Deadly nightshade is a perennial shrub of the family Solanaceae and a relative of the humble potato (a member of the Solanum genus). It is a resident of our native woodland and may be found as far afield as Europe, Africa and Western Asia. The 18th-century taxonomist Carl Linnaeus gave the plant an intriguing name in his great Species Plantarum. The genus Atropa is aptly named after one of the three Greek Fates, Atropos, who is portrayed shearing the thread of a mortal’s life, so determining the time and manner of its inevitable end. The Italian species name belladonna (‘beautiful woman’) refers to the striking mydriatic effect of the plant on the eye. 
The name pays homage to Pietro Andrea Mattioli, a 16th-century physician from Siena, who was allegedly the first to describe the plant’s use among the Venetian glitterati – ladies of fashion favoured the seductive, doe-eyed look. Belladonna is poisonous in its entirety. It was from the plant’s roots, in 1831, that the German apothecary Heinrich F. G. Mein isolated a white, odourless, crystalline powder: it was (surprise, surprise) atropine.

Atropine is a chiral molecule. In its natural plant source it exists as a single stereoisomer, L-atropine, which also happens to display a chiral potency 50-100 times that of its D-enantiomer. As with many other anaesthetic agents, it is administered as a racemic mixture. How strange that atropine now sits among the anaesthetist’s armamentarium; its action as a competitive antimuscarinic to counter vagal stimulation belies its dark history. It was a favourite of Roman housewives seeking retribution against their less-than-faithful husbands and a staple of the witch’s potion cupboard. Little wonder that belladonna became known as the Devil’s plant. Curiouser still, it is also the antidote for other poisons, most notably the organophosphates, or nerve gases.

On account of its non-selective antagonism, atropine produces a constellation of effects. Inhibition of the salivary, lacrimal and sweat glands occurs at low doses; dry mouth and skin are early markers. Pyrexia is a central effect, exacerbated by the inability to sweat. Flushing of the face results from vasodilatation of the skin vessels. Low parasympathetic tone causes a moderate sinus tachycardia. Vision blurs as the pupil becomes dilated and unresponsive to light, and accommodation is impaired. Mental disorientation, agitation and ataxia give the impression of drunkenness or a delirium-tremens-like syndrome. Visual hallucinations, often of butterflies or silk blowing in the wind, are a late feature. 
It was then that Mr Agutter, seemingly untroubled by the sight of his wife’s problematic situation, proceeded to leave a message with the local practitioner. How fortunate they were that the vigilant locum checked the answering machine and came round to the Agutters’ lodge accompanied by an ambulance crew. The attending paramedic had the presence of mind to pour the remainder of Mrs Agutter’s beverage into a nearby jam jar, while Mr Agutter handed over what he suspected to be the offending ingredient: the bottle of Indian tonic water.

As it soon transpired, there were seven other casualties in the surrounding countryside of East Lothian – all involving an encounter with tonic water. In fact, by some ironic twist of fate, two of the victims were the wife and son of Dr Geoffrey Sharwood-Smith, a consultant anaesthetist. Obviously very familiar with the typical toxidrome of anticholinergic agents, he was quick to suspect atropine poisoning – although for a man of his position, with daily access to a sweetshop of drugs, it was not something to draw attention to.

Through no small amount of cunning had the poisoner(s) devised the plan. It was elegant; atropine is very bitter – so much so that it can be detected at concentrations of 100 parts per million (0.01%). Those foolish enough to try the berries of belladonna during walks in the woods are often saved by the berries’ bitter taste; they are soon spat out. But the quinine in the tonic water was a worthy disguise. The lethal dose for an adult is approximately 90-130mg; however, atropine sensitivity is highly variable. In its salt form, atropine sulfate, it is many times more soluble: more than 100g can be dissolved in 100ml of water, so 1ml may contain roughly tenfold the lethal dose.

There ensued a nationwide scare; 50,000 bottles of Safeway-branded Indian tonic water were sacrificed. Only six bottles had been contaminated. They had all been purchased, tops unsealed, from the local Safeway in Hunter’s Tryst. 
Superficially this looked like the handiwork of a psychopath with a certain distaste for the supermarket brand, and amidst the media furore it did have some verisimilitude: one of the local papers received a letter from a 25-year-old, Wayne Smith, admitting himself as the sole perpetrator.

The forensic scientist Dr Howard Oakley analysed the contents of the bottles. They all contained a non-lethal dose, 11-74mg/litre of atropine, except for the Agutters’: it contained 103mg/litre. The jam jar holding Mrs Agutter’s drink bore even more sinister results: the atropine concentration was 292mg/L. It would appear Mrs Agutter had in some way outstayed her welcome. But she lived – a miscalculation on the part of the person who had added an extra seasoning of atropine to her drink. According to the numbers, she would have had to swallow a can’s worth (330ml) to reach the lethal dose. Thankfully she had taken no more than 50mg.

The spotlight suddenly fell on Dr Paul Agutter. He was a lecturer in biochemistry at nearby Napier University, which housed a research syndicate specialising in toxicology. CCTV footage had revealed his presence at the Safeway in Hunter’s Tryst, and there was eyewitness evidence of him having placed bottles onto the shelves. Atropine was also detected by the forensic investigators on a cassette case in his car. Within a matter of two weeks he would be arrested for the attempted murder of his wife. Despite the calculated scheme to delay the emergency services and to pass the blame onto a non-existent mass poisoner, he had not accomplished the perfect murder.

Was there a motive? Allegedly his best-laid plans were for the sake of a mistress, a mature student from Napier. He served seven years of a twelve-year sentence. Astonishingly, upon his release from Glenochil prison in 2002, he contacted his by-then former wife proclaiming his innocence and his desire to rejoin her in their Scottish home – a proposition she was not very keen on. 
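For the numerically inclined, the figures quoted in the case do hang together. A minimal back-of-the-envelope sketch (using only the concentrations and doses reported above, and taking 90mg as the lower bound of the lethal range):

```python
# Back-of-the-envelope check on the figures reported in the Agutter case.
JAR_CONC_MG_PER_L = 292      # atropine concentration in Mrs Agutter's glass
LETHAL_LOWER_MG = 90         # lower bound of the quoted adult lethal range (90-130mg)
INGESTED_MG = 50             # the most she is thought to have swallowed

# Volume she would have needed to drink to reach the lethal threshold
vol_for_lethal_ml = LETHAL_LOWER_MG / JAR_CONC_MG_PER_L * 1000
print(f"Volume to lethal dose: {vol_for_lethal_ml:.0f} ml")   # ~308 ml, roughly a can's worth

# Implied volume actually drunk, given no more than 50mg was taken
vol_drunk_ml = INGESTED_MG / JAR_CONC_MG_PER_L * 1000
print(f"Volume actually drunk: {vol_drunk_ml:.0f} ml")        # ~171 ml
```

So at 292mg/L she would indeed have had to finish roughly a can-sized serving (about 310ml) to reach even the low end of the lethal range, consistent with the "miscalculation" described.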
Dr Agutter was employed by Manchester University as a lecturer in philosophy and medical ethics. He is currently an associate editor of the online journal Theoretical Biology and Medical Modelling. We will never know the true modus operandi, as Dr Agutter never confessed to the crime. Perhaps all this story can afford is weak recompense for the brave followers of the Dry January campaign. Oddly, these sorts of incidents never appear in their motivational testimonials.

Acknowledgements

Emsley J. Molecules of Murder. Cambridge: RSC Publishing; 2008. p. 46-67.
Lee MR. Solanaceae IV: Atropa belladonna, deadly nightshade. J R Coll Physicians Edinb 2007; 37: 77-84.

Illustrator: Edward Wong

This blog post is a reproduction of an article published in The Medical Student newspaper, January 2014 issue. http://www.themedicalstudent.co.uk/
about 7 years ago
Imagine a world where procrastination became a productive pastime… Procrastination, as it stands, is a core feature of the ‘human condition’, and most would argue that it is here to stay. But what if we could hijack the time we spend playing Candy Crush Saga and trick ourselves into contributing towards something tangible? Today, I wish to explore this possibility with you.

The term ‘gamification’ is not a new or made-up word (I promise), although I agree it does sound jarring and I certainly wouldn’t recommend trying to use it in a game of Scrabble (yet). It refers to the process of applying game thinking and game mechanics to non-game contexts in order to engage users in solving problems. For the purposes of this blog, ‘problems’ will equate to promoting healthy living for our patients and maintaining our own medical education.

For one reason or another, most people show addictive behaviour towards games, especially when they incorporate persistent elements of progression, achievement and competition with others. The underlying psychology won’t be discussed here; call it escapism, call it procrastination, call it whatever you will. What I want you to realise is that every day millions of people spend hours tending to virtual farms and cyber families whilst competing vigorously with ‘online’ friends. If we can take the addictive aspects of these popular games and incorporate them into the non-game contexts I alluded to above, we could potentially trick ourselves, and perhaps even our patients, into a better way of life.

The first time I heard the term ‘gamification’ was only last year. I was in Paris attending the Doctors 2.0 conference, listening to talks on how cutting-edge technologies and the Internet had been (or were going to be) incorporated into healthcare. 
One example that stood out to me was a gaming app intended to engage people with diabetes in recording their blood sugars more regularly and competing with themselves to achieve better sugar control. People with diabetes mellitus are continuously reminded of their diet and their blood sugar levels. I am not diabetic myself, but it is not hard to see that diet and sugar control are going to be an absolute nightmare for people with diabetes, from both a practical and a psychological standpoint.

Cue the mySugr Companion, an FDA-approved mobile application created to incorporate the achievement and progression aspects of game design to help encourage people with diabetes to achieve better sugar control. The app was a novel concept that struck a chord with me because of its potential to appeal to the part of everyone’s brain that makes them sit down and play ‘just one more level’ of their favourite game or app. There are several other apps on the market that are games designed to encourage self-testing of blood sugar levels in people with diabetes. There is even a paediatric example, “Monster Manor,” launched by Sanofi UK (who previously released the FDA/CE-approved iBGStar iPhone blood glucose monitor). So applying aspects of game design to disease-management apps has anecdotally been shown to benefit young people with diabetes.

However, disease management is just one area where game-health apps have emerged. We are taught throughout medical school and beyond that disease prevention is beneficial to both our patients and the health economy. Unsurprisingly, one of the best ways to prevent disease is to maintain health, whether through exercise or healthy eating. A prominent example of an app that helps to engage users in exercising is RunKeeper, a mobile app that enables people to track and publish their latest jog around the park. 
The elements of game design are a little more subtle in this example, but the ability to track your own progress and compete with others via social media share buttons certainly reminds me of similar features seen in most of today’s online games. Other examples of ‘healthy living’ apps are rife in the respective app stores, and there seems to be ample opportunity to apply gamification in this field. One option might be to incorporate aspects of game design into a smoking-cessation app or a weight-loss helper. Perhaps the addictive quality of a well-designed game-app could overpower the urge for confectionery or that ‘last cigarette’…

The last area where I think gamification could have a huge benefit is in (medical) education. Learning and revising are particularly susceptible to the rot of procrastination, so it goes without saying that many educational vendors have already attempted to incorporate fresh ways to engage their users to put down the TV remote and pick up some knowledge for the exams. Meducation itself already has an area on its website entitled ‘Exam Room’, where you can test yourself, track your progress and provide feedback on the questions you are given. I have always found this a far more addictive way to revise than sitting down with pen and paper to revise from a book.

However, I feel there could be far greater incorporation of game design in the field of medical education. Perhaps the absolute dream for like-minded gamers out there would be a super-gritty medical simulator that exposes you to common medical emergencies from the comfort of your own computer screen. I mean, my shiny new gaming console lets me pretend to be an elite soldier deep behind enemy lines, so why not let me pretend, and practise, to be a doctor too? You could even have feedback functionality to indicate where your management might have deviated from the optimum. 
Perhaps more sensibly, the potential also exists to build on the existing banks of online medical questions to incorporate further aspects of social media interaction, achievement unlocks and inter-player competition (because, in case you hadn’t noticed, medics are a competitive breed). I have given a couple of very basic examples of how aspects of game design have emerged in recent health-related apps. I feel this phenomenon is in its infancy. The technology exists for so much more than the above; we just need to use our imagination… and learn how to code.
Dr. Luke Farmery
about 7 years ago
BOXING Day, 1.30am. “Are you the doctor on call?” I wrenched my reluctant brain from its REM state. “Yes.” “I’m worried about my wife. She’s 16 weeks pregnant and very gassy.” “Gassy?” “Burping and farting. Smells terrible! It’s keeping us both awake. I’m worried it could be serious.” By the time I ascertained that there were no sinister symptoms and that the likely culprit was the custard served with Christmas pudding (the patient was lactose intolerant), I was wide awake. My brain refused to power down for hours, as if out of spite for being so rudely aroused. I have a confession to make. When the Australian Federal Government announced that it was planning to abolish after-hours practice incentive payments, I was delighted. I know, I know, I should have been outraged along with the rest of you. After all, the RACGP predicted that after-hours care would be decimated if incentives were removed. Comparisons were made with the revamp of the UK system in 2004, which led to 90% of the profession opting out of after-hours work. Much as I sympathised, I was secretly rubbing my hands together with selfish glee. Surely this would mean that our semi-rural practice would stop doing all of our own on-call and free me from my after-hours responsibilities? I detest being on call. I loathe it with a passion completely out of proportion to the imposition it actually causes. I’m on call for the practice and our local hospital only once a week and the workload isn’t onerous. Middle-of-the-night calls aren’t all that frequent, but my sleep can be disturbed by their mere possibility, leaving me tired and cranky. If I’m forced suddenly into “brain on, work mode” by a phone call, I can kiss hours of precious slumber goodbye. I love to sleep, but, as with drawing and tennis, I’m not very good at it. I gaze with envy at those lucky devils who nap on public transport and fight malicious urges to disturb their peaceful repose. 
If I’m not supine, in a quiet, warm room, with loose-fitting clothing, a firm mattress and a pillow shaped just-so, I can forget any chance of sleep. Let’s just say I can relate to the Princess and the Pea story. I bet she wouldn’t have coped well with being phoned in the middle of the night either. If these nocturnal calls were all bona fide emergencies, I wouldn’t mind so much. It’s the crap that really riles me. I’ve received middle-of-the-night phone calls from patients who are constipated, patients with impacted cerumen (“Me ear’s blocked, Doc. I can’t sleep”) and patients with insomnia who want to know if it’s safe to take a second sedative. The call that took the on-call cake for me, though, was from a couple who woke me at 11.30 one night to settle an argument. “My husband says that bacteria are more dangerous than viruses but I reckon viruses are worse. After all, AIDS is a virus. Can you settle it for us so we can get some sleep? It would really help us out.” I kid you not. Genevieve Yates is an Australian GP, medical educator, medico-legal presenter and writer. You can read more of her work at http://genevieveyates.com
Dr Genevieve Yates
about 7 years ago
By Genevieve Yates One reason why I chose to do medicine was that I didn’t always trust doctors – another being access to an endless supply of jelly beans. My mistrust stemmed from my family’s unfortunate collection of medical misadventures: Grandpa’s misdiagnosed and ultimately fatal cryptococcal meningitis, my brother’s missed L4/L5 fracture, Dad’s iatrogenic brachial plexus injury and the stuffing-up of my radius and ulna fractures, to name a few. I had this naïve idea that my becoming a doctor would allow me to be more in charge of the health of myself and my family. When I discovered that doctors were actively discouraged from treating themselves, their loved ones and their mothers-in-law, and that a medical degree did not come with a lifetime supply of free jelly beans, I felt cheated. I got over the jelly bean disappointment quickly – after all, the allure of artificially coloured and flavoured gelatinous sugar lumps was far less strong at age 25 than it was at age 5 – but the Medical Board’s position regarding self-treatment took a lot longer to swallow. Over the years I’ve come to understand why guidelines exist regarding treating oneself and one’s family, as well as close colleagues, staff and friends. Lack of objectivity is not the only problem. Often these types of consults occur in informal settings and do not involve adequate history taking, examination or note-making. They can start innocently enough but have the potential to run into serious ethical and legal minefields. I’ve come to realise that, like having an affair with your boss or lending your unreliable friend thousands of dollars to buy a car, treating family, friends and staff is a pitfall best avoided. Although we’ve all heard that “A physician who heals himself has an idiot for a doctor and a fool for a patient”, large numbers of us still self-treat. I recently conducted a self-care session with about thirty very experienced GP supervisors whose average age was around fifty. 
When asked for a show of hands as to how many had their own doctor, about half the group confidently raised their hands. I then asked them to lower their hands if their nominated doctor was a spouse, parent, practice partner or themselves. At least half the hands went down. When asked if they’d seek medical attention if they were significantly unwell, several of the remainder said, “I don’t get sick,” and one said, “Of course I’d see a doctor – I’d look in the mirror.”

Us girls are a bit more likely to seek medical assistance than the blokes (after all, it is pretty difficult to do your own Pap smear – believe me, I’ve tried), but neither gender can be held up as a shining example of responsible, compliant patients. It seems very much a case of “Do as I say, not as I do”. I wonder how much of this is due to the rigorous “breed ’em tough” campaigns we’ve endured from the earliest days of our medical careers.

I recall when one of my fellow interns asked to finish her DEM shift twenty minutes early so that she could go to the doctor. Her supervising senior registrar refused her request and told her, “Routine appointments need to be made outside shift hours. If you are sick enough to be off work, you should be here as a patient.” My friend explained that this was neither routine nor a life-threatening emergency, but that she thought she had a urinary tract infection. She was instructed to cancel her appointment, dipstick her own urine, take some antibiotics out of the DEM supply cupboard and get back to work. “You’re a doctor now; get your priorities right and start acting like one” was the parting message.

Through my work in medical education, I’ve had the opportunity to talk to several groups of junior doctors about self-care issues and the reasons for imposing boundaries on whom they treat, hopefully encouraging them to establish good habits while they are young and impressionable. 
I try to practise what I preach: I see my doctor semi-regularly and have an I’d-like-to-help-you-but-I’m-not-in-a-position-to-do-so mantra down pat. I’ve used this speech many times to my advantage, such as when I’ve been asked to look at great-aunt Betty’s ulcerated toe at the family Christmas get-together, and to write a medical certificate and antibiotic script for a whingey boyfriend with a man-cold. The message is usually understood, but the reasons behind it aren’t always. My niece once announced knowledgeably, “Doctors don’t treat family because it’s too hard to make them pay the proper fee.” This young lady wants to be a doctor when she grows up, but must have different reasons than I did at her age. She doesn’t even like jelly beans!

Genevieve Yates is an Australian GP, medical educator, medico-legal presenter and writer. You can read more of her work at http://genevieveyates.com/
Dr Genevieve Yates
over 7 years ago
A PowerPoint covering emergency presentations. A lot of this is from the Oxford Handbook of Clinical Medicine or Clinical Knowledge Summaries. I figure this stuff is something we should be able to rattle off for clinical finals. I must credit my slide on shock to DrCrunch. Visit his site (drcrunch.co.uk) and follow him on Twitter/Facebook/YouTube. There's a YouTube video where he actually talks through the Pac-Man diagram. I felt it was a brilliant way of explaining shock, so I put it in here! All images are from Google.
almost 8 years ago
Using the Mental Health (Care and Treatment) (Scotland) Act 2003: to carry out an emergency detention of a patient, is FY2 the most junior grade able to do this (in conjunction with a mental health officer)? i.e. could an FY1 detain someone?
almost 8 years ago
It's been a while since I've added any thoughts to this blog. In that time I have finished my Obs/Gynae placement, spent a week on the labour ward, and done the first week of my 4th-year surgical placement, all the while cramming in revision between various activities and general staying-alive measures. This, I feel, is how most people sitting their final written exams are spending their time, so I don't feel so alone.

I just want to bring to your attention one amazing incident that happened during my labour ward week. I was on a night shift and there wasn't a lot going on. Absolutely everyone was knackered; the registrar, who'd been on nights for the past week, was just chatting to me. I have never seen someone look so tired. Then the emergency alarm went off: a lady had a cord prolapse, which is an obstetric emergency with a high foetal mortality rate. I think it's amazing that the doctor went from nearly falling asleep to switched-on 'surgical mode' in an instant, successfully performed the C-section, delivering the baby in about a minute, then went back to being absolutely knackered and let the SHO close up the wound. It really impressed me and I felt it was something worth sharing.

Actually, I was incredibly surprised that I enjoyed Obs/Gynae. Women's health was a placement I was dreading; it was my last major knowledge gap and I didn't have a clue what it was going to be like. If my tutor for the block does read this, thank you for all your help and for getting me involved in everything. I would encourage other students who are going into it and feeling any level of apprehension to just throw yourselves into it and give 110% effort. It is a great placement for practising transferable skills (this is important to remember, especially if you don't have any desire to go into it – you CAN transfer and practise skills from elsewhere!) and getting heavily involved in patient care.

Also, I'd like to point out that the mother and baby were fine :)
almost 8 years ago
I started medical school in 2007 wanting to 'make people better'. I stopped medical school in 2010 facing the reality of not being able to get better myself: being ill, and later being diagnosed with several long-term health conditions. This post is about my transition from being a medical student to the other side – being a patient. There are many things I wish I had known about long-term health conditions and patients when I was a medical student. I hope that through this post, current medical students can become aware of some of these things and put them into practice as doctors themselves.

I went to medical school because I wanted to help people and make them better. I admired doctors up on their pedestals for their knowledge, skills and expertise to 'fix things'. The hardest thing for me was accepting that doctors can't always make people better – they couldn't make me better. Holding doctors so highly meant it was very difficult for me to accept their limitations when it came to incurable long-term conditions, and then to accept that, as a patient, I had the capacity myself to help my conditions and situation.

Having studied medicine at a very academic university, I had a very strict perception of knowledge. Knowledge was hard-and-fast medical facts taught in a formal setting. I worked all day and night learning the anatomical names of all the muscles of the eye, the cranial nerves and the citric acid cycle, not to mention the pharmacology in second year. Immersed in that academic, scientific environment, I equated expertise with PhDs and papers. It was a real challenge to realise that knowledge doesn't always have to be acquired through formal education: it can be acquired through experience. Importantly, knowledge acquired through experience is equally valid! 
This means the knowledge my clinicians have developed through studying and working is as valid as my knowledge of my conditions, symptoms and triggers, developed through experiencing them day in, day out. I used to feel cross about 'expert patients' – I have spent all these hours in a library learning the biochemistry and pharmacology, and 'Joe Bloggs' walks in and knows it all! That wasn't the right attitude, and it wasn't fair on patients. As an expert patient myself now, I have come to understand that we are experts through different means, and in different fields. My clinicians remain experts in the biological aspects of disease, but that's not the full picture. I am an expert in the psychological and social impact of my conditions. All aspects need to be taken into account if I am going to have holistic, integrated care – the biopsychosocial model in practice – and that's where shared decision-making comes in.

The other concept which has been shattered since making the transition from medical student to patient is that of routine. In my first rotation, orthopaedics and rheumatology, I lost track within the first week of how many outpatient appointments I sat in on. I didn't really think anything of them – they were just another 15-minute slot of time filled with learning in a very busy day. As a patient, my perspective couldn't be more different. I have one appointment with my consultant a year, and spend weeks planning and preparing, then a month recovering emotionally. Earlier this year I wrote a whole post just about this: The Anatomy of an Appointment. Appointments are routine for you – they are not for us!

The concept of routine applies to symptoms too. After my first relapse, I had an emergency appointment with my consultant, presenting with very blurred vision and almost total loss of movement in my hands. The very fact that I had requested an urgent appointment suggests how worried I was. 
My consultant's response in the appointment was "there is nothing alarming about your symptoms". I fully appreciate that my symptoms may not have meant I was going to drop dead there and then, and that in comparison to his patients in ICU, I was not as serious. But losing vision and all use of one's hands at the age of 23 (or any age, for that matter) is alarming in my book! I guess he was trying to reassure me, but it didn't come across like that!

I have a Chiari malformation (in addition to Postural Orthostatic Tachycardia Syndrome and Ehlers-Danlos Syndrome) and have been referred to a neurosurgeon to discuss the possibility of neurosurgery. It is stating the obvious to say that for a neurosurgeon, brain surgery is routine - it's their job! For me, the prospect of even being referred to a neurosurgeon was terrifying, before I even got to the stage of discussing the operation. It is not a routine experience at all! At the moment, surgery is not needed (phew!), but that initial contact with neurosurgeons illustrates the concept of routine and how much our perspectives differ.

As someone with three quite rare and complex conditions, I am invariably met in A&E with comments like "you are so interesting!". I remember sitting in the hospital cafeteria at lunch as a student and literally feasting on the 'fascinating' cases we had seen upstairs on the wards that morning. "Oh, you must go and see that really interesting patient with X, Y and Z!" I am so thankful that you all find medicine so interesting - you need that passion and fascination to sustain the ongoing learning and drive to be a doctor. I found it fascinating too! But I no longer find neurology that interesting - it is too close to home. Nothing is "interesting" if you live with it day in, day out. No matter what funky things my autonomic nervous system may be doing, there is nothing interesting or fascinating about temporary paralysis, headaches and the day-to-day grind of my symptoms.
This post was inspired by NHS Change Day (13th March 2013). As a patient, I wanted to share these few things with medical students - what I wish I had known when I was where you are now - to help the next generation of doctors become the very best doctors they can be. I wish you all the very best for the rest of your studies, and thank you very much for reading! Anya de Iongh www.thepatientpatient2011.blogspot.co.uk @anyadei
Anya de Iongh
about 8 years ago
I'm an SHO, but I don't have your typical ward-based job. In the last four years I have treated patients in jungles, underwater (in scuba gear), 5m from a gorilla, up a volcano, on a beach, at altitude, on safari, in a bog and on a boat. Expedition medicine is a great way to travel the world, take time out whilst expanding your CV, be physically and mentally challenged, and develop your skills and knowledge base.

As a doctor, you can undertake expeditions during your 'spare time', but it is more common for doctors to go on expeditions between F2 and specialty training. This is the ideal time because you have been working for the last seven years and either need a break, the NHS has broken you, or you don't know what you want to do with your career and need time to think. At this point I would recommend using your F2 course/study budget on an Expedition Medicine course. They are expensive, but the knowledge and skill base you gain makes you more prepared and competitive for expedition jobs.

There are many types of Expedition Medicine jobs, ranging from endurance sports races to scientific expeditions. Although the jobs differ, there are many ailments common to all. You should expect to treat diarrhoea and vomiting, insect bites, blisters, cuts, injuries, and GP complaints such as headaches and exacerbations of chronic illnesses. More serious injuries and illnesses can occur, so it is good to be as prepared as possible. To help, ensure your medical kit is labelled and organised (e.g. a labelled cannulation kit), keep your emergency kit accessible at all times, and be familiar with the casevac plan.

Your role as an Expedition Medic involves more than the treatment of clients. A typical job also includes client selection and education, risk assessment, updating casevac plans, stock-checking kit, health promotion, project management and writing debriefs.

What's Right For You? If you're keen to do Expedition Medicine, first think about where you want to go and then for how long.
Think hard about these choices. A six-month expedition through the jungle sounds exciting, but if you don't like spiders, creepy-crawlies and leeches, and the furthest you have travelled is an all-inclusive to Mallorca, then it might be best to start with a four-week expedition in France.

When you have an idea of what you want to do, there are many organisations that you can apply to, including:

Operation Wallacea
Raleigh
Across the Divide
World Challenge
Floating Doctors
Doctors Without Borders
Royal Geographical Society
Action Challenge
GapForce

Each organisation will have different aims, clients, resources and responsibilities, so pick one that suits you. Have fun, and feel free to post any questions below.
Dr Rachel Saunders
about 8 years ago