Cultural change could be just what's needed

All of us, at some time, will have the experience of being a patient. At such times we might feel vulnerable as we look to doctors, nurses and other healthcare professionals for help and advice. While most of our experiences will be positive, a significant minority of us will experience difficulties in our interactions with healthcare professionals. For example, last year, following a spate of similar reports across the UK, the Older People's Commissioner for Wales found consistent issues concerning the lack of dignity and respect patients received in hospital. These situations can cause real distress for patients, undermine the effectiveness of clinical treatment and sometimes affect how fast we recover.

I am interested in how this state of affairs comes about within an NHS that promotes respect, dignity and compassion for all. My research examines what happens to healthcare students during their training in clinical settings that means they sometimes have to be reminded that the person in front of them is a human being who deserves compassion and respect.

Today's healthcare students are explicitly taught what comprises professional values and behaviours. However, a large part of learning to become a healthcare professional occurs within the NHS, as students observe their seniors – who act as powerful role models – interacting with patients. Sometimes these role models were trained many years ago and belong to a different culture of medicine, with different ways of doing things.

People who belong to the same cultural group tend to embrace common characteristics such as language, customs and values. In doing so they embrace a common "cultural identity" and achieve a sense of belonging. Likewise, healthcare students tend to embrace the common characteristics of their chosen profession. They look to their seniors for guidance about how to behave.
But what if their seniors belong to a different era, where things that were acceptable then may no longer be acceptable now?

One strand of my research examines professionalism dilemmas: situations in which healthcare students witness, and sometimes participate in, something unethical or unprofessional, including breaches of patient safety and dignity. Students often report experiencing distress in such situations: they know the right way to behave, but feel unable to do so for some reason. In their stories, students frequently report feeling unable to speak out for fear of receiving poor grades (their seniors are also their teachers), because they are low in the pecking order, or because speaking out might hamper their future career.

So how can we support tomorrow's healthcare students to become ethical and compassionate professionals? Revalidation for doctors is coming into force and involves patient and colleague feedback, but our research suggests that, by itself, this is insufficient to change behaviours. We urge healthcare schools to provide students with a safe place to share their stories with each other and with ethical role models, so they can begin to make sense of their experiences, share good practice and find ways to resist bad practice. Most of all, we suggest that cultural change should come from within: patients, patient advocates, students and healthcare professionals should join together to examine how the language, practices and values of clinical settings can be developed to improve patient safety and dignity for all.
Dr Lynn Monrouxe
over 9 years ago
It is understandable why resume writing is daunting for most students – they haven't achieved many significant things at such a young age, and they find it difficult to present ordinary things as something extraordinary. However, you shouldn't give up, because you will be surprised by how many things your potential employers consider valuable. All you have to do is find the right way to demonstrate your achievements and relate them to the job you are applying for. The following tips will help you write a great resume that represents you as an ideal candidate to every employer.

1. Start the process by listing your experiences. You cannot tackle the challenge where it is most difficult, so work your way gradually towards precise professional language. Start by brainstorming a list of all the experiences you consider significant. You can draw on all aspects of life: school, academic activities, internships, prior employment, community service, sports, and whatever else you consider important. Look at that list and identify the most formative experiences that led you to where you are now.

2. Target the resume towards the job. Sending the same generic resume to all potential employers is a common mistake students make. Tailor a custom-written resume for each application, presenting the experiences and skills relevant to the position you're applying for.

3. Present yourself as a dynamic person. Find the most active components of your experiences and present them in the resume. Focus on action verbs, because they grab attention and make powerful statements (trained, evaluated, taught, researched, organized, led, oriented, calculated, interviewed, wrote, and so on).

4. Mark the most notable elements of your experiences and use them to start your descriptions.
An employer couldn't care less about the mundane aspects of college or internships, so feel free to leave them out and highlight yourself as a professional who would be a great choice as an employee.

5. Show what you can do for the organization. Employers are looking for candidates who can contribute to the growth of their companies, so portray yourself as someone who can accomplish great things in the role you are applying for. Review your experiences and highlight every success you achieved, no matter how small.

6. Don't forget that your most important job at the moment is being a student. While you're a student, that's the most important aspect of your life, and you should not forget to mention in your resume that you are an engaged learner. Include a high GPA and achievements in your major as important information.

7. Describe your most important academic projects. At this stage of life you don't have many professional experiences to brag about, but academic projects can also be included in your resume because they demonstrate your collaborative, critical-thinking, research, writing, and presentation skills.

8. Present yourself as a leader. If you were ever engaged as a leader in a project, include information about recruiting and organizing your peers, as well as training, leading, and motivating them.

9. Include information about community service. If all students knew that employers appreciate community service as an activity that shows a person has matured and cares for society, they wouldn't underestimate it so much. Include information about your volunteer activities – your potential employers will appreciate it.

10. Review before you submit! Your resume will require serious reviewing before you can safely send it to employers. This is not a place where you can allow spelling and grammatical errors to slip through.
The best advice would be to hire a professional editor to bring this important document to perfection. One of the most important things to remember is that writing a great resume requires a lot of time and devotion. Follow the steps above, and you will make the entire process less daunting.
over 8 years ago
In a recent article in the BMJ, the author wonders about the reasons behind the rising trend of diagnosing Attention Deficit Hyperactivity Disorder (ADHD), and attempts to infer reasons for it. One possible reason is that the diagnostic criteria, especially those of the DSM, may seem to some to be more inclusive than those of ICD-10. This speculation may explain the rise in diagnoses wherever the DSM is used officially or has an influence. In a constructive way, an alternative to rushing to diagnosis is offered and discussed in some detail.

The tentative deduction that the Diagnostic and Statistical Manual (DSM) may be one of the causes of rising diagnosis – because it raised the age cut-off and widened the inclusion criteria, as opposed to the International Classification of Diseases, 10th revision (ICD-10) – captured my attention. On reading the ICD-10 diagnostic criteria for research (DCR) and the DSM-5 diagnostic criteria, I found them quite similar in most respects, down to the phraseology that starts with 'Often' in many criteria; they seem to differ a little on age. Both classifications attempt to describe the disorder, but it sounds as if someone is trying to explain a person's behaviour to you. This is no substitute for direct clinical learning and observing the behaviour; the missing sentence seems to be 'when you see the person, it will be clearer'.

El-Islam agrees with the notion that DSM-5 seems a little more inclusive than ICD-10. A colleague of mine, a child psychiatrist writing her MSc thesis on ADHD, told me that DSM-5 seems a substantial improvement compared with its predecessor. The criteria, to her, though apparently more inclusive, are more descriptive, with many examples, and she infers that this will pay off in the reliability of the diagnosis. She hopes gene research can yield biological tests for the genes and neurotransmitters implicated in ADHD, e.g. DRD4, DAT, genes 5, 6, 11, etc.
One child psychiatrist regretted that misdiagnosis and under-diagnosis deprive patients of one of the most effective treatments in psychiatry. It is hoped that the forthcoming diagnostic classification (ICD-11) will address the diagnosis from a different perspective, or else converge with DSM-5 to provide coherence and a newer generalised standard of practice. The grading of ADHD into mild, moderate and severe seems to blur the border between disorder and non-disorder; however, this quasi-dimensional approach seems realistic, even though it does not yet translate directly into different treatment approaches, as it does for mild, moderate, severe, and severe depression with psychotic symptoms, or for intellectual disability.

The author states that one counter-argument could be that child psychiatrists are simply better at diagnosing the disorder. I wonder whether this is instead a reflection of a genuinely rising trend. If ADHD is compared with catatonia, which is generally agreed to be diagnosed less often now, then perhaps the epidemiology of ADHD is not an artefact, and we may need to look beyond the diagnosis to learn, for example, from environmental factors. Another issue is that there seem to be significant epidemiological differences in rates of diagnosis across cultures. This raises the question of whether ADHD can be classified as a culture-bound syndrome, whether it is influenced by culture in the way anorexia nervosa is, or whether the differences simply reflect rising awareness of such disorders.

Historically, it is difficult to pinpoint the closest predecessor to ADHD. For schizophrenia and mania, older terms may have included insanity; for depression it was probably melancholia; other terms still reside in contemporary culture, e.g. hypochondriasis, hysteria, paranoia. It would be too simplistic to believe that what ancient cultures meant by these terms is exactly what we mean by them now, but they are not too far apart.
ADHD seems to lack such historical underpinning. Crichton described a disorder he referred to as 'mental restlessness'. It is Still, however, who is most often credited with the first description of ADHD: in his 1902 address to the Royal College of Physicians, Still described a number of patients with problems of self-regulation or, as he then termed it, 'moral control' (De Zeeuw et al, 2011).

The costs and risks of over-diagnosis ring a warning bell for greater scrutiny in diagnosis, given the subsequent stigma, costs, and lowered societal expectations; all seem to stem from the methodology of diagnosis. The article touches on an important aspect of psychiatric diagnosis and classification: the subjective nature of disorders. The enormous effort behind DSM-5 and ICD-10 reflects the best available evidence, but in order to eliminate the subjective nature of illness, a biological test seems to be the only definitive answer, for ADHD in particular and for psychiatry in general. If ADHD is an illness, and a homogeneous one, developments in gene studies would seem to hold the key to understanding our current diagnostic status.

The suggested approach of using psychosocial interventions first, and administering treatment only after making sure it is necessary, seems quite reasonable. El-Islam agrees that in ADHD caution before giving treatment is a recommended course of action. Another consultant child psychiatrist mentioned that one hour might not be enough to reach a comfortable diagnosis of ADHD; it may take up to 90 minutes to become confident in a clinical diagnosis, in addition to the commonly used rating scales. On the other hand, families and carers may raise the issue of time pressure due to scholastic demands.

In a discussion with Dr Hend Badawy, a colleague child psychiatrist, she stated the following about her own experience and her opinion of the article.
The following is written with her consent.

'ADHD is a clinically based diagnosis with three core symptoms – inattention, hyperactivity and impulsivity – present in at least two settings. The risk of over-diagnosis in ADHD is potentially problematic; however, it is not confined to ADHD and can arise in other psychiatric diagnoses, as they rely on the subjective experience of the patient and on the doctor's interviewing skills. In ADHD in particular, the risk of under-diagnosis is even more problematic. An undiagnosed child with ADHD may suffer various complications: the moral stigma of 'bad conduct' due to impulsivity and hyperactivity, poor scholastic achievement, potential alienation, ostracism and even exclusion by peers due to perceived 'difference', consequent low self-esteem, and a potentially vengeful attitude on the child's part. The end result could be the development of substance use disorders, or involvement in dissocial behaviours.

The problem of over-diagnosis and under-diagnosis can be helped by an initial step of raising public awareness of ADHD, including campaigns aimed at families, carers, teachers and general practitioners. Such campaigns would help people identify children with possible ADHD. The only risk is that child psychiatrists may be presented with children whose parents believe they have the disorder when they do not. In a way, raising awareness can serve like a sensitive laboratory investigation. The next step is for the child psychiatrist to assess children carefully. The risk of over-diagnosis can be limited by the routine use of checklists, ensuring that practice is standardised and that every child is diagnosed properly according to the diagnostic criteria.
Proper scales such as the Strengths and Difficulties Questionnaire (SDQ), in its two forms (SDQ-P for parents and SDQ-T for teachers), enable the assessor to learn about the child's behaviour in two different settings. The Conners scale can help give a better understanding of the magnitude of the problem. Though some may criticise these tools because they are mainly completed by parents and teachers, they are the best tools available at hand. Training in diagnosis, regular auditing, and holding doctors to a standard practice of interviewing both child and carer thoroughly can help minimise the risk of over-diagnosis.

The issue does not stop at diagnosis; follow-up can indicate whether or not the child is improving on the management plan. The effects and side effects of treatments such as methylphenidate should be monitored regularly, including regular measurement of height and weight and attention to nausea, poor appetite, and even the rare side effects that are usually missed. More restrictions and supervision on the medication may have an indirect effect of enhancing the diagnostic assessment.

To summarise, public advocacy does not increase the risk of over-diagnosis, just as asking about suicidal ideas does not increase their risk. Awareness may help people learn more, empower them, and lead to greater acceptance of the diagnosed child in the community. Even the potential burden of more referrals for ADHD assessment may give doctors more exposure to cases and lead to more meaningful epidemiological findings. In my experience, it is quite unlikely to see a marked over-representation of children whose families suspect ADHD without sufficient evidence. ADHD remains a clinical diagnosis, and it is unlikely to be replaced by a biological marker or an imaging test in the near future.
After all, even if objective diagnostic tests emerge, their value will be doubtful without clinical diagnostic interviewing.'

It is ironic that two of the most effective treatments in psychiatry – methylphenidate and electroconvulsive therapy (ECT) – are also the two most controversial. Maybe this is because both were used before we had a full understanding of their mechanism of action; maybe because, at the outset, both seem unusual: electricity through the head, and a stimulant for hyperactive children.

Authored by E. Sidhom, H. Badawy

DISCLAIMER: The original post is on The BMJ doc2doc website at http://doc2doc.bmj.com/blogs/clinicalblog/#plckblogpage=BlogPost&plckpostid=Blog%3A15d27772-5908-4452-9411-8eef67833d66Post%3Acb6e5828-8280-4989-9128-d41789ed76ee
BMJ Article: http://www.bmj.com/content/347/bmj.f6172

Bibliography
Badawy, H., personal communication, 2013
El-Islam, M.F., personal communication, 2013
Thomas R, Mitchell GK, B.L., Attention-deficit/hyperactivity disorder: are we helping or harming?, British Medical Journal, 2013, Vol. 5(347)
De Zeeuw P., Mandl R.C.W., Hulshoff-Pol H.E., et al., Decreased frontostriatal microstructural organization in ADHD, Human Brain Mapping, 2011, DOI: 10.1002/hbm.21335
Diagnostic and Statistical Manual 5, American Psychiatric Association, 2013
Diagnostic and Statistical Manual IV, American Psychiatric Association, 1994
International Classification of Diseases, World Health Organization, 1992
Dr Emad Sidhom
over 8 years ago
This field of medicine requires much more physiological and pathophysiological knowledge than most people give it credit for. Psychiatric illnesses DO have physical manifestations; in fact, those symptoms help form the main criteria for differential diagnosis. For example, key physical symptoms of depression, besides a low mood for more than two weeks (yes, two weeks is all it takes to be classified as 'depressed'), include fatigue, change in appetite, unexplained aches and pains, changes in the menstrual cycle if you're female, altered bowel habits, abnormal sleep, etc. Aside from this, studies suggest that psychiatric illnesses put you at higher risk of physical conditions including heart disease, osteoarthritis, and more (the list really does go on).

Although some mental health conditions, like cognitive impairments, still do not have very effective treatment options, most psychiatric medications work very well and are necessary for treating the patient. The stigma surrounding them causes a huge problem for doctors. Many patients are reluctant to comply with medications because they are not as widely accepted as those for non-mental-health conditions. A psychiatrist holds a huge responsibility for patient education, and it can be tough to teach patients about their medication when many of them refuse to believe there is anything wrong with them (this, too, is because of stigma).

Contrary to my previous beliefs, psychiatrists DO NOT sit around talking about feelings all day. The stereotypical image of someone lying on a couch talking about their thoughts and feelings while the doctor holds up ink blots is closer to 'cognitive behavioural therapy'. While this is a vital healthcare service, it's not really what a psychiatrist does.
Taking a psychiatric history is just like taking a regular, structured medical history, except that you have to ask further questions about personal history (relationships, professional life, significant life events, etc.), forensic history, substance misuse history (if applicable), and childhood/developmental history. Taking a psychiatric history for a new patient usually takes at least an hour.

The interesting thing about treating a psychiatric patient is that the best guideline you have for making them healthy is their personality before the symptoms started (the 'pre-morbid personality'). This can be difficult to establish and can be an ambiguous goal for a doctor to reach. Of course, there is structure and protocol for each illness, but each patient is unique. This is a challenge because personalities constantly evolve, healthy or not, and the human mind is perpetual. On top of this, a serious illness, whether mental or physical, usually significantly impacts a person's personality. Most psychiatric conditions, while very treatable, are ones the patient will struggle with for their whole life. This leaves the psychiatrist with a large share of the responsibility for the patient's quality of life and well-being, which can be very rewarding and very challenging. The state of a person's mind is a perpetual thing; choosing the right medication is not enough.

Before this rotation, I was quite sure that this was a field I was not interested in. I still don't know if it is something I would pursue, but I'm definitely more open-minded about it now!

PS: It has also taught me some valuable life lessons; most of the patients I met were just ordinary people who were pushed a little too far by an unfortunate combination or sequence of circumstances in their life. Even the ones who had committed crimes or were capable of doing awful things...
It could happen to anyone, and just because I have been lucky enough to not experience the things those people have, does not mean I am a better person for not behaving the same way as them.
about 8 years ago
Hypo-politicosis: a behavioural condition in which political thought and action are dangerously below the optimal range, leading to the ostrich phenomenon – the delusionary belief that there is nothing outside of medicine.

In an age of ever-greater openness, communication and democratic rights, the population of the western world is disengaging from political ideology, political debate and political engagement. This disengagement is nowhere more prevalent than in the UK. The total membership of all the political parties is at its lowest since they were formed. There are fewer trade unionists today than a century ago. And, most importantly, the proportion of people who vote regularly is at an all-time low. Surely this is a sign of a dysfunctional democracy? Can we truly call it a democracy if the state's citizens have no interest in, or control over, how the state is run?

What worries me even more than this dire situation is the lack of interest in politics among fellow medical students. If you were to sit in a bar in a medical school city, I am sure you would hear groups of medical students unwinding over a pint and discussing political issues. But those issues almost always revolve around medicine: abortion laws, public health initiatives, doctors' pay and the restructuring of the NHS. This insular mindset worries me because there is more to life than medicine! While so much of our lives may be taken up with the learning and practice of medicine, our lives will be affected by so much more – and before medical school we all had to take an interest in so much more just to get an interview. Do you remember having the time and inclination to take an interest in something that wasn't medicine? Like reading history or poetry?

This insular mindset is detrimental because it means that, as a demographic group, we may not engage as fully as we should with the rest of society. This could be bad for us but, more importantly, bad for the greater society.
If medics become too disengaged from the greater political debates, we may find that society decides doctors are easy targets and easy scapegoats. We may find our working lives extended, our social lives curtailed, our pensions decimated and our earning power diminished because we did not engage with the public and discuss these issues openly. We may also lose influence with the government if medics don't vote for their local MPs, question their local party officials and fight our corner on important issues via the BMA.

The other side of this coin is that medics are selected from some of the brightest in the country, educated at great expense by the state, trained and employed by the state, and pay a huge amount of tax to the state. If we engage in politics less, then society as a whole may lose the benefit of highly intelligent, highly educated individuals – who should have a strong social conscience and an interest in a well-run state – putting their thinking skills to good use on societal problems.

Dr Liam Fox is a Conservative back-bench MP and used to be in the shadow cabinet. He has used the skills he developed as a doctor to try to follow an evidence-based political career. He recently released a book called "Rising Tides" (http://www.amazon.co.uk/Rising-Tides-Facing-Challenges-New-ebook/dp/B00CUE0DKQ), which analyses many of the world's current political issues, and I would highly recommend that as many people as possible read it.

I also hope that in future I can walk into a bar, meet some medical colleagues and talk about an issue that affects more of society than just medics! How about using a scientific approach to discuss how Britain's education system could be improved? Or how Britain could use its welfare resources better to decrease homelessness (which would also reduce the burden on A&Es)?
over 8 years ago
In initial interviews with patients who suffer psychotic symptoms, it can be striking that the terminology of descriptive psychopathology rests on an arbitration of the 'truth' of the patient's knowledge, using terms like delusions or hallucinations, defined as false beliefs or false perceptions (Casey & Kelly 2007). These terms can annihilate the value of the patient's experience, which may place an initial strain on the egalitarian patient-doctor relationship. In an era in which deference to experts is dead, it might be worthwhile to agree on the effect of these experiences before labelling them.

Delusions cannot be objectively detected and described, because they evolve and exist within subjective and interpersonal dimensions. Severe psychopathological symptoms share the fact that they are statistically deviant, and can thus be labelled 'unshared'. Symptoms may be perceived as 'distressing' and may be 'disabling' to the patient; the resulting behaviour that raises concern can be called 'dysfunctional' (Adams & Sutker 2004).

Jaspers considered the lack of understandability of how the patient reached a conclusion to be the defining factor of a delusional idea; he challenged the notion of defining 'delusion' as a false belief. Sims gives the example of a man who believed his wife was unfaithful to him because the fifth lamp-post on the left was unlit. What makes it a delusion is the methodology, not the conclusion, which may even be right (Sims 1991). Some delusions are mundane in their content; others may not be falsifiable. Dereistic thinking is based not on logic but on feelings, and it is possible to find ways to evade falsification; ad hoc hypotheses may also be part of the presentation. Fish stated that delusional elaboration may follow a delusion and/or hallucination, which converges with the concept of the ad hoc hypothesis.
Absence of verification from the patient's side does not amount to deductive falsification (Casey & Kelly 2007). Otherwise, the doctor-patient relationship risks transforming into a detective-suspect relationship, in which the latter may perceive the need to present evidence of innocence. Mental health professionals usually encounter people because they suffer to various degrees, or make others suffer, not because of their degrees of conviction. The primary role of the therapist is to alleviate the suffering of others rather than to correct their beliefs. Communicating with patients in terms of how functional a belief is, rather than its truth, may prove more egalitarian and more clinically tuned. This may provide some middle ground in communication, without having to define the differences between what is 'true' and what is 'real'. The criterion for demarcation between the real and the pathological may be different within the patient-doctor relationship, and assertion on the clinician's part of the falsity of a belief or experience carries the risk of dogmatism. The statistical deviance of symptoms, their distressing nature, their disabling consequences, the resulting dysfunctional behaviour, and the apparent leap from evidence to conclusion may be more agreeable surrogate starting points. This might be more in line with the essence of medicine, 'ars medicina' (the art of healing). Concordance with patients about their suffering may serve as an egalitarian platform prior to naming the symptoms.

The term delusion, commonly defined as a fixed false belief, does not, when used by a psychiatrist, merely address a symptom; it puts the interviewer in the position of an all-knowing judge. After all, a service-user may ask how a doctor who has never encountered or experienced any aspect of the problem, such as being persecuted at work and at home, can dismiss it as plainly false.
Does the psychiatrist, then, know the truth? From the service-user's point of view, what he or she experiences is real, even if it is not necessarily true. The same applies to people who lead average lives and go to work carrying their superstitions and beliefs about ghosts, luck, horoscopes, zodiacs, or various revered beliefs. The term risks creating a temporary crack in the mutual sense of equality between therapist and service-user, because one side labels a certain dysfunctional belief as unreal. It has the potential to subtly change the relationship, with the mental health professional placing himself or herself in an omniscient position; this contrasts with the essence of medical practice, in which practitioners assume the truth of what patients say about subjective symptoms, such as headache. Subsequent labels such as 'bizarre delusions' or 'systematised delusions' further shift the role of the professional therapist towards that of an investigator into the domain and architecture of 'Truth'.

Furthermore, it can strain the relationship when the therapist, on the basis of sceptical enquiry, starts explaining such symptoms. For example, if the service-user believes that Martians have abducted him, implanted a device in his brain and sent him back to earth, and the response communicated back is 'delusional', the service-user could argue that the therapist – who has never seen a Martian or a brain device – labelled the whole story a 'delusion' in a perceivedly dismissive way, with no intention of checking on the existence of the Martians or the device. In other words, the healer became the arbiter of truth: where both lack evidence for or against the whole story, one member of the relationship stepped into power on the basis of a subjective view of plausibility or the lack thereof.
In the case of hallucinations, the clinician's labelling of the patient's experience as hallucinations can impose a fundamental dilemma for the patient. For example, suppose a patient hears a voice that says that everything is unreal apart from the voice, and the clinician says that the voice is the thing that is unreal. Neither offers evidence for their 'truth' beyond their statement. The clinician's existence within the patient's subjective reality is itself distorted by the patient's multiple realities, and arguing on the basis of mere existence that the 'voice' is the one that is 'false' gives the patient no clue to a future methodology for discerning between the two, since perception is deceived and/or distorted. In this case, another tool of the mind can be employed to address the patient. The same can be applied to a concept like 'over-valued ideas', where the clinician decides that a particular idea is 'over-valued', or 'over-valued' in a pathological way. The value put on these ideas is not the patient's valuation but the clinician's evaluation of 'value' and 'pathology'. The cut-off point between 'value' and 'over-value' seems to be subjective from the clinician's perspective. Similarly, 'derailment' poses the notion of expecting a certain direction of talk, and the concepts of 'grooming' and 'eye contact' implicitly refer to socio-cultural normative values. Thus, deviation from a normative value is reflected back to the patient as pathology, which is an ambiguous basis for definition compared with the clarity of pathology elsewhere in medicine. The use of terms like 'dysfunctional unshared belief' or 'distressing auditory perception', or other terms that address the secondary effects of a pathological experience, may help to engage the patient and may be more logically plausible and philosophically coherent, yet they require empirical validation of their beneficence.
Taylor and Vaidya mention that it is often helpful to normalise, but not to minimise or be dismissive of a patient's delusional beliefs (Taylor & Vaidya 2009). The concept can be extended to cover other terms such as 'autistic thinking', 'apathy', 'blunting of affect', 'poor grooming' and 'over-valued ideas'; other terms can be found to communicate these to service-users with minimal deviation from the therapeutic relationship. The limitations of these terms in communicating psychopathology lie in special circumstances such as folie à deux, where a dysfunctional belief seems to be shared with others, and in symptoms such as Charles Bonnet syndrome, which usually has no negative consequences. The proposed terms are not intended as a replacement for well-carved descriptive psychopathological terms. Terms like 'delusion' or 'hallucination' are of value in teaching psychopathology. In practice, however, meaningful egalitarian communication may require some skill in selecting suitable terms, which is more than simplifying jargon. The new terms may also carry the burden of adding to the psychiatric terminology, with the subsequent effort of learning them, and they can be viewed as 'euphemism' or 'tautology'. However, this has been the case from 'hysteria' to 'medically unexplained symptoms', which seems to match the zeitgeist of an era whose mantra is 'Evidence Based Medicine', regardless of advances in treatment. Accuracy of terminology might be necessary to match the essence of scientific enquiry: systematic observation and accurate taxonomy. The author does not expect that such a proposal would be an easy answer to difficulties in communication during practice. This article may open a discussion on the most effective and appropriate terms that can be used while communicating with patients. It might also be more in line with an egalitarian approach to seek the opinion of service-users and of the professional bodies that represent them.
Empirical validation and subjection of the concept to testing are necessary. Patient care should not be based on logic alone but rather on evidence. Despite the limitations of such a proposal with regard to completeness, it is hoped that the introduction of any term may serve the main purpose of any classification or labelling: accurate, egalitarian communication.
DISCLAIMER: This blog is adapted from the BMJ doc2doc clinical blog 'Philosophical Streamlining of Psychopathology and its Clinical Implications' http://doc2doc.bmj.com/blogs/clinicalblog/_philosophical-streamlining-of-psychopathology-its-clinical-implications The blog is based on an article entitled 'Towards a More Egalitarian Approach to Communicating Psychopathology', published in the Journal of Ethics in Mental Health, 2013 http://www.jemh.ca/issues/v8/documents/JEMHVol8Insight_TowardsaMoreEgalitarianApproachtoCommunicatingPsychopathology.pdf
Bibliography
Adams, H. E., Sutker, P. B. (2004). Comprehensive Handbook of Psychopathology. New York: Springer Science
Casey, P., Kelly, B. (2007). Fish's Clinical Psychopathology: Signs and Symptoms in Psychiatry. Glasgow: Bell & Bain Limited
Kingdon, D. and Turkington, D. (2002). The Case Study Guide to Cognitive Behavior Therapy of Psychosis. Wiley
Kiran, C. and Chaudhury, S. (2009). Understanding delusion. Indian Journal of Psychiatry
Maddux, J. and Winstead, B. (2005). Psychopathology: Foundations for a Contemporary Understanding. Lawrence Erlbaum Associates Inc.
Popper, K. (2005). The Logic of Scientific Discovery. Routledge, United Kingdom
Sidhom, E. (2013). Towards a More Egalitarian Approach to Communicating Psychopathology. JEMH 2013; 8: 1. © 2013 Journal of Ethics in Mental Health (ISSN: 1916-2405)
Sims, A. (1991). Symptoms in the Mind: An Introduction to Psychopathology. Baillière Tindall
Taylor, M. and Vaidya, N. (2009). Descriptive Psychopathology: The Signs and Symptoms of Behavioral Disorders. Cambridge University Press
Dr Emad Sidhom
over 8 years ago
Throughout the different periods of Egyptian history, from the Pharaonic, Greco-Roman, Coptic and Islamic eras to the modern era, Egyptians have tended to respect, appreciate and care for the elderly. There is also a rich Eastern Christian tradition of respecting and taking care of old people that has continued since the first centuries of Christianity: churches used to develop retirement homes served by monastic personnel and nurses. Egyptian culture traditionally linked some aspects of mental illness to sin, possession by evil or separation from the divine, and mental illness is usually associated with stigmatisation for all family members. Forgetfulness with ageing, however, was normalised. Even now, the difference between normal ageing and dementia seems blurred for some people. Recently, the term 'Alzheimer' became popular, and some people use it as a synonym for forgetfulness. El-Islam stated that some people erroneously pronounce it as 'Zeheimer', removing the 'Al' on the assumption that it is the Arabic equivalent of the English 'the'. In 2010, a film was produced with the title 'Zeheimer', confirming the mispronunciation. The elderly face many health challenges which affect their quality of life. Dementia is one of these challenges, as it is one of the disorders that affect the elderly, compromising their memory, mental abilities, independence, decision making and most cognitive functions. The focus on dementia has therefore increased around the world due to the rapid spread of the syndrome and the economic and psychosocial burden it causes for patients, families and communities (Grossberg and Kamat 2011, Alzheimer's Association 2009, Woods et al. 2009). In recent years, the proportion of older people has been increasing due to improvements in health care and scientific development. This demographic transition, with ageing of the population, is a global phenomenon which may demand international, national, regional and local action.
In Egypt, people aged 65 and older make up less than 5% of the population (The World FactBook, 2012); yet the World Health Organization (WHO) asserts that a demographic shift is under way, with most of the world's rapidly ageing population soon to be found in low- and middle-income countries (WHO, 2012). Egyptian statistics support this shift. The Information and Decision Support Center published the first comprehensive study of the elderly in Egypt in 2008. According to the report, in 1986, 5 percent of Egyptians were aged 60 and older; by 2015 they will make up 11 percent of the population, and by 2050, over a fifth. Care of older persons constitutes an increasing segment of the Egyptian labour market. Nationwide statistics on the number of dementia sufferers in Egypt may be unavailable, but this demographic transition is expected to be accompanied by an increase in dementia patients in Egypt and will affect priorities in health care needs as well. Egyptian society may need adequate preparation with regard to health insurance, accommodation and care homes for the coming ageing population (El-Katatney, 2009). Although the number of care homes increased from 29 in 1986 to around 140 in 2009, they cannot serve more than 4,000 elderly people out of a total of 5 million. Not every older person will need a care home, but the homes across Egypt are serving less than 1% of the elderly population. These facts have created a new level of need for care homes, alongside the older people who require a non-hospital health care facility for assisted living. Egyptian tradition used to be strongly associated with the culture of the extended family and with caring for the elderly as a family responsibility. Yet in recent years, changes in economic conditions and factors such as internal and external migration may have negatively affected elderly care within family boundaries.
There is still stigma attached to sending the elderly to care homes. Some perceive it as a sign of intolerance from siblings towards their elderly parents, but it is generally more accepted nowadays. The need for care homes has therefore become a demand in Egypt at this time, as a replacement for the traditional extended family, since many older people nowadays either do not have the choice or lack the facilities to continue living with their families (El-Katatney 2009). Many families in Egyptian society seem to have moved from resisting the idea of transferring to a care home to gradual acceptance, as elderly care homes become more accepted than in the past and constitute a new concept of elderly care. Currently, many older people are looking to escape a lonely, empty home in search of human company or respite care, but the number of geriatric homes is far lower than required and many more are still needed (Abdennour, 2010). Thus, it seems that more care homes may be needed in Egypt. Dementia patients are usually over 65, which is one of the factors that put them at high risk of exposure to various physical conditions related to frailty, old age and altered cognitive function. Additionally, around 50% of people with dementia suffer from other comorbidities, which affect their health and increase hospital admissions (National Audit Office 2007). It is therefore expected that doctors and nurses will increasingly need to provide care for dementia patients in various care settings (RCN 2010). Considering these facts, there is an urgent need in Egypt to start raising awareness about normal and abnormal ageing and about what dementia means. Moreover, health policies and health services need to be developed to match community needs.
Another challenge is the very low number of psychiatric doctors and facilities: the current state of mental health care can be summarised as one psychiatrist for every 67,000 citizens and one psychiatric hospital bed for every 7,000 citizens (Okasha, 2001). Finally, developing gerontologically informed assessment tools for dementia screening, to be applied particularly in general hospitals (Armstrong and Mitchell 2008), would be very helpful for detecting dementia patients and for better communication and planning of care for the elderly.
References:
El Katateny, E. 2009. Same old, same old: In 2050, a fifth of Egyptians will be age 60 and older. How will the country accommodate its aging population? [Online] available at: http://etharelkatatney.wordpress.com/category/egypt-today/page/3/
Fakhr-El Islam, M. 2008. Arab culture and mental health care. Transcultural Psychiatry, vol. 45, pp. 671-682
Conference of European Churches. 2007. Ageing and care of the elderly. [Online] available at: http://csc.ceceurope.org/fileadmin/filer/csc/Ethics_Biotechnology/AgeingandCareElderly.pdf
World Health Organization. 2012a. Ageing and life course: ageing publications. [Online] available at: http://www.who.int/ageing/publications/en/
World Health Organization. 2012b. Ageing and life course: interesting facts about ageing. [Online] available at: http://www.who.int/ageing/about/facts/en/index.html
World Health Organization. 2012c. Dementia: a public health priority. [Online] available at: http://whqlibdoc.who.int/publications/2012/9789241564458_eng.pdf
World Health Organization. 2012d. Why focus on ageing and health, now?
Department of Health. 2009. Living well with dementia: a national dementia strategy. [Online] available at: http://www.dh.gov.uk/en/Publicationsandstatistics/Publications/PublicationsPolicyAndGuidance/DH_094058
Andrawes, G., O'Brien, L. and Wilkes, L. 2007. Mental illness and Egyptian families. International Journal of Mental Health Nursing, vol. 16, pp. 178-187
National Audit Office. 2007. Improving services and support for people with dementia. London. [Online] Available at: http://www.nao.org.uk/publications/0607/support_for_people_with_dement.aspx
Armstrong, J. and Mitchell, E. 2008. Comprehensive nursing assessment in the care of older people. Nursing Older People, vol. 20, no. 1, pp. 36-40
Okasha, A. 2001. Egyptian contribution to the concept of mental health. Eastern Mediterranean Health Journal, vol. 7, no. 3, pp. 377-380
Woods, R., Bruce, E., Edwards, R., Hounsome, B., Keady, J., Moniz-Cook, E., Orrell, M. and Russell, I. 2009. Reminiscence groups for people with dementia and their family carers: pragmatic eight-centre randomised trial of joint reminiscence and maintenance versus usual treatment: a protocol. Trials, vol. 10. [Online] available at: http://www.trialsjournal.com/content/10/1/64
Grossberg, G. and Kamat, S. 2011. Alzheimer's: the latest assessment and treatment strategies. Jones and Bartlett: United States of America
Alzheimer's Association. 2009. 2009 Alzheimer's disease facts and figures. Alzheimer's & Dementia, vol. 5, issue 3. [Online] Available at: http://www.alz.org/news_and_events_2009_facts_figures.asp
Royal College of Nursing. 2010. Improving quality of care for people with dementia in general hospitals. London.
Authors: Miss Amira El Baqary, Nursing Clinical Instructor, The British University in Egypt email@example.com Dr Emad Sidhom, MBBCh, ABPsych-Specialist in Old Age Psychiatry-Behman Hospital firstname.lastname@example.org
Amira El Baqary
about 8 years ago
So you're sitting in a bus when you see a baby smile sunnily and gurgle at his mother. Your automatic response? You smile too. You're jogging in the park when you see a guy trip over his shoelaces and fall while running. Your knee-jerk reaction? You wince, even though you're completely fine and unscathed yourself. Or, to give a more dramatic example: you're watching Titanic for the umpteenth time, and as you witness Jack and Rose's final moments together, you automatically reach for a tissue and wipe your tears in wholehearted sympathy (and maybe blow your nose loudly, if you're an unattractive crier like yours truly). And here the question arises - why? Why do we experience these responses to situations that have nothing to do with us directly? As mere passive observers, what makes us respond at gut level to someone else's happiness or pain, delight or excitement, disgust or fear? In other words, where does this instinctive response to other people's feelings and actions that we call empathy come from? Science believes it may have discovered the answer - mirror neurons. In the early 1990s, a group of scientists (I won't bore you with the details of who, when and where) were performing experiments on a bunch of macaque monkeys, using electrodes attached to their brains. Quite by accident, they discovered that when a monkey saw a scientist holding up a peanut, it fired off the same motor neurons in its brain that would fire when the monkey held up a peanut itself. And that wasn't all. Interestingly, they also found that these motor neurons were very specific in their actions. A neuron that fired when the monkey grasped a peanut would also fire only when the experimenter grasped a peanut, while a neuron that fired when the monkey put a peanut in its mouth would also fire only when the experimenter put a peanut in his own mouth. These motor neurons came to be dubbed 'mirror neurons'. It was a small leap from monkeys to humans.
And with the discovery of a similar, if not identical, mirror neuron system in humans, the studies, hypotheses and theories continue to build. The strange thing is that mirror neurons seem specially designed to respond to actions with clear goals - it doesn't matter whether these actions reach us through sight, sound, smell or another sense. A quick example - the same mirror neurons will fire when we hop on one leg, see someone hopping, hear someone hopping, or hear or read the word 'hop'. But they will NOT respond to meaningless gestures or random, pointless sounds. Rather, they seem to be understanding the intentions behind the related action. This has led to a very important hypothesis - the 'action understanding' ability of mirror neurons. Before the discovery of mirror neurons, scientists believed our ability to understand each other, to interpret and respond to another's feelings or actions, was the result of a logical thought process and deduction. However, if this 'action understanding' hypothesis is proved right, it would mean that we respond to each other by feeling, instead of thinking. For instance, if someone smiles at you, it automatically fires up your mirror neurons for smiling. They 'understand the action' and induce the same sensation within you that is associated with smiling. You don't have to think about what the other person intends by this gesture. Your smile flows thoughtlessly and effortlessly in return. Which brings us to yet another important curve - if mirror neurons are helping us to decode facial expressions and actions, then it stands to reason that those gifted people who are better at such complex social interpretations must have a more active mirror neuron system. (Imagine your mom's strained smile coupled with the glint in her eye after you've just thrown a temper tantrum in front of a roomful of people... it promises dire retribution, my friends. Trust me.)
Then does this mean that people suffering from disorders such as autism (where social interactions are difficult) have a dysfunctional or less-than-perfect mirror neuron system in some way? Some scientists believe it to be so. They call it the 'broken mirror hypothesis', claiming that malfunctioning mirror neurons may be responsible for an autistic individual's inability to understand the intention behind other people's gestures or expressions. Such people may be able to correctly identify an emotion on someone's face, but they wouldn't understand its significance. From observing other people, they don't learn what it feels like to be sad, angry, surprised or scared. However, the jury is still out on this one, folks. The broken mirror hypothesis has been questioned by others who remain skeptical about the very existence of these wonder neurons, or who wonder just how these neurons alone could have suffered such a developmental hit when the rest of the autistic brain is working just dandy. Other scientists argue that while mirror neurons may help your brain to understand a concept, they may not necessarily ENCODE that concept. For instance, babies understand the meaning behind many actions without having the motor ability to perform them. If this is true, then an autistic person's mirror neurons are perfectly fine... they were just never responsible for his lack of empathy in the first place. Slightly confused? Curious to find out more about these wunderkinds of the human brain? Join the club. Whether you're a passionate believer in these little fellas with their seemingly magical properties or still a skeptic, let me add to your growing interest with one parting shot - since imitation appears to be the primary function of mirror neurons, they might well be partly responsible for our cultural evolution! How, you ask?
Well, since culture is passed down from one generation to another through sharing and observation followed by imitation, these neurons are at the forefront of our lifelong learning from those around us. Research has found that mirror neurons kick in at birth, with infants just a few minutes old sticking their tongues out at adults doing the same thing. So do these mirror neurons embody our humanity? Are they responsible for our ability to put ourselves in another person's shoes, to empathize and communicate with our fellow human beings? That has yet to be determined. But after decades of research, one thing is for sure - these strange cells haven't ceased to amaze and we definitely haven't seen the last of them. To quote Alice in Wonderland, the tale keeps getting "curiouser and curiouser"!
about 8 years ago
Good morning all, Being new to blogging, it's surprising how difficult it is to start! I recently read Atul Gawande's three best-selling books and they were an inspiration. I am sure most medics will be aware of Mr Gawande (http://gawande.com/), the man behind the WHO safe surgery checklist. If you are not, and you want to read something that will really enthuse you about modern medicine, then please do get his books out from the library. I would recommend starting with "Better". The last chapter of "Better" is what prompted me to write this. Gawande has come up with 5 principles for being a "positive deviant", and 1 of them is - Just Write! He believes that to make our lives as doctors/medical students and the world a better place, we should all write down what we have been thinking about, because we may just come up with something that other people can use, or find others who have similar thoughts, which will help us build a sense of community together. Although I have made many previous New Year's resolutions to keep diaries and journals of my thoughts, they have always ended fairly quickly. This time may be different. Hopefully I will come up with some more thoughts that are vaguely worth sharing soon. Final thought for now - "Gawande-ism" = the belief that we can all make self-improvements and improve the world around us, little by little.
about 9 years ago
The book of the week this week has been Chris Patten's "Not Quite the Diplomat" – part autobiography, half recent history and a third political philosophy text. It is a fascinating insight into the international community of the last three decades. The book has really challenged some of my political beliefs – which I thought were pretty unshakeable – and one above all others: my view of the EU. I read this book to help me decide who I should vote for in the upcoming MEP elections. I have to make a confession: my political views are to the right of centre and I have always been quite a strong "Eurosceptic". Recently, I have found myself drifting further and further into the camp of "we must pull out of Europe at all costs", but Mr Patten's arguments and insights have definitely made me question this stance. With the European Parliamentary elections coming up, I thought it might be an interesting time to put some ideas out there for discussion. From a young age, I have been of the opinion that Great Britain is a world-leading country, still a great power, one of the best countries in the world - democratic, tolerant, fair, sensible - and that we don't need anyone else's "help" or interference in how our country is run. I believe that British voters should have a democratic input on the rules that govern them. To borrow an American phrase: "No taxation without representation!" I believe that democracy is not perfect but that it is the best system of government that humans have been able to develop. For all of its faults, voters normally swing back to the centre ground eventually and any silly policies can be undone. This system has inherently more checks and balances than any meritocracy, oligarchy or bureaucracy (taking the latter literally to mean being ruled by unelected officials). This is one of my major objections to how the European Union currently works. For all intents and purposes, it is not democratic.
Institutions of the EU include the European Commission, the Council of the European Union, the European Council, the Court of Justice of the European Union, the European Central Bank, the Court of Auditors, and the European Parliament. Only one of these institutions is elected by the European demos (the Parliament), and that institution doesn't really make any changes to any policies – "the rubber stamp brigade". The European Council is made up of the President of the European Council (unelected), the President of the European Commission (unelected) and the heads of the member states (elected), and is where quite a lot of the "major" policies come from, but not the red tape (that comes from the European Commission and Parliament). I am happy to be proved wrong, but it just seems that the EU, as a whole, is made up of unelected officials who increasingly try to make rules that apply to all 28 member states without any consent from the voters in those states – it looks like the rule of "b-euro-crats" (bureaucrats – this version has far too many vowels for a dyslexic person to use). A bureaucratic rule which many of us do not agree with but seemingly have to succumb to. A good example for medics is the European Working Time Directive (EWTD), which means that junior doctors only get paid for working 48 hours a week when they may spend many, many more hours at work. The EWTD has also made training a lot more difficult for many junior doctors and has many implications for how the health service is now run. Is it right that this law was imposed on us without our consent? If we imposed a treatment on a patient without their consent then we would be in very big trouble indeed! I cannot deny that the EU has done some good in the world, and I cannot deny that Britain has benefited from being a member. I just wish that we could pay to have access to the markets while retaining control over the laws in our lands. I want us to be in Europe as a partner, but not as a vassal.
In short, I would like us to stay within the EU but with major reforms. I know that any reforms I suggest will not be read by anyone in power, and I know they are probably unrealistic, but I thought I would put them out there just to see what people think. I would like to see a NICE'er European Union. The National Institute for Clinical Excellence is a Non-Departmental Public Body (NDPB): part of the UK Department of Health but a separate organisation (http://www.nice.org.uk/aboutnice/whoweare/who_we_are.jsp). NICE's role is to advise the UK health service and social services. It does this by assessing the available evidence for treatments, therapies, policies and so on, and then producing guidelines outlining the evidence and the suggested best course of action. None of these guidelines is enforced by law; as a doctor, for example, you do not have to follow the NICE recommendations, but if you ignore them and your patient suffers as a consequence then you are likely to be in big trouble with the General Medical Council. So, here are my recommendations for EU reform. First, we all pay pretty much the same as we do now for access to the European market. We continue with free movement and we keep the European Council, but elect its President. This way all the member states can meet up and decide if they want to share any major policies. We all benefit from free movement and we all benefit from a larger free trade area. Second, we get rid of most of the rest of the EU institutions and replace them with an institute a bit like NICE. The European Institute for Policy Excellence (EIPE) would be (hopefully) quite a small department that looks at the best available evidence and then produces guidance on policy. A shorter executive summary would hopefully also be available for everyday people to read, to understand what the policy is about – just as patients can read NICE executive summaries to understand their condition better.
Then any member state could choose to adopt the policy if its parliament thinks it worthwhile. This voluntary opt-in system would mean that states retain control of their laws, would probably adopt the policies voluntarily (eventually), and that European citizens might actually grow to like EU laws if they can be shown to be evidence-based, in the public's best interests and in the control of the public, not just law and red tape imposed from above. The European Union should be a place where our elected officials go to debate and agree policies in the best interests of their electorates. There should therefore be an opt-out from any policy for any member state that does not think it will benefit. The looser union that I would like to see will probably not happen, and I do worry that one day we will wake up in an undemocratic united federal states of Europe, but this worry should not force us to make an irrational choice now. We should not be voting to "leave the EU at all costs"; we should be voting for reform and a better, more co-operative international community. I would not dare suggest who any of you should vote for, but I hope you use your vote for change and reform and not more of the same.
about 8 years ago
It might be difficult to locate the territory of neuropsychiatry and find its niche. This might be an uneasy endeavour, as its two parent branches, neurology and psychiatry, are still viable, and its siblings, organic psychiatry, behavioural neurology and biological psychiatry, are also present. This blog post attempts to search for the definition and domains of neuropsychiatry. Neuropsychiatry can be defined as the 'biologic face' of mental health (Royal Melbourne Hospital, Neuropsychiatry Unit), or as the neurological aspects of psychiatry and the psychiatric aspects of neurology (Pacific Neuropsychiatry Institute). It is not a new term: many physicians branded themselves neuropsychiatrists at the turn of the twentieth century. It has been looked upon with a sense of unease as a hybrid branch, and was subject to pejorative connotations as the provenance of amateurs in both parent disciplines (Lishman, 1987). The foundational claim is that 'all mental disorders are disorders of the brain' (Berrios and Marková, 2002). The American Neuropsychiatric Association (ANPA) defines it as 'the integrated study of psychiatric and neurologic disorders' (ANPA, 2013). An overlap between neuropsychiatry and biological psychiatry has been observed (Trimble and George, 2010), as the domain of enquiry of the first and the approach of the second meet at a point. Berrios and Marková focused on the degree of convergence among biological psychiatry, organic psychiatry, neuropsychiatry and behavioural neurology. They stated that these share the same foundational claims (FCs): (1) mental disorder is a disorder of the brain; (2) reasons are not good enough as causes of mental disorder; and (3) biological psychiatry and its congeners have the patrimony of scientific truth. They further elaborated that the difference among them is primarily due to their different historical origins (D'haenen et al., 2002).
The American Neuropsychiatric Association (ANPA) defines neuropsychiatry, on a clinical level, as the integrative study of neurological and psychiatric disorders; on a theoretical level, ANPA defines it as the bridge between neuroscience and clinical practice. The interrelation between the two specialities is adopted by The Royal Australian and New Zealand College of Psychiatrists, which defines it as a psychiatric subspeciality. This seems to resonate with the concept that the 'biologisation' of psychiatry is inevitable (Sachdev and Mohan, 2013). The definition in the Gale Encyclopedia encompasses the interface between the two disciplines (Fundukian and Wilson, 2008). In order to acknowledge the wide use of the term 'neuropsychiatry', the fourth edition of Lishman's Organic Psychiatry was renamed a 'textbook of neuropsychiatry'; the editor stated that the term is not used in its more restrictive sense (David, 2009). Ostow traced the origin of biological causes for illness back to the humoral view of temperament. In the nineteenth century the differentiation between the two disciplines did not seem to be apparent; the schism seems to have emerged in the twentieth century. The difficulty with such early adoption of a neuronal basis for psychiatric disorders is that it rested on unsubstantiated beliefs and wild logic rather than scientific substance (Panksepp, 2004). Folstein stated that Freud and Charcot postulated psychological and social roots for abnormal behaviours, thus differentiating neurology from psychiatry (David, 2009). The separation may have led to the alienation of doctors in both camps and helped create an arbitrary division in their scope of knowledge and skills. The re-emergence of interest in neuropsychiatry has been attributed to a growing sense of discomfort with the lack of acknowledgment of brain disorders when considering psychiatric symptoms (Arciniegas and Beresford, 2001).
There is considerable blurring in defining the territory and boundaries of neuropsychiatry. The Royal College of Psychiatrists founded its Section of Neuropsychiatry in 2008; its major working groups include epilepsy, sleep disorders, brain injury and complex neurodisability. In 1987 the British Neuropsychiatry Association was established to address the professional need for distinction, without adopting formal affiliation with parent disciplinary bodies such as the Royal College of Psychiatrists. The ANPA was founded in 1988. It issued a training guide for residents that included neurological and psychiatric assessments and the interpretation of EEG and brain imaging techniques. As regards territory, the guide covered delirium, dementia, psychosis, and mood and anxiety disorders due to a general medical condition; the neuropsychiatric aspects of psychopharmacologic treatments, epilepsy, traumatic brain injury and stroke; and the diagnosis of movement disorders, neurobehavioural disorders, demyelinating disease, intellectual and developmental disorders, and sleep disorders. The World Federation of Societies of Biological Psychiatry (WFSBP) was established in Buenos Aires in 1974 to address the rising significance of biological psychiatry and to join local national societies together. The National Institute of Mental Health (NIMH) is currently working on a biologically based diagnostic system that links neural circuits, cells and molecules to behavioural changes. This system, named the Research Domain Criteria (RDoC), is agnostic to the current classification systems DSM-5 and ICD-10, especially as current diagnostic classifications are mostly descriptive rather than based on neurobiological aetiology (Insel et al., 2010).
For example, the ICD-10 designates its first F-code block to organic illness, yet it stops short of localising the cause of illness beyond the common prefix 'organic'. It also addresses adverse drug events such as tardive dyskinesia but stops short of describing their neural correlates. Meanwhile, psychosocial roots of mental illness remain apparent in aetiologically based diagnoses such as post-traumatic stress disorder, acute stress reaction and adjustment disorder, whose diagnostic clusters emphasise the necessity of 'stress'. Other diagnoses draw from the psychodynamic literature, e.g. conversion [dissociative] disorder. The need for neuropsychiatry has been increasing as advances in diagnostic imaging and laboratory investigations have become more clinically relevant. Nowadays there are tests, such as the DaT scan, that can tell the difference between neurocognitive disorder with Lewy bodies and Parkinson's disease. Vascular neurocognitive disorders warrant imaging as the rule rather than the exception, and vascular depression has been addressed as a separate entity. Frontal lobe syndromes have been subdivided into orbitofrontal and dorsolateral (Moore, 2008). Much training is needed to address this subspeciality. The early cases that stirred up interest in the neurological roots of psychiatric disorders can be traced back to Phineas Gage and, later, to H.M. The earlier fruits of adopting a neuropsychiatric perspective can be seen in the writings of Eliot Slater, who attempted to search for the scientific underpinnings of psychiatry and, through seminal articles, helped highlight the organic aspect of psychiatry.
Articles like 'The Diagnosis of "Hysteria"' were way ahead of their time, appearing before CT brain imaging: in it Slater challenged the common wisdom on concepts like hysteria and conversion, rejected the social roots of mental illness, presented a very strong case for the possibility of organicity, and described actual cases for which 'hysteria' was a plain misdiagnosis. Slater even challenged the mere existence of the concept of 'hysteria' (Slater, 1965). Within the same decade Alwyn Lishman published his textbook Organic Psychiatry, addressing the organic aspects of psychiatric disorders. Around the same time, the pioneers of the social/psychological roots of mental illness came under attack. Hans Eysenck published his book Decline and Fall of the Freudian Empire, stating clearly that the case of Anna O. seems to have been misrepresented: she never had 'hysteria', nor did she recover; she actually had tuberculous meningitis and died of its complications (Eysenck, 1986). To summarise, it seems difficult, and maybe futile, to sharply delineate neuropsychiatry, biological psychiatry, organic psychiatry and behavioural neurology. However, it seems important to learn biological psychiatry as an approach and to practise neuropsychiatry as a subspeciality. The territory is as yet unclear, ranging from gross organic lesions such as stroke to the potential of encompassing the whole of psychiatry as the arbitrary distinction between 'functional' and 'organic' fades away. Perhaps practice will help shape the domain of the speciality, and imaging will guide it. To date, the number of postgraduate programmes is still low in comparison with the need for such a speciality; more board certification may be needed, alongside the currently emerging masters and doctoral degrees.
This post was previously published on the BMJ doc2doc blogs.
Bibliography
Eysenck, H.J., Decline and Fall of the Freudian Empire, Pelican Series, 1986
Berrios, G.E. and Marková, I.S., The concept of neuropsychiatry: a historical overview, Journal of Psychosomatic Research, 2002, Vol. 53, pp. 629-638
O'Driscoll, K. and Leach, J.P., "No longer Gage": an iron bar through the head, British Medical Journal, 1998, Vol. 317, pp. 1637-1638
Sachdev, P.S. and Mohan, A., Neuropsychiatry: Where Are We And Where Do We Go From Here?, Mens Sana Monographs, 2013, Vol. 11(1), pp. 4-15
Slater, E., The Diagnosis of "Hysteria", British Medical Journal, 1965, Vol. 5447(1), pp. 1395-1399
Insel, T., Cuthbert, B., et al., Research Domain Criteria (RDoC): Toward a New Classification Framework for Research on Mental Disorders, American Journal of Psychiatry, 2010, Vol. 167(7), pp. 748-751
David, A.S., Fleminger, S., et al. (eds), Lishman's Organic Psychiatry, Wiley-Blackwell, 2009
Arciniegas, D.B. and Beresford, T.P. (eds), Neuropsychiatry: An Introductory Approach, Cambridge University Press, 2001
D'haenen, H., den Boer, J.A., et al. (eds), Biological Psychiatry, John Wiley and Sons, 2010
Fundukian, L.J. and Wilson, J. (eds), Gale Encyclopedia of Mental Health, Thomson Gale, 2008
Trimble, M. and George, M. (eds), Biological Psychiatry, Wiley-Blackwell, 2010
Moore, D.P. (ed.), Textbook of Neuropsychiatry, Hodder Arnold, 2008
Panksepp, J. (ed.), Textbook of Biological Psychiatry, John Wiley and Sons, 2004
The American Neuropsychiatric Association website: www.anpaonline.org
The Royal Melbourne Neuropsychiatry Unit website: http://www.neuropsychiatry.org.au/
The British Neuropsychiatry Association website: www.bnpa.org.uk
The Royal College of Psychiatrists website: www.rcpsych.ac.uk
The World Federation of Societies of Biological Psychiatry website: www.wfsbp.org
Dr Emad Sidhom
almost 8 years ago
The largest and longest bone in the human body is the femur, and it is located in the upper leg. The femur connects to the knee at one end and fits into the hip socket at the...
over 6 years ago
Complementary medicine (CAM) is controversial, especially when it is offered by the NHS! You only have to read the recent health section of the Telegraph to see Max Pemberton and James LeFanu exchanging strong opinions. Most of the 'therapies' available on the market have little to no evidence base to support their use and yet I believe CAM has an important role to play in modern medicine. I believe that CAM is useful not because of any voodoo magic water or because the soul of a tiger lives on in the dust of one of its claws, but because modern medicine hasn't tested EVERYTHING yet and because EVERY DOCTOR should be allowed to use a sugar pill or magic water to ease the anguish of the worried well every now and again. The placebo effect is powerful and could be used to help a lot of patients as well as save the NHS a lot of money. I visited my grandfather for a cup of coffee today. As old people tend to do, we discussed his life, his life lessons and his health. My grandfather is 80-something years old and worked as a collier underground for about 25 years before rising up through the ranks of management. In his entire life he has been to hospital twice: once to have his tonsils removed and once to have a TKR (total knee replacement). My granddad maintains that the secret of his good health is good food, plenty of exercise, keeping his mind active and one dried ivy berry every month! He takes the dried ivy berries because a gypsy once told his father that doing so would prevent infection of open wounds, common injuries in those working underground. It is my granddad's firm belief that the ivy berries have kept him healthy over the past 60 years, despite significant drinking and a 40-pack-year smoking history! My grandfather is the only person I know who takes this quite bizarre and potentially dangerous CAM, but he has done so for over half a century now and has suffered no adverse effects (that we can tell, anyway)!
This has led me to think about the origin of medicine and the evolution of modern medicine from ancient treatments. Long ago, medicine meant 'take this berry and see what happens'. Today, medicine means 'take this drug (or several drugs) and see what happens, except we'll write it down if it all goes wrong'. Just as evidence for modern therapies has been established, is there any known evidence for the ivy berry, and what else is it used for? My grandfather gave me a second piece of practical advice this afternoon, in relation to the treatment of open wounds: to stop bleeding, cover the wound in a bundle of spiders' webs. You can collect webs by wrapping them around a stick, then slide the bundle of webs off the stick onto the wound and hold it in place. If the wound is quite deep, then cover it in ground white pepper. I have no idea whether these two tips actually work, but they reminded me of QuikClot (http://www.z-medica.com/healthcare/About-Us/QuikClot-Product-History.aspx), a powder that the British Army currently issues to all its frontline troops for the treatment of wounds. The powder is poured into the wound, where it forms a synthetic clot, reducing blood loss. This technology has been a life-saver in Afghanistan but is relatively expensive. Supposing that crushed white pepper had similar properties, wouldn't that be cheaper? While I appreciate that the two are unlikely to have the same level of efficacy, I am merely suggesting that we do not necessarily dismiss old layman's practices without a little investigation. I intend to go and do a few searches on PubMed and Google, but I just thought I'd put this in the public domain and see if anyone has any corroborating stories. If your grandparents have any rather strange but potentially useful health tips, I'd be interested in hearing them. You never know, they may just be the treatments of the future!
about 9 years ago
A summary of the role and composition of normal flora, the typical bacterial pathogens causing several common infectious diseases, diagnosis of UTI, and interpretation as to whether a positive blood culture represents true infection or contamination. Bonus points to anyone who can identify the mystery portrait.
over 8 years ago
Going to work in a different country? Different culture? Different language? Avoid getting tripped up as I did!
I grew up in Belgium and went to medical school in Louvain, Belgium. I came to the USA for my internship and selected a small hospital in upstate New York. What an initial culture shock that was! The first problem was the language. I knew enough "school" English to get by, or so I thought. Talking on the phone was the hardest. Initially, the nurses in the hospital thought that I was the most conscientious intern they had ever worked with. When I was on duty and the nurses called me on the phone at night, I would always go to the ward, look over the chart, see the patient and then write a note and orders, rather than just handle things over the phone like all the other interns did when called for rather minor matters. Little did the nurses realize that the reason I would get up in the middle of the night and physically go to the ward was due to the fact that I had no idea what they were talking about. I did not understand a word of what the nurses were telling or asking me on the telephone, especially not when they were using even common American abbreviations, like PRN, QID, LMP etc. [PRN (Latin) means as needed; QID (Latin) means four times a day and LMP means last menstrual period]. That problem rapidly resolved as I began to understand more and more of the English medical terms. However, there is a major difference between understanding day-to-day common English and grasping all the idioms and sayings. A rather amusing anecdote will illustrate that. About two months into my internship, I was on call at night when one of the nurses telephoned me in the early evening. A patient (Mrs X) was having a bad headache and wanted something for it. I was proud that I had understood the problem over the phone and was even more proud that I managed to order something for her headache without having to walk over to the ward. An hour or so later, the same nurse called me for the same patient because she had been constipated and wanted something for it. 
Again I understood, and again I was able to prescribe a laxative over the phone without having to go and see the patient. A while later the same nurse called to let me know that Mrs X was agitated and wanted something for sleep. I understood again and prescribed a sleeping pill. Close to the 11pm shift change the same nurse called me once more: "Dr. LeMaire, I am so sorry to keep bothering you about Mrs X, but she is really a pain in the neck…" Immediately a horrible thought occurred to me. Here is a patient who has a bad headache, is constipated and agitated, and now has a pain in her neck. These could all be symptoms of meningitis, and here I had been ordering medications over the phone for a potentially serious condition. I broke out in a cold sweat and told the nurse "I am coming." I ran over to the ward where the patient was hospitalized, went to her room and, after introducing myself, said "Mrs. X, the nurse tells me that you have a pain in your neck." The rest is history. The patient lodged a complaint about the nurse and me, but we both got off with a minor reprimand and in fact somewhat of a chuckle from the administrator handling the complaint. Such tripping up by idioms and sayings can of course happen in any language. Be aware! Dr. William LeMaire
DR William LeMaire
almost 7 years ago
Introduction Nutrition during adolescence is of great importance for quality of life and school performance. In our society, eating habits are changing due to socio-cultural and family factors, new ideas about self-image and a global food culture. Data published in the National Health Survey (ENSANUT 2006) suggest that excess weight in adolescents between 12 and 19 years has a national average of 31.9% in Mexico: one in three teens in our country carries excessive weight. Several studies suggest that foods of high energy density and large food portions increase energy consumption and hence weight gain. Bottled soft drinks as currently consumed contribute to the epidemics of obesity and type 2 diabetes in Mexico; the energy intake from these drinks represents 21% of total energy consumption among Mexican adolescents and, added to the energy from the rest of the diet, contributes to these epidemics. Therefore, the adoption of healthy food and drink consumption patterns among adolescents should be a priority for the population, since a well-balanced diet positively affects the physical and intellectual development of adolescents.
over 12 years ago