Research Article: Practice Perspectives

Neuroradiology Critical Findings Lists: Survey of Neuroradiology Training Programs

L.S. Babiarz, S. Trotter, V.G. Viertel, P. Nagy, J.S. Lewin and D.M. Yousem
American Journal of Neuroradiology April 2013, 34 (4) 735-739; DOI: https://doi.org/10.3174/ajnr.A3300
All authors: From the Russell H. Morgan Department of Radiology and Radiological Sciences, Johns Hopkins Medical Institutions, Baltimore, Maryland.

Abstract

BACKGROUND AND PURPOSE: The Joint Commission has identified timely reporting of critical results as one of the National Patient Safety Goals. We surveyed directors of neuroradiology fellowships to assess and compare critical findings lists across programs.

MATERIALS AND METHODS: A 3-question survey was e-mailed to directors of neuroradiology fellowships with the following questions: 1) Do you currently have a “critical findings” list that you abide by in your neuroradiology division? 2) How is that list distributed to your residents and fellows for implementation, if at all? and 3) Was this list vetted by neurology, neurosurgery, and otolaryngology departments? Programs with CF lists were asked for a copy of the list. Summary and comparative statistics were calculated.

RESULTS: Fifty-one of 89 (57.3%) programs responded. Twenty-one of 51 (41.2%) programs had CF lists. Lists were distributed during orientation, via Web sites and e-mails, and by posting in work areas. Eleven of 21 lists were developed internally, and 5 of 21 with input from other departments. The origin of 5 of 21 lists was unknown. Forty CF entities were seen in 20 submitted lists (mean, 9.1; range, 2–23). The most frequent entities were the following: cerebral hemorrhage (18 of 20 lists), acute stroke (15 of 20), spinal cord compression (15 of 20), brain herniation (12 of 20), and spinal fracture/instability (12 of 20). Programs with no CF lists called clinicians on the basis of "common sense" and "clinical judgment."

CONCLUSIONS: Fewer than half (41.2%) of the neuroradiology fellowship directors who responded have implemented CF lists. CF lists vary in length and content and are predominantly developed by radiology departments without external input.

ABBREVIATIONS:

ASNR = American Society of Neuroradiology
CF = critical findings

Patient safety and elimination of preventable medical errors continue to be major issues in health care and in radiology. Since the Institute of Medicine published its report To Err Is Human: Building a Safer Health System, in which as many as 98,000 annual patient deaths were attributed to medical errors, patient safety initiatives have been implemented across the country with the goal of designing better systems and public health processes.1–4 Ineffective communication between health care providers has been identified as one of the major culprits in poor outcomes.5–7 In 2011, the Joint Commission added a new National Patient Safety Goal, specific for communication in radiology, requiring "report[ing] critical results of tests and diagnostic procedures on a timely basis."8 Failure to communicate critical radiographic findings in a timely manner can contribute to significant patient mortality and morbidity and is often the subject of medical malpractice claims against radiologists.9–11 The Practice Guideline for Communication of Diagnostic Imaging Findings released by the American College of Radiology also addresses the issue of effective communication in radiology, emphasizing the need for the following: 1) "tailored" timeliness, 2) satisfactory communication between the radiologist and the referring provider, and 3) minimization of communication errors.12 To improve patient safety and prevent adverse events, organizations have developed algorithmic approaches to reporting and communicating critical radiographic findings based on lists of critical findings.7,13

We conducted a 3-question survey among the directors of neuroradiology fellowship training programs to assess and compare the use of critical findings lists in academic neuroradiology across the country. Our goal was to determine the following: 1) how many programs have official critical findings lists and what pathology is included in such lists, 2) how critical findings lists are distributed to the fellows and residents, and 3) whether such lists were developed with input from clinical services. Because the Joint Commission only recently added a patient safety goal specific to communication of findings in radiology, we hypothesized that more programs lack critical findings lists than have them.

Materials and Methods

The e-mail addresses of neuroradiology fellowship training program directors in the United States and Canada were obtained from the American Society of Neuroradiology Web site. Each fellowship program director listed by the ASNR was contacted twice in the fall of 2011 (October and November) and then again in the spring of 2012 (April and May) and was asked to voluntarily answer a 3-question survey regarding critical findings lists at their institution. The 3-question survey asked the following: 1) Do you currently have a "critical findings" list that you abide by in your neuroradiology division? 2) How is that list distributed to your residents and fellows for implementation, if at all? and 3) Was this list vetted by neurology, neurosurgery, and otolaryngology departments? A copy of this e-mail-based survey is depicted in Fig 1. Programs with critical findings lists received additional e-mail follow-up communication to solicit a copy of the institution's critical findings list.

Subject: Critical Findings

Colleagues:

I am currently authoring a manuscript on compliance with our “Neurosurgery-approved” critical findings list of abnormalities for which we must immediately contact our clinicians.

I wanted to ask you:

  1. Do you currently have a “critical findings” list that you abide by in your neuroradiology division?

  2. How is that distributed to your residents and fellows, if at all?

  3. Was it vetted by your Neurology, Neurosurgery, ENT departments?

Dave Yousem

Fig. 1.

Three-question survey e-mailed to neuroradiology fellowship program directors.

All e-mail communication with program directors was saved, and summary and comparative statistics were computed for all responses and submitted lists. Programs whose fellowship directors did not complete the survey were excluded from the analysis. We compared the number of programs with and without lists, the method of distribution of the lists to the fellows and residents, and whether the lists were developed with or without input from nonradiologist referring physicians. For programs that e-mailed their critical findings lists, we analyzed the contents of the lists, looking for similarities and differences among the pathologies/entities included.

Given the small sample size and wide variability in the responses received, we focused on descriptive analyses and did not perform statistical tests to assess significant difference.

Results

Survey Results

Fifty-one of 89 (57.3%) neuroradiology fellowship program directors listed on the ASNR Web site responded to the survey. Twenty-one of 51 (41.2%) programs that responded had critical findings lists, and 30 of 51 (58.8%) did not. Following additional communication via e-mail, 20 of the 21 (95.2%) programs with critical findings lists sent their lists. The overall participation in the survey is depicted in Fig 2.

Fig. 2.

Neuroradiology fellowship program directors' responses to the e-mail-based survey concerning critical findings lists used in neuroradiology.

Surveyed neuroradiology fellowship program directors distribute their critical findings lists to their fellows and residents in a number of ways. Ten of 21 (47.6%) distribute their lists during orientation/first week of the rotation, 9 of 21 (42.9%) post lists in work areas, 8 of 21 (38.1%) have lists available on a departmental/divisional Web site, 5 of 21 (23.8%) disseminate lists through e-mails, 4 of 21 (19.0%) include lists in their policy booklets, 2 of 21 (9.5%) present lists during conferences, and 1 of 21 (4.8%) allows access to the critical findings list via an electronic system.

Of the programs that responded to the survey and that have critical findings lists, 3 of 21 (14.3%) vetted the entities on the list with neurology, neurosurgery, or otolaryngology. Two of 21 (9.5%) program lists were drafted by a hospital-wide committee with representatives from all departments, including neurosciences and non-neurosciences. Eleven of 21 (52.4%) lists were composed internally by radiology departments without input from the relevant clinical departments. The origin of the remaining 5 of 21 (23.8%) lists was not clear to the surveyed program directors.

Critical Findings Lists

Forty different critical finding entities were seen in 20 submitted neuroradiology critical findings lists (Table). List lengths ranged from 2 to 23 items and averaged 9.1 items (SD = 5.9; median = 8.5). The top 5 most frequent entities included cerebral hemorrhage (present on 18 of 20 or 90% of the lists), acute stroke (15 of 20 or 75%), spinal cord compression (15 of 20 or 75%), brain herniation (12 of 20 or 60%), and spinal fracture/spinal instability (12 of 20 or 60%). Three of 20 (15%) lists included the “anything clinically important” category.
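The descriptive statistics reported above (mean, SD, and median list length) can be reproduced with a short script. The list lengths below are hypothetical placeholders for illustration only; the actual lengths of the 20 submitted lists are not published here.

```python
import statistics

def summarize(list_lengths):
    """Return (mean, sample SD, median) for a set of CF-list lengths."""
    return (
        statistics.mean(list_lengths),
        statistics.stdev(list_lengths),  # sample SD, as reported in the text
        statistics.median(list_lengths),
    )

# Hypothetical list lengths for illustration only (not the study's data).
lengths = [2, 5, 8, 9, 12, 23]
mean, sd, median = summarize(lengths)
print(f"mean={mean:.1f}, SD={sd:.1f}, median={median}")
```

Applied to the real 20 list lengths, this computation would yield the reported mean of 9.1, SD of 5.9, and median of 8.5.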


Forty different critical finding entities seen in 20 submitted neuroradiology critical findings lists

At some institutions, “cerebral hemorrhage” appeared as a single entry on the critical findings list and was implicitly or explicitly inclusive of epidural, subdural, subarachnoid, parenchymal, and intraventricular hemorrhage. At other institutions, however, the critical findings lists included a combination of separate entries for the different anatomic locations of acute blood products (4 of 20 programs or 20%). Two of 20 programs (10%) had separate critical findings items for cerebral hemorrhage and active bleeding.
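Tallying entity frequencies across lists of this kind requires normalizing synonymous or overlapping entries (eg, hemorrhage = hematoma = bleed, as described in the limitations) before counting. The sketch below illustrates one way to do this; the synonym map and sample lists are hypothetical, not the submitted data.

```python
from collections import Counter

# Hypothetical synonym map: variant term -> canonical entity name.
CANONICAL = {
    "hematoma": "cerebral hemorrhage",
    "bleed": "cerebral hemorrhage",
    "spinal fracture": "spinal fracture/instability",
    "spinal instability": "spinal fracture/instability",
}

def tally(lists_of_entities):
    """Count, per canonical entity, how many lists include it."""
    counts = Counter()
    for entities in lists_of_entities:
        # Normalize each entry, then deduplicate within the list so that
        # each program's list counts a given entity at most once.
        canon = {CANONICAL.get(e.lower(), e.lower()) for e in entities}
        counts.update(canon)
    return counts

# Two hypothetical CF lists for illustration.
sample_lists = [
    ["Cerebral hemorrhage", "Acute stroke", "Spinal fracture"],
    ["Hematoma", "Spinal instability", "Acute stroke"],
]
print(tally(sample_lists).most_common(3))
```

Deduplicating within each list before counting is what makes the result a "number of lists containing the entity," matching the 18 of 20, 15 of 20, etc. figures above.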

Neuroradiology fellowship training programs with no critical findings lists make the decision to communicate with the referring physicians regarding a finding on the basis of “common sense,” “clinical judgment,” or “word of mouth.” Some of the surveyed programs communicate all positive findings for studies ordered as urgent and/or “stat.”

All program directors with or without critical findings lists say they document their communication with the referring providers or appropriate clinical teams in their official reports.

Discussion

Fewer than half (21 of 51) of the neuroradiology fellowship program directors who responded to our survey have critical findings lists in their divisions. There is great variability in the length and content of the lists, with cerebral hemorrhage, acute stroke, spinal cord compression, brain herniation, and spinal fracture/instability being the 5 most frequently included entities. Programs disseminate their critical findings lists in a variety of ways, most commonly by presenting them during orientation activities, by posting them near the reading stations, and by promulgating them through Web sites and e-mails. Most neuroradiology critical findings lists are created internally by radiology departments and neuroradiology divisions without input from referring providers such as neurology, neurosurgery, or otolaryngology departments.

Our limited communication revealed that training programs with no critical findings lists contact the referring physicians to discuss radiographic findings on the basis of "common sense," "clinical judgment," or "word of mouth." Most interestingly, 3 of 20 programs that disclosed their critical findings lists include an "anything clinically important" category in their official lists. Adding this category not only can potentially shorten a list but can also serve as a reminder that no list can ever be complete and that there will be instances warranting direct communication for pathology not detailed in the official list. All programs with and without critical findings lists document their communication with the referring providers or appropriate clinical teams in their official reports.

By reporting on existing neuroradiology critical findings lists, we aimed to facilitate conversations and ongoing effort within neuroradiology divisions across the country in developing such lists and communication standards that meet the Joint Commission mandates, with the hope of improving patient safety. We did not aim to devise a one-size-fits-all, all-encompassing critical findings list that must be adopted by all institutions. In our opinion, given the practice complexities and diverse workflows of health care centers throughout the country, each institution is uniquely suited to develop its own communication standards to care most effectively for the patient population it serves. Some institutions may decide to approve long, detailed lists, and others may favor brevity and depend more on the clinical experience and judgment of the reporting neuroradiologist. A critical findings list is simply a tool meant to facilitate clinical decision-making and interdepartmental communication, not an absolute guideline in and of itself.

Some institutions have already adopted systems for keeping track of and reporting critical findings.7,13 Anthony et al13 reported on their 4-year experience with a departmental policy that defined critical findings as those that can cause mortality or significant morbidity, as well as significant discrepancies between the preliminary and final interpretations. The authors stratified the severity of the critical findings on the basis of the perceived risk to the patient, which determined the mode and timing of communication, and introduced an "escalation process" to ensure timely communication that starts with the referring/covering physician and escalates to the attending physician, chief of service, department chair, and chief medical officer. Adherence to this policy increased with time, from 28.6% in 2006 to 90.4% in 2010, in a case mix in which approximately 10% of studies contained critical findings. Failure to properly document the communication and failure to communicate the findings in the predetermined time frame were the 2 main causes of noncompliance.13

Other authors have written extensively on the relationship between clarity and effectiveness of physician-to-physician and physician-to-patient communication and patient safety.5,6,9⇓–11,14 From the medical-legal point of view, as many as 80% of medical malpractice cases may result from or at least involve a breakdown in communication.10,15,16 Both the American College of Radiology and the European Association of Radiology recognize the radiologist's duty to communicate effectively and in a timely fashion and to limit communication errors.10

Critical findings lists can also have unexpected medicolegal and patient health consequences. On the basis of the clinical context, a radiologist may use his or her clinical judgment and decide not to call about a critical finding included in the official list. If that specific finding is later linked to significant patient morbidity or mortality, the radiologist could face an indefensible legal battle. At the other extreme, an entity misreported as a critical finding on a study may prompt further clinical work-up. If such a work-up results in a complication leading to patient morbidity, the critical findings list, designed to improve patient safety, could inadvertently become the cause of a poor outcome.

Our study has a number of limitations. Similar to other studies based on voluntary surveys, ours had a low participation rate: only 57.3% (51 of 89) of neuroradiology fellowship directors responded to our survey, though of those whose programs use critical findings lists, 95.2% (20 of 21) submitted their lists for analysis. The ASNR Web site, from which we obtained the list of current fellowship directors, may not have had the most up-to-date information, including e-mail addresses. Our study sampled academic medical centers with neuroradiology fellowship programs but did not look at the practices of radiology groups or academic medical centers without neuroradiology fellowships. When composing an all-inclusive list of critical findings entities, we used our clinical discretion and experience in the field to merge some of the pathology categories (eg, hemorrhage = hematoma = bleed). For example, we combined spinal fracture with spinal instability and arteriovenous malformation with aneurysm.

The goal of our study was neither to devise an all-encompassing critical findings list nor to propose a standardized approach for reporting of clinically significant radiographic findings. By sharing our findings, we hope to facilitate the ongoing conversation and effort within neuroradiology divisions across the country in developing critical findings lists and communication standards leading to improvement in patient safety.

Conclusions

We conducted a 3-question survey among the directors of neuroradiology fellowship training programs to assess and compare critical findings lists across programs. Of the neuroradiology programs that responded, fewer than half (41.2%) had critical findings lists, which they disseminated during orientation, via Web sites and e-mails, and by posting at workstations. Most neuroradiology critical findings lists were created by radiology departments without input from neurology, neurosurgery, or otolaryngology departments. There was great variability in the length and content of the lists, with the most common entities being cerebral hemorrhage, acute stroke, spinal cord compression, brain herniation, and spinal fracture/instability. Training programs with no critical findings lists contacted the referring physicians and discussed radiographic findings on the basis of "common sense," "clinical judgment," or "word of mouth."

Footnotes

  • Disclosures: Paul Nagy—UNRELATED: Payment for Lectures (including service on Speakers Bureaus): Thomas Jefferson University Grand Rounds.

  • Paper previously presented at: 50th Annual Meeting of the American Society of Neuroradiology and the Foundation of the ASNR Symposium, April 21–26, 2012; New York, New York.

References

  1. Kohn LT, Corrigan JM, Donaldson MS, eds, for the Committee on Quality of Health Care in America, Institute of Medicine. To Err Is Human: Building a Safer Health System. Washington, DC: National Academies Press; 1999
  2. Thrall JH. Quality and safety revolution in health care. Radiology 2004;233:3–6
  3. Choksi VR, Marn C, Piotrowski MM, et al. Illustrating the root-cause-analysis process: creation of a safety net with a semiautomated process for the notification of critical findings in diagnostic imaging. J Am Coll Radiol 2005;2:768–76
  4. Johnson CD, Miranda R, Aakre KT, et al. Process improvement: what is it, why is it important, and how is it done? AJR Am J Roentgenol 2010;194:461–68
  5. Greenberg CC, Regenbogen SE, Studdert DM, et al. Patterns of communication breakdowns resulting in injury to surgical patients. J Am Coll Surg 2007;204:533–40
  6. Dunn AS, Markoff B. Physician-physician communication: what's the hang-up? J Gen Intern Med 2009;24:437–39
  7. Towbin AJ, Hall S, Moskovitz J, et al. Creating a comprehensive customer service program to help convey critical and acute results of radiology studies. AJR Am J Roentgenol 2011;196:W48–51
  8. The Joint Commission. National Patient Safety Goals. http://www.jointcommission.org/assets/1/18/2011–2012_npsg_presentation_final_8–4-11.pdf. Accessed August 30, 2012
  9. Berlin L. Communicating findings of radiologic examinations: whither goest the radiologist's duty? AJR Am J Roentgenol 2002;178:809–15
  10. Garvey CJ, Connolly S. Radiology reporting: where does the radiologist's duty end? Lancet 2006;367:443–45
  11. Berlin L. Communicating results of all radiologic examinations directly to patients: has the time come? AJR Am J Roentgenol 2007;189:1275–82
  12. American College of Radiology. ACR Practice Guideline for Communication of Diagnostic Imaging Findings. http://www.acr.org/∼/media/C5D1443C9EA4424AA12477D1AD1D927D.pdf. Accessed January 21, 2012
  13. Anthony SG, Prevedello LM, Damiano MM, et al. Impact of a 4-year quality improvement initiative to improve communication of critical imaging test results. Radiology 2011;259:802–07
  14. Donnelly LF, Strife JL. Establishing a program to promote professionalism and effective communication in radiology. Radiology 2006;238:773–79
  15. Levinson W. Physician-patient communication: a key to malpractice prevention. JAMA 1994;272:1619–20
  16. Brenner RJ, Lucey LL, Smith JJ, et al. Radiology and medical malpractice claims: a report on the practice standards claims survey of the Physician Insurers Association of America and the American College of Radiology. AJR Am J Roentgenol 1998;171:19–22
  • Received July 4, 2012.
  • Accepted after revision July 5, 2012.
  • © 2013 by American Journal of Neuroradiology