American Journal of Neuroradiology

Research Article: Ultra-High-Field MRI/Imaging of Epilepsy/Demyelinating Diseases/Inflammation/Infection

Multicenter Automated Central Vein Sign Detection Performs as Well as Manual Assessment for the Diagnosis of Multiple Sclerosis

A.R. Manning, V. Letchuman, M.L. Martin, E. Gombos, T. Robert-Fitzgerald, Q. Cao, P. Raza, C.M. O’Donnell, B. Renner, L. Daboul, P. Rodrigues, M. Ramos, J. Derbyshire, C. Azevedo, A. Bar-Or, E. Caverzasi, P.A. Calabresi, B.A.C. Cree, L. Freeman, R.G. Henry, E.E. Longbrake, J. Oh, N. Papinutto, D. Pelletier, R.D. Samudralwar, S. Suthiphosuwan, M.K. Schindler, M. Bilello, J.W. Song, E.S. Sotirchos, N.L. Sicotte, O. Al-Louzi, A.J. Solomon, D.S. Reich, D. Ontaneda, P. Sati, R.T. Shinohara and the NAIMS Cooperative
American Journal of Neuroradiology March 2025, 46 (3) 620-626; DOI: https://doi.org/10.3174/ajnr.A8510
Author Affiliations

aFrom the Penn Statistics in Imaging and Visualization Center (A.R.M., M.L.M., T.R.-F., Q.C., C.M.O., R.T.S.), Department of Biostatistics, Epidemiology, and Informatics, University of Pennsylvania Perelman School of Medicine, Philadelphia, Pennsylvania
bTranslational Neuroradiology Section (V.L., L.D., O.A.-L., D.S.R., P.S.), National Institute of Neurological Disorders and Stroke, National Institutes of Health, Bethesda, Maryland
cDepartment of Neurology (E.G., B.R., N.L.S., O.A.-L., P.S.), Cedars-Sinai Medical Center, Los Angeles, California
dCleveland Clinic Lerner College of Medicine (P. Raza, L.D.), Cleveland, Ohio
eQMENTA Inc. (P. Rodrigues, M.R.), Boston, Massachusetts
fSection on Functional Imaging Methods (J.D.), Laboratory of Brain and Cognition, National Institute of Mental Health, National Institutes of Health, Bethesda, Maryland
gDepartment of Neurology (C.A., D.P.), University of Southern California, Los Angeles, California
hDepartment of Neurology (A.B.-O., R.D.S., M.K.S.), University of Pennsylvania Perelman School of Medicine, Philadelphia, Pennsylvania
iUCSF Weill Institute for Neurosciences (E.C., B.A.C.C., R.G.H., N.P.), Department of Neurology, University of California San Francisco, San Francisco, California
jDepartment of Brain and Behavioral Sciences (E.C.), University of Pavia, Pavia, Italy; Neuroradiology Department, Advanced Imaging and Radiomics Center (E.C.), IRCCS Mondino Foundation, Pavia, Italy
kDepartment of Neurology (P.A.C., E.S.S.), Johns Hopkins University, Baltimore, Maryland
lDepartment of Neurology, Dell Medical School (L.F.), The University of Texas at Austin, Austin, Texas
mDepartment of Neurology (E.E.L.), Yale University, New Haven, Connecticut
nDivision of Neurology (J.O.), Department of Medicine, St. Michael’s Hospital, University of Toronto, Toronto, Canada
oDepartment of Medical Imaging (S.S.), St. Michael’s Hospital, University of Toronto, Toronto, Canada
pDepartment of Radiology (M.B., J.W.S.), University of Pennsylvania Perelman School of Medicine, Philadelphia, Pennsylvania
qDepartment of Neurological Sciences (A.J.S.), Larner College of Medicine at the University of Vermont, Burlington, Vermont
rMellen Center for Multiple Sclerosis (D.O.), Cleveland Clinic, Cleveland, Ohio
sCenter for Biomedical Image Computing and Analytics (R.T.S.), University of Pennsylvania Perelman School of Medicine, Philadelphia, Pennsylvania

Abstract

BACKGROUND AND PURPOSE: The central vein sign (CVS) is a proposed diagnostic imaging biomarker for multiple sclerosis (MS). The proportion of white matter lesions exhibiting the CVS (CVS+) is higher in patients with MS compared with its radiologic mimics. Evaluation for CVS+ lesions in prior studies has been performed by manual rating, an approach that is time-consuming and has variable interrater reliability. Accurate automated methods would facilitate efficient assessment for CVS. The objective of this study was to compare the performance of an automated CVS detection method with manual rating for the diagnosis of MS.

MATERIALS AND METHODS: 3T MRI was acquired in 86 participants undergoing evaluation for MS in a 9-site multicenter study. Participants presented with either typical or atypical clinical syndromes for MS. An automated CVS detection method was employed and compared with manual rating, including total CVS+ proportion and a simplified counting method in which experts visually identified up to 6 CVS+ lesions by using FLAIR* contrast (a voxelwise product of T2 FLAIR and postcontrast T2*-EPI).

RESULTS: Automated CVS processing was completed in 79 of 86 participants (91%), of whom 28 (35%) fulfilled the 2017 McDonald criteria at the time of imaging. The area under the receiver operating characteristic curve (AUC) for discrimination between participants with and without MS for the automated CVS approach was 0.78 (95% CI: [0.67,0.88]). This was not significantly different from the simplified manual counting method (select6*) (0.80 [0.69,0.91]) or manual assessment of total CVS+ proportion (0.89 [0.82,0.96]). In a sensitivity analysis excluding 11 participants whose MRI exhibited motion artifact, the AUC for the automated method was 0.81 [0.70,0.91], which was not significantly different from that for select6* (0.79 [0.68,0.92]) or manual assessment of total CVS+ proportion (0.89 [0.81,0.97]).

CONCLUSIONS: Automated CVS assessment was comparable to manual CVS scoring for differentiating patients with MS from those with other diagnoses. Large, prospective, multicenter studies utilizing automated methods and enrolling the breadth of disorders referred for suspicion of MS are needed to determine optimal approaches for clinical implementation of an automated CVS detection method.

ABBREVIATIONS:

AUC = area under the curve; CVS = central vein sign; EDSS = Expanded Disability Status Scale; GBCA = gadolinium-based contrast agent; IRR = interrater reliability; MIMoSA = Method for InterModal Segmentation Analysis; NAIMS = North American Imaging in MS Cooperative; ROC = receiver operating characteristic; SD = standard deviation; WML = white matter lesion

MS is a chronic neuroinflammatory disease that presents with demyelinating lesions of the central nervous system. MS is often considered in the differential diagnosis in patients with neurologic symptoms and MRI white matter lesions (WMLs), yet can be difficult to distinguish from other white matter disease mimics.1 Lesions that are caused by microvascular ischemia, for instance, may be difficult to differentiate from those caused by MS, as current MRI methods lack diagnostic specificity. The long-standing difficulty in the clinical and radiologic diagnostic differentiation of MS results in diagnostic delay and misdiagnosis associated with unnecessary risk of morbidity and disability.2 For these reasons, accurate diagnostic biomarkers for MS, such as the central vein sign (CVS), an emerging imaging biomarker reflecting the perivenular relationship of lesions that have long been associated with MS,3 are of key interest.

The proportion of WMLs exhibiting the CVS (CVS+) is higher in MS compared with its radiologic mimics.4 To date, many studies have evaluated the proportion of total CVS+ lesions to distinguish between MS and non-MS diagnoses.5-7 Of note, a meta-analysis including articles assessing the CVS on T2*-weighted images and individual patient data from 501 patients with MS reported that the incidence of CVS at the individual lesion level per patient was 74%, with a sensitivity and specificity of 95% and 97%, respectively.6 Common approaches used to evaluate proportions of total CVS+ lesions are limited by the time constraints associated with manual assessment of every lesion for CVS and by interrater variability.8-10 To alleviate these issues, several simplified approaches incorporating more limited manual CVS assessments have been proposed to facilitate CVS adjudication in practice. Among these alternate methods is select6*: the identification of at least 6 CVS+ lesions by using FLAIR* contrast (a voxelwise product of T2 FLAIR and postcontrast T2*-EPI).11,12 Although this method has proved both sensitive and specific in cross-sectional studies,13 such studies are not without limitations: most have been single-center and have included participants with established diagnoses of MS of varying duration.
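For concreteness, the two manual decision rules described above can be sketched as simple predicates. The 6-lesion cutoff follows the select6* description in the text; the 40% CVS+ proportion threshold is an assumed, illustrative value, not one taken from this study.

```python
def select6_star(cvs_positive_count: int) -> bool:
    """select6*: positive when at least 6 CVS+ lesions are identified."""
    return cvs_positive_count >= 6


def proportion_rule(cvs_positive: int, total_lesions: int,
                    threshold: float = 0.40) -> bool:
    """Total-proportion rule: positive when the CVS+ fraction of all
    assessed WMLs meets an (assumed) threshold."""
    if total_lesions == 0:
        return False
    return cvs_positive / total_lesions >= threshold
```

In practice the proportion rule requires rating every lesion, whereas select6* stops once 6 CVS+ lesions are found, which is what makes it faster to apply.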

In this study, we included a cohort of participants referred to 9 MS specialty centers for suspicion of MS. We retrospectively assessed the performance of a fully automated and publicly available method for automated assessment of the CVS in comparison to manual rating, including total CVS+ proportion and a simplified counting method (select6*), in which experts visually identified up to 6 CVS+ lesions on FLAIR* imaging.14 This retrospective analysis was conducted on a prospectively recruited and imaged cohort. We hypothesized that automated algorithms may be sensitive, specific, and time-efficient for CVS assessment and may be of future utility for MS diagnosis.

MATERIALS AND METHODS

Study Participants

Individuals were recruited at 9 participating academic medical centers across North America and participated in a single study visit. Inclusion criteria were: 1) age 18 to 65 years, inclusive; 2) referral to an academic site for a new clinical or radiologic suspicion or diagnosis of multiple sclerosis; 3) cranial MRI demonstrating T2 hyperintensities; and 4) ability to provide written informed consent to participate in the study. Exclusion criteria were: 1) use of disease-modifying therapies; 2) treatment with systemic corticosteroids within 4 weeks of enrollment; 3) contraindication to MRI, including inability to tolerate scanning because of claustrophobia or excessive movement related to tremor; and 4) contraindication to gadolinium-containing contrast agents (eg, allergy or renal failure).

Diagnosis Adjudication

Participating neurologists at each site considered clinical evaluation, MRIs, and CSF analysis to determine if participants fulfilled the 2017 McDonald criteria. All participants included in this study received a work-up for a diagnosis of MS. The diagnoses were subsequently adjudicated centrally by 3 neurologists, as previously described.15

MRI Data

3T MRI, including 3D T1-weighted MPRAGE, T2 FLAIR, and T2*-weighted 3D echo-planar imaging (T2* 3D-EPI) sequences,15 was acquired from 86 participants on scanners from 2 MRI vendors. Of the 9 participating sites, 3 centers used Philips scanners (Ingenia, Ingenia Elition X, and Achieva dStream), and 6 centers used Siemens scanners (Skyra [n = 3], Prisma Fit [n = 2], Prisma [n = 1]). The T2* 3D-EPI sequence was acquired after the administration of a macrocyclic gadolinium chelate at a dose of 0.1 mmol/kg. Details regarding the MRI sequences acquired for this study have been previously published13 and are briefly summarized in the Supplemental Data.

Expert CVS Assessment

Guidelines previously described by the North American Imaging in MS Cooperative (NAIMS)4 were employed for manual CVS rating. Raters at each site were trained by using a standardized data set with previously determined CVS+ and CVS- lesions.8 For proportion-based CVS determination, lesion masks were created in ITK-SNAP by using T2 FLAIR images. These masks were then overlaid onto T2* 3D-EPI acquired after gadolinium injection (postcontrast T2*-EPI), and a trained central rater (L.D.), blinded to clinical information, manually reviewed each scan for the presence of the CVS across all lesions. For select6* counts, site raters identified up to 6 CVS+ lesions on FLAIR* contrast for their respective sites.12 Interrater reliability (IRR) was evaluated by using Cohen κ on the CVS+ and CVS- lesion classifications from both raters. IRR assessments and side-by-side receiver operating characteristic (ROC) analysis for the diagnosis of MS are reported in previously published work examining the same data set.16
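For reference, Cohen κ corrects the observed per-lesion agreement for the agreement expected by chance from each rater's label frequencies. A minimal standard-library sketch, using hypothetical CVS+/CVS- labels from two raters (the labels are invented for illustration):

```python
from collections import Counter


def cohen_kappa(a, b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(a)
    # Observed proportion of lesions on which the raters agree.
    p_obs = sum(x == y for x, y in zip(a, b)) / n
    # Agreement expected by chance from each rater's marginal label counts.
    ca, cb = Counter(a), Counter(b)
    p_chance = sum(ca[k] * cb[k] / n**2 for k in set(a) | set(b))
    return (p_obs - p_chance) / (1 - p_chance)


# Hypothetical per-lesion labels (1 = CVS+, 0 = CVS-) from two raters.
rater_a = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
rater_b = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]
kappa = cohen_kappa(rater_a, rater_b)   # ≈ 0.58
```

Here the raters agree on 8 of 10 lesions (0.80), but chance agreement is 0.52, so κ ≈ 0.58, i.e. moderate agreement once chance is removed.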

Image Processing for Automated CVS Assessment

A schematic of the image processing pipeline14 is shown in Fig 1. All images underwent N4 bias correction. T1 MPRAGE images were rigidly aligned to T2 FLAIR images, and these 2 sequences were input into the Method for InterModal Segmentation Analysis (MIMoSA)17,18 to identify white matter lesions. Periventricular lesions were excluded because of the extensive quantity and branching characteristics of veins surrounding the ventricles, following the NAIMS recommendation to exclude lesions with >1 vein or branching veins.4 Confluent lesions were separated by using a previously described technique that identifies and removes voxels connecting distinct lesions.19 Lesion masks were subsequently resampled by using nearest-neighbor interpolation and rigidly aligned to the higher-resolution postcontrast T2*-EPI. All lesion masks and processed images were assessed visually to ensure processing quality, and any participants for whom no lesions were detected by MIMoSA were excluded.
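The lesion-mask handling described above can be illustrated with a toy example. This sketch uses SciPy's connected-component labeling as a stand-in for identifying distinct lesions once confluent lesions have been split; it does not reproduce MIMoSA or the published voxel-removal splitting technique.

```python
import numpy as np
from scipy import ndimage

# Toy binary lesion mask (1 = lesion voxel); in the real pipeline this
# would come from MIMoSA segmentation of T1 MPRAGE + T2 FLAIR inputs.
mask = np.zeros((10, 10), dtype=int)
mask[1:3, 1:3] = 1       # lesion A (4 voxels)
mask[6:9, 6:9] = 1       # lesion B (9 voxels)

# Connected-component labeling assigns each distinct lesion an integer ID,
# so per-lesion quantities (eg, vein probability) can be computed separately.
labels, n_lesions = ndimage.label(mask)
sizes = ndimage.sum(mask, labels, index=range(1, n_lesions + 1))
```

After labeling, each lesion can be iterated over independently, which is the precondition for the per-lesion vein assessment in the next step of the pipeline.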

FIG 1.

Visualization of the image processing pipeline used from the automated CVS detection method described by Dworkin et al 2018.14 Lesion segmentation (pink box): Red indicates automated lesion segmentation masks generated by using MIMoSA overlaid on FLAIR images. Image processing (green box): T1 MPRAGE (left) and postcontrast T2*-EPI (right) postprocessed images. Vein detection (yellow box): Axial (left) and sagittal (right) views of vesselness maps generated by using the Frangi filter and postcontrast T2*-EPI. Permutation procedure (blue box): Red indicates CVS probability maps generated by the automated method10 overlaid on axial T1 MPRAGE images.

Automated central vein detection14 was then employed to assess the degree of vein presence at the center of each lesion by integrating information from the T1 MPRAGE, T2 FLAIR, and postcontrast T2*-EPI. Vesselness maps were created from the postcontrast T2*-EPI by using a Frangi filter,20 and lesion centers were determined by using a previously published automated method.19 This lesion center detection technique works by identifying the regions of the lesion-probability maps that most closely resemble the texture of lesion centers. Clustering of voxels with high vesselness scores at the determined lesion center may be suggestive of a central vein. To confirm these proposed central veins, a spatial permutation procedure based on the generated vesselness maps was used to determine whether vein presence at the center of a given lesion occurred more often than would be expected by chance. The associated probability of vein presence at the center of each lesion was estimated and averaged across lesions within a participant to yield a participant-level CVS score.
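The permutation step can be loosely sketched as follows. This is a simplified approximation under assumed inputs (a synthetic vesselness map and a known lesion center), not the exact procedure of the published method; a real pipeline would derive the vesselness map from the postcontrast T2*-EPI with a Frangi filter (eg, skimage.filters.frangi).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a Frangi vesselness map: low background values
# with a bright "vein" running through the lesion.
vesselness = rng.random((20, 20)) * 0.1
vesselness[10, 5:15] = 0.9

# Assumed lesion extent and detected lesion center (both invented here).
lesion_voxels = [(r, c) for r in range(8, 13) for c in range(8, 13)]
center = (10, 10)


def vein_probability(vmap, voxels, center, n_perm=1000):
    """Spatial permutation test: how often does vesselness at the detected
    lesion center exceed vesselness at randomly drawn lesion voxels?"""
    observed = vmap[center]
    draws = rng.integers(len(voxels), size=n_perm)
    null = np.array([vmap[voxels[i]] for i in draws])
    return float((null < observed).mean())


p = vein_probability(vesselness, lesion_voxels, center)
# Averaging such per-lesion probabilities across a participant's lesions
# yields a participant-level CVS score, as described in the text.
```

Because the center sits on the synthetic vein, the observed vesselness beats most random lesion voxels, and p is high; for a lesion center off any vein, p would hover near the chance level.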

Statistical Analysis

All statistical analyses were conducted in the R software environment, assumed a 5% type I error rate, and employed 2-sided hypothesis tests. ROC analysis was used to assess the performance of the automated pipeline compared with manual CVS rating. The area under the curve (AUC) was determined for the automated method, for the total proportion of CVS+ lesions assessed by the central rater, and for the select6* assessments performed at each participating site. DeLong tests were conducted by using the pROC package21 to compare the diagnostic performances of the CVS assessment methods.
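For intuition, the AUC being compared here is equivalent to the Mann-Whitney probability that a randomly chosen participant with MS receives a higher CVS score than a randomly chosen participant without MS. A standard-library sketch with hypothetical participant-level scores (the study itself used the pROC package in R):

```python
def auc(ms_scores, non_ms_scores):
    """Mann-Whitney formulation of the AUC: P(MS score > non-MS score),
    counting ties as one-half."""
    pairs = len(ms_scores) * len(non_ms_scores)
    wins = sum((m > n) + 0.5 * (m == n)
               for m in ms_scores for n in non_ms_scores)
    return wins / pairs


# Hypothetical automated participant-level CVS scores.
ms = [0.9, 0.8, 0.7, 0.4]
non_ms = [0.5, 0.3, 0.2, 0.1]
print(auc(ms, non_ms))   # 0.9375
```

An AUC of 0.78, as reported for the automated method, means that in roughly 78% of MS/non-MS pairs the MS participant has the higher automated CVS score.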

Quality Control Exclusion

To assess the performance of the automated detection method in cases of suboptimal image quality, the analysis was conducted both before and after the exclusion of scans with extensive motion artifact. Two research specialists with greater than 2 years of experience (A.R.M., E.G.) visually assessed all scans for image quality. Images were rated on a previously defined scale,14 and images displaying a poor signal-to-noise ratio or at least 1 severe artifact obstructing any vessel in the supratentorial white matter were excluded in a sensitivity analysis. Examples of motion artifact and illustrations of CVS identification on such images can be seen in the Supplemental Data.

RESULTS

Study Sample

Images from 86 adults (66 women) with a suspicion of MS were included in the study. Mean age was 45 years (standard deviation [SD] = 12), and average time since symptom onset was 4.2 years (SD = 6.4 years). The median Expanded Disability Status Scale (EDSS)22 score was 1.5 (range = 0–4). In total, 5250 lesions were identified in our study sample by an expert rater (L.D.) (median = 39, range = 0–232), with 2097 (median = 43, range = 0–192) identified in the MS group and 3153 (median = 38, range = 0–232) in the non-MS group. Additional demographic information is available in the Table.


Brief description of demographic information for the 86 participants included in this study

Data from 9 sites, with a median recruitment of 10 participants per site (range = 4–12), were included in our analysis, after the exclusion of 7 participant scans due to image processing failures. Examples of encountered image processing failures include failed quality assessment by an MRI physicist (P.S.), absence of post-Gd imaging, and excessive image artifact preventing initial image processing. MS diagnosis was ascertained by local site principal investigators in the 79 participants included in the analysis. Subsequently, a diagnosis of MS was adjudicated in 28 participants. Individuals determined to have clinically isolated syndrome or radiologically isolated syndrome were included in the non-MS group. A schematic illustrating the inclusion and exclusion of participants in these analyses can be seen in Fig 2.

FIG 2.

Brief schematic of the subject inclusion and exclusion in these analyses. n/a = not applicable.

Automated Lesion Detection

No lesions were detected in 2 participants by the automated pipeline, despite the presence of lesions on their MRIs based on visual assessment. These participants were excluded from the primary analyses. In a sensitivity analysis, these participants were alternatively labeled as not exhibiting the CVS, with an automated score of zero, and results from this analysis are presented in the Supplemental Data.

Quality Control Exclusion

Visual inspection indicated that extensive artifact due to subject motion was present on the MRIs of 11 participants. In a sensitivity analysis, these participants were removed; 4 of the 11 met the 2017 McDonald criteria at the time of scanning.

Diagnostic Performance of the CVS

In the 79 participants included in this study, the automated method discriminated between participants with and without MS with an AUC of 0.78 (95% CI: [0.67,0.88]). This was comparable to the manual assessments of select6* (0.80 [0.69,0.91]) and total CVS+ proportion evaluation (0.89 [0.82,0.96]). There was no statistically significant difference in performance between any of the detection methods included in our analysis. The ROC curves associated with each CVS assessment method are shown in Fig 3.

FIG 3.

Left: ROC curve before 11 participants were removed because of motion on postcontrast T2*-EPI (n = 79). Twenty-eight participants met the 2017 McDonald criteria. Right: ROC curve after removal of 11 participants because of extensive motion on postcontrast T2* images (n = 68). Twenty-four participants met 2017 McDonald criteria. For both ROC curves, automated results are displayed in green, Select6* counts in red, and total CVS+ proportions in black. AUC values, with 95% CI, determined for each method with and without motion exclusion are included with each ROC curve.

Results were comparable after the exclusion of the 11 images that displayed extensive motion artifact; the automated CVS approach discriminated between participants with and without MS with an AUC of 0.81 [0.70,0.91]. Similar performance was observed by using select6* (0.79 [0.68,0.92]) and the total proportion of CVS+ lesions (0.89 [0.81,0.97]). These results, including AUC values with and without motion exclusion, are displayed in Fig 3.

DISCUSSION

In this study, we compared the diagnostic performance of an automated CVS detection pipeline14 with CVS determinations assessed by trained raters via select6* counts and criterion-standard total CVS+ proportion. We found that fully automated CVS detection demonstrated good discriminative ability between patients with and without MS in a multicenter study. Additionally, because the main analysis and the sensitivity analysis did not differ significantly, the image quality required by this algorithm appears unlikely to be a limitation in clinical practice. These results address a critical issue in translating this imaging biomarker into clinical practice by eliminating a time-intensive manual imaging assessment that is susceptible to interrater reliability limitations.

The approach utilized in this study may be of particular interest when compared with both automated and manual alternative methods for CVS detection, in part because of the consistency of its performance across study sites and MRI vendors. While other automated techniques have shown promise in demonstrating the difference in the proportion of CVS+ lesions between MS and its common mimics,23 the method explored in this paper does not require manual lesion segmentation. A direct comparison with these alternative methods would be of interest, but their lack of public or commercial availability currently precludes such a study. It is worth noting that alternative methods may eventually incorporate automated segmentation; however, such performance has not yet been documented.

Additionally, because both image preprocessing and lesion segmentation are performed within the pipeline and no additional parameter tuning is required for the remaining steps of the algorithm, performance is largely independent of user-specific factors. These features may further encourage the incorporation of the CVS, alongside the standing MRI criteria for dissemination in time and space, into the diagnostic criteria for MS. Not only could this inclusion provide an additional pathway to diagnosis for individuals with atypical presentations of MS,24 but it may also be of increasing interest given the recent finding of increased microstructural damage in CVS+ MS lesions.25

Further, it should be noted that in this study, as in others referenced throughout this paper, CVS ratings were determined by trained raters. Meeting this standard may be challenging in the broader community, given the forecasted shortage of specialized neurologists and radiologists.26 The contribution of specialized training to error and variance in manual assessment is not yet well understood. These factors highlight the need for standardization and automated assessment of the CVS in practice.

This study is not without limitations. Prior studies have found CVS probability values obtained with automated methods to be lower than previously reported CVS+ proportion values for patients with MS and higher for those without MS.14 While it has been theorized that this effect could be due to false-positive CVS lesions introduced by the automated segmentation method, incorrect manual CVS assessment may also play a role. Further, as demonstrated by the 2 cases with false-negative lesion detection in this study, automated lesion segmentation may struggle in cases with low lesion burden. While such cases appear infrequent (2 of 79 in our cohort), specifically targeting cases with these characteristics may be of particular interest to future studies.

We anticipate that this automated process will maintain a similar level of performance as it is trained on more data; however, we cannot rule out accuracy drift over time.

Additionally, while recent findings suggest an increase in the sensitivity of the CVS for MS diagnosis with the administration of a gadolinium-based contrast agent (GBCA),15 we expect this automated method to maintain a similar level of performance without the use of GBCAs, as well as with the use of SWI. Exploring this prediction may be a good direction for future study. Finally, the CVS has been proposed as a useful tool for predicting MS evolution and manifestation,27 in addition to its more established diagnostic use such as the 40% rule, in which an MS diagnosis can be confidently favored if >40% of lesions demonstrate a CVS.5 It should be noted, however, that CVS+ lesions are not exclusive to a diagnosis of MS. Although less prevalent than in MS, CVS+ lesions have been linked to other neurologic diseases, including neuromyelitis optica spectrum disorder, Susac syndrome, cerebral small vessel disease, and additional systemic autoimmune diseases.28-32
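
The 40% rule mentioned above reduces to a simple threshold on a participant's CVS+ lesion proportion. The toy sketch below shows how such a rule would be applied; the function name and the example lesion-level flags are hypothetical illustrations, not data or code from the study.

```python
# Toy illustration of the "40% rule": favor an MS diagnosis when more than
# 40% of a participant's lesions show a central vein. The function name and
# the example lesion flags are hypothetical, not from the study.
def favors_ms(cvs_flags, threshold=0.40):
    """Return True if the CVS+ proportion exceeds `threshold`."""
    if not cvs_flags:
        return False  # no eligible lesions: rule not applicable
    return sum(cvs_flags) / len(cvs_flags) > threshold

# 3 of 5 lesions CVS+ (60%) -> rule favors MS
print(favors_ms([True, True, False, True, False]))   # True
# 1 of 5 lesions CVS+ (20%) -> rule does not favor MS
print(favors_ms([False, True, False, False, False])) # False
```

Because other diseases can also produce CVS+ lesions, any such threshold supports, rather than replaces, the full diagnostic work-up.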

The methods explored in this paper show promise for the diagnostic value and standardization of CVS detection in practice. Further optimization of this technique and translation of these research methods into practice in the broader community should be the focus of future work. Additionally, the correlations between the CVS and other variables, such as lesion load, atrophy, and EDSS scores, may be a future area of interest considering the link between CVS+ MS lesions and increased levels of tissue damage.25 While the results from this pilot study were encouraging, it is important to acknowledge the limitation of the small sample size. For this reason, future studies may expand upon these findings by incorporating larger study samples and multiple time points for imaging acquisition and diagnosis adjudication. Finally, large prospective multicenter studies that include the breadth of disorders referred for suspected MS are needed to determine optimal approaches for utilizing the CVS as a diagnostic biomarker in MS.

Footnotes

  • R.T. Shinohara and P. Sati contributed equally to this article.

  • This research has been supported by the Intramural Research Program of the NINDS, National Institutes of Health (NIH). Research reported in this publication was supported by the NIH under award numbers R01NS112274, R01MH123550, R01MH112847, and U01NS116776.

  • Disclosure forms provided by the authors are available with the full text and PDF of this article at www.ajnr.org.

References

1. Solomon AJ, Arrambide G, Brownlee WJ, et al. Differential diagnosis of suspected multiple sclerosis: an updated consensus approach. Lancet Neurol 2023;22:750–68. doi:10.1016/S1474-4422(23)00148-5. pmid:37479377
2. Solomon AJ, Bourdette DN, Cross AH, et al. The contemporary spectrum of multiple sclerosis misdiagnosis: a multicenter study. Neurology 2016;87:1393–99. doi:10.1212/WNL.0000000000003152. pmid:27581217
3. Fog T. On the vessel-plaque relationships in the brain in multiple sclerosis. Acta Neurol Scand Suppl 1963;39(4):258–62. doi:10.1111/j.1600-0404.1963.tb01841.x. pmid:14057511
4. Sati P, Oh J, Constable RT, et al; NAIMS Cooperative. The central vein sign and its clinical evaluation for the diagnosis of multiple sclerosis: a consensus statement from the North American Imaging in Multiple Sclerosis Cooperative. Nat Rev Neurol 2016;12:714–22. doi:10.1038/nrneurol.2016.166. pmid:27834394
5. George IC, Sati P, Absinta M, et al. Clinical 3-Tesla FLAIR* MRI improves diagnostic accuracy in multiple sclerosis. Mult Scler 2016;22:1578–86. doi:10.1177/1352458515624975. pmid:26769065
6. Suh CH, Kim SJ, Jung SC, et al. The “central vein sign” on T2*-weighted images as a diagnostic tool in multiple sclerosis: a systematic review and meta-analysis using individual patient data. Sci Rep 2019;9:18188. doi:10.1038/s41598-019-54583-3. pmid:31796822
7. Castellaro M, Tamanti A, Pisani AI, et al. The use of the central vein sign in the diagnosis of multiple sclerosis: a systematic review and meta-analysis. Diagnostics (Basel) 2020;10:1025. doi:10.3390/diagnostics10121025
8. Solomon AJ, Watts R, Ontaneda D, et al. Diagnostic performance of central vein sign for multiple sclerosis with a simplified three-lesion algorithm. Mult Scler 2018;24:750–57. doi:10.1177/1352458517726383. pmid:28820013
9. Campion T, Smith RJP, Altmann DR, et al. FLAIR* to visualize veins in white matter lesions: a new tool for the diagnosis of multiple sclerosis? Eur Radiol 2017;27:4257–63. doi:10.1007/s00330-017-4822-z. pmid:28409356
10. Sinnecker T, Clarke MA, Meier D, et al; MAGNIMS Study Group. Evaluation of the central vein sign as a diagnostic imaging biomarker in multiple sclerosis. JAMA Neurol 2019;76:1446–56. doi:10.1001/jamaneurol.2019.2478. pmid:31424490
11. Mistry N, Abdel-Fahim R, Samaraweera A, et al. Imaging central veins in brain lesions with 3-T T2*-weighted magnetic resonance imaging differentiates multiple sclerosis from microangiopathic brain lesions. Mult Scler 2016;22:1289–96. doi:10.1177/1352458515616700. pmid:26658816
12. Sati P, George IC, Shea CD, et al. FLAIR*: a combined MR contrast technique for visualizing white matter lesions and parenchymal veins. Radiology 2012;265:926–32. doi:10.1148/radiol.12120208. pmid:23074257
13. Ontaneda D, Sati P, Raza P, et al; North American Imaging in MS Cooperative. Central vein sign: a diagnostic biomarker in multiple sclerosis (CAVS-MS) study protocol for a prospective multicenter trial. Neuroimage Clin 2021;32:102834. doi:10.1016/j.nicl.2021.102834. pmid:34592690
14. Dworkin J, Sati P, Solomon A, et al. Automated integration of multimodal MRI for the probabilistic detection of the central vein sign in white matter lesions. AJNR Am J Neuroradiol 2018;39:1806–13. doi:10.3174/ajnr.A5765. pmid:30213803
15. Daboul L, O’Donnell CM, Cao Q, et al. Effect of GBCA use on detection and diagnostic performance of the central vein sign: evaluation using a 3-T FLAIR* sequence in patients with suspected multiple sclerosis. AJR Am J Roentgenol 2023;220:115–25. doi:10.2214/AJR.22.27731. pmid:35975888
16. Daboul L, O’Donnell CM, Amin M, et al. A multicenter pilot study evaluating simplified central vein assessment for the diagnosis of multiple sclerosis. Mult Scler 2024;30:25–34. doi:10.1177/13524585231214360. pmid:38088067
17. Valcarcel AM, Linn KA, Khalid F, et al. A dual modeling approach to automatic segmentation of cerebral T2 hyperintensities and T1 black holes in multiple sclerosis. Neuroimage Clin 2018;20:1211–21. doi:10.1016/j.nicl.2018.10.013. pmid:30391859
18. Valcarcel AM, Linn KA, Vandekar SN, et al. MIMoSA: an automated method for intermodal segmentation analysis of multiple sclerosis brain lesions. J Neuroimaging 2018;28:389–98. doi:10.1111/jon.12506. pmid:29516669
19. Dworkin JD, Linn KA, Oguz I, et al; North American Imaging in Multiple Sclerosis Cooperative. An automated statistical technique for counting distinct multiple sclerosis lesions. AJNR Am J Neuroradiol 2018;39:626–33. doi:10.3174/ajnr.A5556. pmid:29472300
20. Frangi AF, Niessen WJ, Vincken KL, et al. Multiscale vessel enhancement filtering. In: Wells WM, Colchester A, Delp S, eds. Proceedings of Medical Image Computing and Computer-Assisted Intervention (MICCAI’98); October 11–18, 1998; Cambridge, Massachusetts. Springer.
21. Robin X, Turck N, Hainard A, et al. pROC: an open-source package for R and S+ to analyze and compare ROC curves. BMC Bioinformatics 2011;12:77. doi:10.1186/1471-2105-12-77. pmid:21414208
22. Kurtzke JF. Rating neurologic impairment in multiple sclerosis: an expanded disability status scale (EDSS). Neurology 1983;33:1444–52. doi:10.1212/wnl.33.11.1444. pmid:6685237
23. Maggi P, Fartaria MJ, Jorge J, et al. CVSnet: a machine learning approach for automated central vein sign assessment in multiple sclerosis. NMR Biomed 2020;33:e4283. doi:10.1002/nbm.4283. pmid:32125737
24. Ontaneda D, Cohen JA, Sati P. Incorporating the central vein sign into the diagnostic criteria for multiple sclerosis. JAMA Neurol 2023;80:657–58. doi:10.1001/jamaneurol.2023.0717. pmid:37067820
25. Levasseur VA, Xiang B, Salter A, et al. Stronger microstructural damage revealed in multiple sclerosis lesions with central vein sign by quantitative gradient echo MRI. J Cent Nerv Syst Dis 2022;14:11795735221084842. doi:10.1177/11795735221084842. pmid:35370433
26. Zhang X, Lin D, Pforsich H, et al. Physician workforce in the United States of America: forecasting nationwide shortages. Hum Resour Health 2020;18:8. doi:10.1186/s12960-020-0448-3. pmid:32029001
27. Abou Mrad T, Naja K, Khoury SJ, et al. Central vein sign and paramagnetic rim sign: from radiologically isolated syndrome to multiple sclerosis. Eur J Neurol 2023;30:2912–18. doi:10.1111/ene.15922. pmid:37350369
28. Kister I, Herbert J, Zhou Y, et al. Ultrahigh-field MR (7 T) imaging of brain lesions in neuromyelitis optica. Mult Scler Int 2013;2013:398259. doi:10.1155/2013/398259. pmid:23431447
29. Massacesi L. Evaluation by brain MRI of white matter perivenular lesions in inflammatory microangiopathic ischemia and in demyelinating multiple sclerosis lesions [abstract O2217]. Eur J Neurol 2016;23:86
30. Lummel N, Boeckh-Behrens T, Schoepf V, et al. Presence of a central vein within white matter lesions on susceptibility weighted imaging: a specific finding for multiple sclerosis? Neuroradiology 2011;53:311–17. doi:10.1007/s00234-010-0736-z. pmid:20585764
31. Wuerfel J, Sinnecker T, Ringelstein EB, et al. Lesion morphology at 7 Tesla MRI differentiates Susac syndrome from multiple sclerosis. Mult Scler 2012;18:1592–99. doi:10.1177/1352458512441270. pmid:22711711
32. Solomon AJ, Schindler MK, Howard DB, et al. Central vessel sign on 3T FLAIR* MRI for the differentiation of multiple sclerosis from migraine. Ann Clin Transl Neurol 2016;3:82–87. doi:10.1002/acn3.273. pmid:26900578
  • Received January 28, 2024.
  • Accepted after revision September 3, 2024.
  • © 2025 by American Journal of Neuroradiology
Cite this article
A.R. Manning, V. Letchuman, M.L. Martin, E. Gombos, T. Robert-Fitzgerald, Q. Cao, P. Raza, C.M. O’Donnell, B. Renner, L. Daboul, P. Rodrigues, M. Ramos, J. Derbyshire, C. Azevedo, A. Bar-Or, E. Caverzasi, P.A. Calabresi, B.A.C. Cree, L. Freeman, R.G. Henry, E.E. Longbrake, J. Oh, N. Papinutto, D. Pelletier, R.D. Samudralwar, S. Suthiphosuwan, M.K. Schindler, M. Bilello, J.W. Song, E.S. Sotirchos, N.L. Sicotte, O. Al-Louzi, A.J. Solomon, D.S. Reich, D. Ontaneda, P. Sati, R.T. Shinohara, the NAIMS Cooperative
Multicenter Automated Central Vein Sign Detection Performs as Well as Manual Assessment for the Diagnosis of Multiple Sclerosis
American Journal of Neuroradiology Mar 2025, 46 (3) 620-626; DOI: 10.3174/ajnr.A8510
