Symptoms, endoscopy and histology have been proposed as therapeutic targets in ulcerative colitis (UC). Observational studies suggest that the achievement of histologic remission may be associated with a lower risk of complications, compared with the achievement of endoscopic remission alone. The actiVE ulcerative colitis, a RanDomIsed Controlled Trial (VERDICT), aims to determine the optimal treatment target in patients with UC.
In this multicentre, prospective randomised study, 660 patients with moderate to severe UC (Mayo rectal bleeding subscore [RBS] ≥1; Mayo endoscopic score [MES] ≥2) are randomly assigned to three treatment targets: corticosteroid-free symptomatic remission (Mayo RBS=0) (group 1); corticosteroid-free endoscopic remission (MES ≤1) and symptomatic remission (group 2); or corticosteroid-free histologic remission (Geboes score <2B.0), endoscopic remission and symptomatic remission (group 3). Treatment is escalated using vedolizumab according to a treatment algorithm that is dependent on the patient’s baseline UC therapy until the target is achieved at weeks 16, 32 or 48. The primary outcome, the time from target achievement to a UC-related complication, will be compared between groups 1 and 3 using a Cox proportional hazards model.
The study was approved by ethics committees at the country level or at individual sites as per individual country requirements. A full list of ethics committees is available on request. Study results will be disseminated in peer-reviewed journals and at scientific meetings.
EudraCT: 2019-002485-12; NCT04259138.
The study aimed to compare the risk of gastrointestinal infections among patients with and without metabolic dysfunction-associated fatty liver disease (MAFLD).
This was a population-based, retrospective, observational study using data from the National Inpatient Sample (NIS), the largest all-payer US inpatient care database.
Hospitalisations of adults aged ≥18 years admitted in 2020 were identified using the NIS. Patients were stratified by the presence or absence of MAFLD.
In total, 26.4 million adults aged ≥18 years were included in the study. Patients younger than 18 years and those with missing demographic or mortality data were excluded.
Primary outcome was to assess the overall risk of gastrointestinal infections in patients with and without MAFLD. Secondary outcomes were demographics and comorbidities stratified by the presence or absence of gastrointestinal infection, and the risk of specific gastrointestinal pathogens.
Of 26.4 million patients admitted in 2020, 755 910 (2.85%) had MAFLD. There was a higher prevalence of bacterial gastrointestinal infections in patients with MAFLD than in those without (1.6% vs 0.9%, p<0.001). The incidence of Clostridioides difficile (1.3% vs 0.8%, p<0.001), Escherichia coli (0.3% vs 0.01%, p<0.001) and Salmonella (0.07% vs 0.03%, p<0.001) was higher in patients with MAFLD. The presence of MAFLD was associated with higher odds of developing gastrointestinal infections (adjusted OR (aOR) 1.75, 95% CI 1.68 to 1.83, p<0.001). After adjusting for confounders, results remained statistically significant (aOR 1.36, 95% CI 1.30 to 1.42, p<0.001).
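As a hedged illustration of the arithmetic behind such estimates, the sketch below computes a crude (unadjusted) odds ratio with a Woolf 95% CI from a 2×2 table, using counts reconstructed approximately from the reported prevalences (1.6% vs 0.9%). The study's aORs come from multivariable models adjusting for confounders, so the exact values differ.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio with Woolf (log) 95% CI from a 2x2 table.
    a: exposed cases, b: exposed non-cases, c: unexposed cases, d: unexposed non-cases."""
    or_ = (a / b) / (c / d)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Approximate counts reconstructed from the reported prevalences (illustrative only)
a = 12095            # MAFLD with GI infection (~1.6% of 755 910)
b = 755910 - a       # MAFLD without GI infection
c = 230797           # no MAFLD with GI infection (~0.9% of ~25.6 million)
d = 25644090 - c
or_, lo, hi = odds_ratio_ci(a, b, c, d)
print(f"crude OR {or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

With counts this large the CI is very tight; the multivariable adjustment in the study attenuates the crude estimate towards the reported aOR of 1.36.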
Even after adjusting for confounding factors, our study demonstrates an increased risk of gastrointestinal infections in patients with MAFLD, specifically of C. difficile, E. coli, and Salmonella. The immune and microbiota changes seen within MAFLD potentially contribute to the increased risk of gastrointestinal infections.
Colorectal cancer (CRC) is often accompanied by increased excretion of hydrogen sulfide (H2S). This study aimed to explore the value of exhaled H2S in the diagnosis of CRC.
A total of 80 people with normal colonoscopy results and 57 patients with CRC were enrolled in the present observational cohort study. Exhaled oral and nasal H2S was measured with a Nanocoulomb breath analyser, and results were compared between the two groups. Receiver operating characteristic (ROC) curves were analysed and areas under the curve (AUCs) were calculated to assess the diagnostic value of exhaled H2S. The clinicopathological features of patients with CRC, including gender, lesion location and tumour staging, were also collected and analysed.
The amount of exhaled H2S from patients with CRC was significantly higher than that from those with normal colonoscopy results. The ROC curves showed AUC values of 0.73 and 0.71 for oral and nasal H2S detection, respectively. Exhaled H2S in patients with CRC correlated with gender, lesion location and tumour progression, including depth of invasion, lymphatic metastasis and TNM (Tumour, Lymph Nodes, Metastasis) staging.
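The AUC reported here has a simple rank interpretation: the probability that a randomly chosen patient with CRC has a higher exhaled H2S reading than a randomly chosen control (ties counting one half). A minimal sketch with hypothetical H2S values, not the study's data:

```python
def auc(cases, controls):
    """Empirical AUC as a Mann-Whitney rank statistic: the probability that
    a randomly chosen case outscores a randomly chosen control (ties count 0.5)."""
    wins = sum((x > y) + 0.5 * (x == y) for x in cases for y in controls)
    return wins / (len(cases) * len(controls))

# Hypothetical exhaled-H2S readings (arbitrary units) for illustration
crc_patients = [9, 7, 8, 6, 5]
controls = [4, 6, 3, 5, 2]
print(auc(crc_patients, controls))  # 0.92 for these made-up values
```

An AUC of 0.73, as reported for oral detection, means a CRC patient outscores a control in roughly three of every four random pairings.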
Exhaled H2S analysis is a convenient and non-invasive detection method for diagnosing CRC, suggesting a potential role in population screening for CRC.
The environmental trigger behind the increasing prevalence of coeliac disease is not known. One suggested cause is iron deficiency, which is common in coeliac disease. We aimed to evaluate this possible association with Mendelian randomisation (MR), which under certain assumptions can suggest a causal relationship.
We conducted a two-sample MR study examining the relationship between single nucleotide polymorphisms (SNPs) associated with iron status and the presence of coeliac disease. The SNPs were drawn from a meta-analysis of three genome-wide association studies (GWAS). The association between these SNPs and coeliac disease was assessed using GWAS summary statistics from the UK Biobank, a cohort of 336 638 white British individuals, 1855 of whom have coeliac disease. We performed an MR Egger test for pleiotropy and assessed the plausibility of the assumptions of MR to evaluate possible causality.
There were four SNPs strongly associated with systemic iron status. These were not associated with known risk factors for coeliac disease. All four SNPs were available in the UK Biobank coeliac disease summary statistics. Harmonising exposure and outcome associations, we found that higher iron status was negatively associated with risk of coeliac disease (OR per 1 SD increase in serum iron: 0.65, 95% CI 0.47 to 0.91). Leave-one-out analyses had consistent results, and no single SNP drove the association. All three assumptions of MR appeared plausible.
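The standard first-line estimator for combining per-SNP effects in two-sample MR is the fixed-effect inverse-variance-weighted (IVW) average of Wald ratios. A minimal sketch with hypothetical summary statistics (the actual GWAS betas are not given in this abstract):

```python
import math

def ivw(snps):
    """Fixed-effect inverse-variance-weighted MR estimate.
    snps: list of (beta_exposure, beta_outcome, se_outcome) tuples, one per SNP."""
    num = den = 0.0
    for bx, by, se_y in snps:
        ratio = by / bx           # Wald ratio for this SNP
        se = se_y / abs(bx)       # first-order SE of the ratio
        w = 1.0 / se ** 2         # inverse-variance weight
        num += w * ratio
        den += w
    beta = num / den
    se_beta = math.sqrt(1.0 / den)
    return beta, se_beta

# Hypothetical summary statistics for four iron-status SNPs (not the actual values)
snps = [(0.32, -0.14, 0.05), (0.19, -0.08, 0.04),
        (0.25, -0.11, 0.06), (0.40, -0.16, 0.05)]
beta, se = ivw(snps)
print(f"OR per 1 SD serum iron: {math.exp(beta):.2f}")
```

Exponentiating the pooled beta gives the OR per 1 SD increase in serum iron; with these made-up inputs the result happens to land near the study's 0.65.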
We found that genetically lower iron levels were associated with an increased risk of coeliac disease. Our findings highlight a potential opportunity for coeliac disease prevention.
Despite research, there are still controversial areas in the management of Crohn’s disease (CD).
To establish practical recommendations on using anti-tumour necrosis factor (TNF) drugs in patients with moderate-to-severe CD.
Clinical controversies in the management of CD using anti-TNF therapies were identified. A comprehensive literature review was performed, and a national survey was launched to examine current clinical practices when using anti-TNF therapies. Their results were discussed by expert gastroenterologists within a nominal group meeting, and a set of statements was proposed and tested in a Delphi process.
This was a qualitative study. The survey and Delphi process were sent to 244 CD-treating physicians (response rate: 58%). A total of 14 statements were generated, all but two of which achieved agreement. These statements cover: (1) use of first-line non-anti-TNF biological therapy; (2) role of HLA-DQA1*05 in daily practice; (3) attitudes to primary non-response and loss of response to anti-TNF therapy due to immunogenicity; (4) use of ustekinumab or vedolizumab if a change in mechanism of action is warranted; (5) anti-TNF drug level monitoring; and (6) combined therapy with an immunomodulator.
This document sought to pull together the best evidence, experts’ opinions, and treating physicians’ attitudes when using anti-TNF therapies in patients with CD.
Colorectal cancer (CRC) has a significant role in cancer-related mortality. Colonoscopy, combined with adenoma removal, has proven effective in reducing CRC incidence. However, suboptimal colonoscopy quality often leads to missed polyps. The impact of artificial intelligence (AI) on adenoma and polyp detection rate (ADR, PDR) is yet to be established.
We conducted a randomised controlled trial at Sahlgrenska University Hospital in Sweden. Patients underwent colonoscopy with or without the assistance of AI (AI-C or conventional colonoscopy (CC)). Examinations were performed with two different AI systems, that is, Fujifilm CADEye and Medtronic GI Genius. The primary outcome was ADR.
Among 286 patients, 240 underwent analysis (average age: 66 years). The ADR was 42% for all patients, with no significant difference between the AI-C and CC groups (41% vs 43%). The overall PDR was 61%, with a trend towards higher PDR in the AI-C group. Subgroup analysis revealed higher detection rates for sessile serrated lesions (SSL) with AI assistance (AI-C 22%, CC 11%, p=0.004). No difference was noted in the number of polyps or adenomas detected per colonoscopy. Most examinations (78%) were performed by experienced endoscopists (n=86 AI-C, n=100 CC).
Amidst the ongoing integration of AI, ADR did not improve with AI assistance. Particularly noteworthy are the enhanced detection rates for SSL with AI assistance, especially since these lesions pose a risk for postcolonoscopy CRC. The integration of AI into standard colonoscopy practice warrants further investigation, and improved software may be necessary before mandatory implementation is considered.
Coeliac disease (CD) diagnosis generally depends on histological examination of duodenal biopsies. We present the first study analysing the concordance in examination of duodenal biopsies using digitised whole-slide images (WSIs). We further investigate whether the inclusion of immunoglobulin A tissue transglutaminase (IgA tTG) and haemoglobin (Hb) data improves the interobserver agreement of diagnosis.
We undertook a large study of the concordance in histological examination of duodenal biopsies using digitised WSIs in an entirely virtual reporting setting. Our study was organised in two phases: in phase 1, 13 pathologists independently classified 100 duodenal biopsies (40 normal; 40 CD; 20 indeterminate enteropathy) in the absence of any clinical or laboratory data. In phase 2, the same pathologists examined the (re-anonymised) WSIs with the inclusion of IgA tTG and Hb data.
We found the mean probability of two observers agreeing in the absence of additional data to be 0.73 (±0.08) with a corresponding Cohen’s kappa of 0.59 (±0.11). We further showed that the inclusion of additional data increased the concordance to 0.80 (±0.06) with a Cohen’s kappa coefficient of 0.67 (±0.09).
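Cohen's kappa, as used here, adjusts the observed pairwise agreement for the agreement expected by chance given each observer's marginal label frequencies. A minimal sketch with two hypothetical pathologists classifying ten slides (labels and data are illustrative, not the study's):

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters: (p_o - p_e) / (1 - p_e)."""
    assert len(r1) == len(r2)
    n = len(r1)
    p_o = sum(a == b for a, b in zip(r1, r2)) / n          # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum((c1[k] / n) * (c2[k] / n) for k in set(c1) | set(c2))  # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Hypothetical classifications: N = normal, C = coeliac disease, I = indeterminate
rater1 = ['N', 'N', 'N', 'N', 'C', 'C', 'C', 'C', 'I', 'I']
rater2 = ['N', 'N', 'N', 'C', 'C', 'C', 'C', 'I', 'I', 'I']
print(round(cohens_kappa(rater1, rater2), 3))
```

Here the raters agree on 8/10 slides (p_o = 0.80), but roughly a third of that agreement is expected by chance, so kappa is noticeably lower than the raw agreement, mirroring the 0.73 vs 0.59 gap reported in phase 1.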
We showed that the addition of serological data significantly improves the quality of CD diagnosis. However, the limited interobserver agreement in CD diagnosis using digitised WSIs, even after the inclusion of IgA tTG and Hb data, indicates the importance of interpreting duodenal biopsy in the appropriate clinical context. It further highlights the unmet need for an objective means of reproducible duodenal biopsy diagnosis, such as the automated analysis of WSIs using artificial intelligence.
This study aimed to develop and validate robust predictive models for patients with oesophageal cancer who achieved a pathological complete response (pCR) and those who did not (non-pCR) after neoadjuvant therapy and oesophagectomy.
Clinicopathological data of 6517 primary oesophageal cancer patients who underwent neoadjuvant therapy and oesophagectomy were obtained from the National Cancer Database for the training cohort. An independent cohort of 444 Chinese patients served as the validation set. Two distinct multivariable Cox models of overall survival (OS) were constructed for pCR and non-pCR patients, respectively, and were presented using web-based dynamic nomograms (graphical representation of predicted OS based on the clinical characteristics that a patient could input into the website). The calibration plot, concordance index and decision curve analysis were employed to assess calibration, discrimination and clinical usefulness of the predictive models.
In total, 13 and 15 variables were used to predict OS for pCR and non-pCR patients undergoing neoadjuvant therapy followed by oesophagectomy, respectively. Key predictors included demographic characteristics, pretreatment clinical stage, surgical approach, pathological information and postoperative treatments. The predictive models for pCR and non-pCR patients demonstrated good calibration and clinical utility, with acceptable discrimination that surpassed that of the current tumour, node, metastases staging system.
The web-based dynamic nomograms for pCR (https://predict-survival.shinyapps.io/pCR-eso/) and non-pCR patients (
Mpox is a viral infection caused by the monkeypox virus, a member of the Poxviridae family and Orthopoxvirus genus. Other well-known viruses of the Orthopoxvirus genus include the variola virus (smallpox), cowpox virus and vaccinia virus. Although there is a plethora of research regarding the dermatological and influenza-like symptoms of mpox, particularly following the 2022 mpox outbreak, more research is needed on the gastrointestinal (GI) effects.
This systematic review aims to outline the GI manifestations of the monkeypox virus.
The authors conducted this systematic review using guidelines outlined in the Preferred Reporting Items for Systematic Reviews and Meta-Analyses. A search was conducted through the PubMed, EMBASE and MEDLINE databases from January 1958 to June 2023. The authors selected English language papers that discussed the GI symptoms in mpox patients. A manual search was also conducted in the reference sections of these publications for other relevant papers.
33 papers involving 830 patients were selected for this review. The GI manifestations in mpox patients are proctitis, vomiting, diarrhoea, rectal pain, nausea, tenesmus, rectal bleeding and abdominal pain. Although various papers explored transmission routes, one paper established a direct connection between the anal-receptive sex transmission route and the development of a GI complication (proctitis). Another study reported that the mode of transmission could potentially affect the occurrence of GI symptoms and the severity of the disease. The reviewed papers did not identify a relationship between the severity of dermatological and influenza-like symptoms and the GI manifestations mentioned.
This systematic review confirms that GI manifestations are observed in mpox patients. GI symptoms of mpox are crucial for gastroenterologists and other healthcare professionals to recognise in order to address patient discomfort and further understand the pathophysiology of the virus.
The healthcare burden of alcohol-related liver disease (ARLD) is increasing. ARLD and alcohol use disorder (AUD) are best managed by reduction or cessation of alcohol use, but effective treatments are lacking. We tested whether people with ARLD and AUD admitted to hospital could be recruited to and retained in a trial of Functional Imagery Training (FIT), a psychological therapy that uses mental imagery to reduce alcohol craving. We conducted a multicentre randomised pilot trial of treatment as usual (TAU) versus FIT+TAU in people admitted to hospital with ARLD and AUD.
Participants were randomised to TAU (a single session of brief intervention) or FIT+TAU (TAU with one hospital-based FIT session then eight telephone sessions over 6 months). Pilot outcomes included recruitment rate and retention at day 180. Secondary outcomes included fidelity of FIT delivery, alcohol use, and severity of alcohol dependence.
Fifty-four participants (mean age 49; 63% male) were recruited and randomised, 28 to TAU and 26 to FIT+TAU. The retention rate at day 180 was 43%. FIT was delivered adequately by most alcohol nurses. 50% of intervention participants completed FIT sessions 1 and 2. There were no differences in alcohol use or severity of alcohol dependence between treatment groups at day 180.
Participants with ARLD and AUD could be recruited to a trial of TAU versus FIT+TAU. However, retention at day 180 was suboptimal. Before a definitive trial of FIT in this patient group is conducted, modifications to the intervention and to the recruitment/retention strategy must be tested.
Low anterior resection syndrome (LARS) is one of the most common functional impairments after rectal cancer surgery with a high impact on quality of life. The Pre-Operative LARS score (POLARS) nomogram and its online tool has been developed to predict the degree of postoperative LARS. The aim of this study was to analyse how accurately the POLARS score could predict LARS scores when compared with actual patient-reported LARS (PR-LARS) scores in a population-based Swedish cohort.
This retrospective cohort study included patients who underwent curative rectal cancer surgery between 2007 and 2013 in Stockholm County and were identified using the Swedish Colorectal Cancer Registry (SCRCR). Information regarding preoperative risk factors, patient and treatment characteristics, and presence of LARS postoperatively were collected from patient charts, SCRCR and patient questionnaires. The POLARS model formula was used to predict LARS scores, which then were compared with the actual PR-LARS scores. Individual LARS score differences between the two estimates were shown with a modified Bland-Altman plot of difference.
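The Bland-Altman comparison described here summarises the bias (mean difference) between predicted and patient-reported scores and the 95% limits of agreement within which most individual differences fall. A minimal sketch of the underlying statistics with hypothetical score pairs (the plot itself is omitted):

```python
import math

def bland_altman(pred, actual):
    """Mean difference (bias) and 95% limits of agreement between two measurements."""
    diffs = [p - a for p, a in zip(pred, actual)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))  # sample SD
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical POLARS-predicted vs patient-reported LARS scores (0-42 scale)
pred   = [18, 25, 30, 12, 22, 28, 15, 20]
actual = [24, 31, 29, 10, 30, 35, 14, 27]
bias, (lo, hi) = bland_altman(pred, actual)
print(f"bias {bias:.1f}, limits of agreement ({lo:.1f}, {hi:.1f})")
```

A negative bias, as in this made-up example, would indicate the model tends to underestimate patient-reported severity.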
The cohort included 477 patients, of whom 359 (75%) were categorised as having no/minor LARS based on the POLARS score. The POLARS score correctly identified 80/255 (31%) patients in the major LARS group and 184/222 (83%) in the no/minor LARS group. The sensitivity for major LARS was 31% and the positive predictive value was 68%.
The POLARS score has a low sensitivity for major LARS in this Swedish cohort. Other methods to predict the risk of LARS need to be developed.
The management of upper gastrointestinal bleeding (UGIB) has seen rapid advancements with revolutionising innovations. However, insufficient data exist on the necessary number of emergency endoscopies needed to achieve competency in haemostatic interventions.
We retrospectively analysed all oesophagogastroduodenoscopies with signs of recent haemorrhage performed between 2015 and 2022 at our university hospital. A learning curve was created by plotting the number of previously performed oesophagogastroduodenoscopies with signs of recent haemorrhage against the treatment failure rate, defined as failed haemostasis, rebleeding and necessary surgical or radiological intervention.
The study population included 787 cases with a median age of 66 years. Active bleeding was detected in 576 cases (73.2%). Treatment failure occurred in 225 (28.6%) cases. The learning curve showed a marked decline in treatment failure rates after the respective endoscopist had performed nine oesophagogastroduodenoscopies, followed by a first plateau between 20 and 50 procedures. A second decline was observed after 51 emergency procedures, followed by a second plateau. Endoscopists with experience of <10 emergency procedures had higher treatment failure rates than endoscopists with >51 emergency oesophagogastroduodenoscopies performed (p=0.039) or consultants (p=0.041).
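The learning-curve analysis described here amounts to grouping procedures by the operator's prior emergency-endoscopy experience and computing a treatment failure rate per experience band. A minimal sketch with hypothetical cases, using bands suggested by the results (all data illustrative):

```python
def failure_rate_by_experience(cases, bands=((0, 9), (10, 19), (20, 50), (51, None))):
    """Group procedures into experience bands (by the endoscopist's number of prior
    emergency OGDs) and compute the treatment failure rate within each band."""
    rates = {}
    for lo, hi in bands:
        sel = [fail for n_prior, fail in cases
               if n_prior >= lo and (hi is None or n_prior <= hi)]
        rates[(lo, hi)] = sum(sel) / len(sel) if sel else None
    return rates

# Hypothetical (prior_procedures, treatment_failure) pairs
cases = [(2, True), (5, True), (8, False), (12, False), (15, True),
         (25, False), (40, False), (60, False), (75, False), (90, True)]
print(failure_rate_by_experience(cases))
```

Plotting these per-band rates against experience yields the stepped decline-and-plateau shape the study reports.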
Our data suggest that a minimum of 20 oesophagogastroduodenoscopies with signs of recent haemorrhage is necessary before endoscopists should be considered proficient to perform emergency procedures independently. Endoscopists might be considered advanced-qualified experts in managing UGIB after a minimum of 50 haemostatic procedures. Implementing recommendations on minimum numbers of emergency endoscopies in education programmes for endoscopy trainees could improve their confidence and competency in managing acute UGIB.
Cholestatic pruritus in primary biliary cholangitis (PBC) reduces patients’ health-related quality of life (HRQoL). Despite this, existing research suggests that pruritus is under-recorded in patients’ health records. This study assessed the extent to which pruritus was recorded in medical records of patients with PBC as compared with patient-reported pruritus, and whether patients reporting mild itch were less likely to have pruritus recorded. We also evaluated clinico-demographic characteristics and HRQoL of patients with medical record-documented and patient-reported pruritus.
This cross-sectional study used clinical information abstracted from medical records, together with patient-reported (PBC-40) data from patients with PBC in the USA enrolled in the PicnicHealth cohort. Medical record-documented pruritus was classified as ‘recent’ (at, or within 12 months prior to, enrolment) or ‘ever’ (at, or any point prior to, enrolment). Patient-reported pruritus (4-week recall) was assessed using the first PBC-40 questionnaire completed on/after enrolment; pruritus severity was classified by itch domain score (any severity: ≥1; clinically significant itch: ≥7). Patient clinico-demographic characteristics and PBC-40 domain scores were described in patients with medical record-documented and patient-reported pruritus; overlap between groups was evaluated. Descriptive statistics were reported.
Pruritus of any severity was self-reported by 200/225 (88.9%) patients enrolled; however, only 88/225 (39.1%) had recent medical record-documented pruritus. Clinically significant pruritus was self-reported by 120/225 (53.3%) patients; of these, 64/120 (53.3%) had recent medical record-documented pruritus. Patients reporting clinically significant pruritus appeared to have higher mean scores across PBC-40 domains (indicating reduced HRQoL), versus patients with no/mild patient-reported pruritus or medical-record documented pruritus.
Pruritus in PBC is under-recorded in medical records compared with patient-reported measures and is associated with lower HRQoL. Research based only on medical records therefore underestimates the true burden of pruritus, meaning physicians may be unaware of its extent and impact, leading to potential undertreatment.
Several characteristics are known to affect the risk of Barrett’s oesophagus (BO) in the general population, with symptomatic gastro-oesophageal reflux disease (GORD) being a critical risk factor. In this study, we examined factors that influence BO development in people living with GORD.
People living with GORD were recruited from an endoscopy unit with lifestyle, medical and prescribing history collected. Logistic regression analysis was undertaken to assess the effects of multiple parameters on the likelihood of developing BO.
1197 participants were recruited. Most were Caucasian (n=1188, 99%), had no formal educational qualifications (n=714; 59.6%) and lived with overweight (mean body mass index >25 kg/m2). Many lived in areas of least socioeconomic resource (n=568; 47.4%). 139 (11.6%) had BO at baseline. In adjusted baseline analysis (n=1197), male sex (adjusted OR, aOR 2.04 (95% CI 1.92 to 4.12), p≤0.001), increasing age (aOR 1.03 (95% CI 1.01 to 1.04), p≤0.0001) and proton pump inhibitor use (aOR 3.03 (95% CI 1.80 to 5.13), p≤0.0001) were associated with higher odds of BO. At follow-up (n=363), 22 (6.1%) participants developed BO; male sex (aOR 3.18 (95% CI 1.28 to 7.86), p=0.012), pack-years cigarettes smoked (aOR 1.04 (95% CI 1.00 to 1.08), p=0.046) and increased alcohol intake (aOR 1.02 (95% CI 1.00 to 1.04), p=0.013), were associated with increased odds of BO.
Male sex, pack-years cigarettes smoked, and increasing alcohol intake, were independently associated with increased odds of developing BO over 20-year follow-up. These results align with research linking male sex and smoking with BO and extend this by implicating the potential role of alcohol in developing BO, which may require communication through public health messaging.
Transjugular intrahepatic portosystemic shunt (TIPS) is a minimally invasive therapeutic option to treat the sequelae of portal hypertension. It is unclear whether current international recommendations are reflected in current clinical practice across Australia and the extent of variations in care. This study aimed to address this gap in knowledge and benchmark the current landscape of TIPS services in Australia against international guidelines.
We designed a 42-item questionnaire according to practice-based recommendations and standards of international guidelines to investigate the current landscape of TIPS services across four key domains: (1) service provision, (2) patient selection and indications, (3) best procedural practice, and (4) postoperative care.
Gastroenterologists/hepatologists from 23 major liver centres (67.6%) across Australia currently performing TIPS completed the questionnaire. Between 2017 and 2020, there were 456 elective TIPS insertions. Units offering a TIPS service had a low median number of TIPS insertions (n=7 per annum). More than half of respondents (56.5%) did not have institutional clinical practice protocols. There was marked variation in practices across institutions in terms of TIPS indications and patient selection. Despite these variations, the success rate of elective TIPS was high at 91.7% (79–100%), with 86.6% (29–100%) for rescue TIPS. There was significant variation in postoperative follow-up and care.
Current TIPS practice in Australia varies significantly across institutions. There is a need for national consensus clinical practice guidelines to improve access and minimise unwarranted variation. A national registry for TIPS could measure, monitor and report on the quality of clinical care and patient outcomes.
The efficacy of transjugular intrahepatic portosystemic shunt (TIPS) plus extrahepatic collateral embolisation (TIPS+E) in reducing rebleeding and hepatic encephalopathy (HE) post-TIPS was recently reported in a meta-analysis, but further validation is essential. This study aims to confirm the effectiveness of TIPS+E using real-world data.
The multicentre retrospective cohort included 2077 patients with cirrhosis who underwent TIPS±E (TIPS: 631, TIPS+E: 1446) between January 2010 and December 2022. Regression and propensity score matching (PSM) were used to adjust for baseline characteristic differences. After PSM, clinical outcomes, including rebleeding, HE, survival and further decompensation (FDC), were analysed. Baseline data from all patients contributed to the construction of prognostic models.
After PSM, 1136 matched patients (TIPS+E:TIPS = 568:568) were included. TIPS+E demonstrated a significant reduction in rebleeding (HR 0.77; 95% CI 0.59 to 0.99; p=0.04), HE (HR 0.82; 95% CI 0.68 to 0.99; p=0.04) and FDC (HR 0.85; 95% CI 0.73 to 0.99; p=0.04) compared with TIPS. Notably, TIPS+E also reduced rebleeding, HE and FDC in the subgroup treated with 8 mm diameter stents and embolisation of gastric varices plus spontaneous portosystemic shunts (GV+SPSS). However, there were no differences in overall or subgroup survival analyses. Additionally, the random forest models showed higher accuracy and AUROC compared with the other models. Controlling the post-TIPS portal pressure gradient (pPPG) within 7 mm Hg<pPPG<8.5 mm Hg improved prognosis, especially in the TIPS+E group.
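Propensity score matching as used here pairs each TIPS+E patient with the most similar TIPS-only patient on estimated treatment propensity. A minimal sketch of greedy 1:1 nearest-neighbour matching within a caliper, assuming the propensity scores have already been estimated (all IDs and scores are hypothetical):

```python
def greedy_match(treated, control, caliper=0.05):
    """Greedy 1:1 nearest-neighbour matching on pre-estimated propensity scores.
    treated/control: dicts mapping patient id -> propensity score.
    Returns matched (treated_id, control_id) pairs; each control is used once."""
    pairs = []
    available = dict(control)
    for t_id, t_ps in sorted(treated.items(), key=lambda kv: kv[1]):
        if not available:
            break
        c_id = min(available, key=lambda c: abs(available[c] - t_ps))
        if abs(available[c_id] - t_ps) <= caliper:   # only accept close matches
            pairs.append((t_id, c_id))
            del available[c_id]
    return pairs

# Hypothetical propensity scores for three TIPS+E and four TIPS-only patients
pairs = greedy_match({'T1': 0.30, 'T2': 0.55, 'T3': 0.80},
                     {'C1': 0.32, 'C2': 0.52, 'C3': 0.90, 'C4': 0.78})
print(pairs)  # → [('T1', 'C1'), ('T2', 'C2'), ('T3', 'C4')]
```

Outcomes (rebleeding, HE, FDC, survival) are then compared within the matched cohort, which is why the analysis proceeds on 568:568 pairs rather than the full 2077 patients.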
Our real-world data validation confirms the high efficacy of TIPS+E in reducing rebleeding and HE, particularly when using 8 mm diameter stents, embolising GV+SPSS and maintaining an optimal pPPG.
In liver cirrhosis, acute variceal bleeding (AVB) is associated with a 1-year mortality rate of up to 40%. Data on early or pre-emptive transjugular intrahepatic portosystemic stent–shunt (TIPSS) in AVB is inconclusive and may not reflect current management strategies. Randomised controlled trial of EArly transjugular intrahepatiC porTosystemic stent–shunt in AVB (REACT-AVB) aims to investigate the clinical and cost-effectiveness of early TIPSS in patients with cirrhosis and AVB after initial bleeding control.
REACT-AVB is a multicentre, randomised controlled, open-label, superiority, two-arm, parallel-group trial with an internal pilot. The two interventions allocated randomly 1:1 are early TIPSS within 4 days of diagnostic endoscopy or secondary prophylaxis with endoscopic therapy in combination with non-selective beta blockers. Patients aged ≥18 years with cirrhosis and Child-Pugh Score 7–13 presenting with AVB with endoscopic haemostasis are eligible for inclusion. The primary outcome is transplant-free survival at 1 year post randomisation. Secondary endpoints include transplant-free survival at 6 weeks, rebleeding, serious adverse events, other complications of cirrhosis, Child-Pugh and Model For End-Stage Liver Disease (MELD) scores at 6 and 12 months, health-related quality of life, use of healthcare resources, cost-effectiveness and use of cross-over therapies. The sample size is 294 patients over a 4-year recruitment period, across 30 hospitals in the UK.
Research ethics committee of National Health Service has approved REACT-AVB (reference number: 23/WM/0085). The results will be submitted for publication in a peer-reviewed journal. A lay summary will also be emailed or posted to participants before publication.
ISRCTN85274829; protocol version 3.0, 1 July 2023.
Randomised controlled trials (RCTs) of key therapies in inflammatory bowel disease (IBD) are often presented and available as abstracts for significant periods of time prior to full publication, often being employed to make strategic and clinical prescribing decisions. We compared the concordance of prepublication abstract-only reports and their respective full-text manuscripts.
Pairs of full-text manuscripts and their respective prepublication abstract-only reports for the same RCT outcomes, at the same time point of analysis were included. The RCTs were on treatments for IBD with full-text manuscripts published between 2010 and 2023.
We found 77 pairs of full-text manuscripts and their prepublication abstract-only reports. There were significant mismatches in the reporting of stated planned outcomes (65/77 matched, p<0.001); primary outcomes reported in the results sections (67/77, p<0.001); trial registrations (34/65, p<0.001); the number of randomised participants (49/77, p=0.18); participants reaching the end of the study (21/71, p<0.001); and primary outcome data (40/73, p<0.001). Authors' conclusions matched (75/77, p=0.157). Authors did not provide explicit or implied justification for the absence of, or non-concordance in, any of the above items.
Abstract-only reports have consistent issues with both limited reporting of key information and significant differences in data when compared with their later full-text publications. These are not related to further recruitment of patients or word count limitations and are never explained. As abstracts are often used in guidelines, reviews and stakeholder decision-making on prescribing, caution in their use is strongly suggested. Further work is needed to enhance minimum reporting standards in abstract-only works and ensure consistency with final published papers.
Bleeding from parastomal varices causes significant morbidity and mortality. Treatment options are limited, particularly in high-risk patients with significant underlying liver disease and other comorbidities. The use of EUS-guided embolisation coils combined with thrombin injection in gastric varices has been shown to be safe and effective. Our institution has applied the same technique to the treatment of parastomal varices.
A retrospective review was performed of 37 procedures on 24 patients to assess efficacy and safety of EUS-guided injection of thrombin, with or without embolisation coils for treatment of bleeding parastomal varices. All patients had been discussed in a multidisciplinary team meeting, and correction of portal hypertension was deemed to be contraindicated. Rebleeding was defined as stomal bleeding that required hospital admission or transfusion.
All patients had significant parastomal bleeding at the time of referral. A 100% technical success rate was achieved. 70.8% of patients had no further significant bleeding during the follow-up period (median 26.2 months) following one procedure. 1-year rebleed-free survival was 80.8% following the first procedure. 7 patients (29.1%) had repeat procedures. There was no significant difference in rebleed-free survival following repeat procedures. Higher age was associated with a higher risk of rebleeding. No major procedure-related complications were identified.
EUS-guided thrombin injection, with or without embolisation coils, is a safe and effective technique for the treatment of bleeding parastomal varices, particularly for patients for whom correction of portal venous hypertension is contraindicated.
Patients experiencing unexplained chronic throat symptoms (UCTS) are frequently referred to gastroenterology and otolaryngology outpatient departments for investigation. Often, despite extensive investigations, no identifiable structural abnormality is found to account for the symptoms. The objective of this article is to provide a concise appraisal of the evidence base for current approaches to the assessment and management of UCTS, their clinical outcomes and related healthcare utilisation.
This multidisciplinary review critically examines the current understanding of aetiological theories and pathophysiological drivers in UCTS and summarises the evidence base underpinning various diagnostic and management approaches.
The evidence gathered in the review suggests that single-specialty approaches to UCTS inadequately capture the substantial heterogeneity and pervasive overlaps among clinical features and biopsychosocial factors, and that a more unified approach is needed.
Drawing on contemporary insights from the gastrointestinal literature for disorders of gut–brain interaction, this article proposes a refreshed interdisciplinary approach characterised by a positive diagnosis framework and patient-centred therapeutic model. The overarching aim of this approach is to improve patient outcomes and foster collaborative research efforts.
Immunoglobulin G4-related disease (IgG4-RD) is a rare immune mediated fibroinflammatory condition. Pancreaticobiliary (PB) and head and neck (HN) are two of the most commonly involved anatomical sites. It has been postulated that PB IgG4-RD and HN IgG4-RD have distinct clinical phenotypes. Whether the optimum treatment regimen or response to therapy differs between them is unknown. We aimed to assess differences between PB and HN IgG4-RD in a cohort of IgG4 disease managed by an IgG4-RD multispecialty team.
We performed a retrospective study of a prospectively maintained multidisciplinary IgG4-RD database to identify patients diagnosed with PB and HN IgG4-RD (based on initial presentation) between 2005 and 2019. The electronic patient records were reviewed. Use of immunosuppressive agents and clinical course was analysed.
60 patients with PB IgG4-RD and 14 with HN IgG4-RD were included in the study. PB IgG4-RD was associated with older age at diagnosis (64 vs 51 years, p<0.001), a higher serum IgG4 level as a multiple of the upper limit of normal (median (IQR) 2 (1–3.75) vs 1 (1–2), p=0.04) and a greater proportion with more than one organ involved (68% vs 33%, p=0.03). Patients with HN IgG4-RD were more likely to receive second-line therapy (71% vs 36%, p=0.03). Persistent elevation of serum IgG4 after therapy was more common in PB IgG4-RD (84% vs 43%, p=0.03).
These findings support the contention that PB IgG4-RD and HN IgG4-RD have different clinical profiles and represent distinct subtypes of IgG4-RD.
Despite blood transfusion saving millions of lives, transfusion-transmissible infections (TTIs) still threaten the lives of patients requiring transfusion. Hence, screening blood donors and studying the prevalence of TTIs among them can reveal the burden of these diseases in our population. The aim of this study was to assess the seroprevalence of TTIs among blood donors in Basra, Iraq from 2019 to 2021 as groundwork for providing safe blood transfusion in Iraq.
A cross-sectional study was carried out in the blood banks in Basra, Iraq from 1 January 2019 to 31 December 2021. A total of 197 898 samples were collected and screened for hepatitis B surface antigen (HBsAg), anti-hepatitis B core (HBc), anti-hepatitis C virus (HCV) and syphilis immunologically.
The seroprevalence rates of viral hepatitis for 2019, 2020 and 2021 were as follows: hepatitis B virus (HBV) rates of 1.54%, 1.45% and 1.14%, a significant declining trend of 26%; anti-HCV rates of 0.14%, 0.12% and 0.11%, a significant declining trend of 21.4%; and syphilis rates of 0.38%, 0.47% and 0.36%, a marked declining trend of 5.3%.
Of the donors screened, 2503 (1.26%) had positive anti-HBc results, while only 173 (0.0874%) showed positive test results for both anti-HBc and HBsAg.
Prevalence rates of viral hepatitis and syphilis showed a steady decline between 2019 and 2021, and these rates were much lower in Basra than in other parts of Iraq and in neighbouring countries. This study indicates the importance of including the anti-HBc test in the screening of blood donors. These findings should contribute to improving the understanding of TTI epidemiology and support health authorities in controlling bloodborne diseases.
Hepatocellular carcinoma (HCC) incidence in the UK trebled between 1997 and 2017. With increasing numbers requiring treatment, understanding the likely impact on healthcare budgets can inform service planning and commissioning. The aim of this analysis was to use existing registry data to describe the direct healthcare costs of current treatments for HCC and estimate the impact on National Health Service (NHS) budgets.
A retrospective data analysis based on the National Cancer Registration and Analysis Service cancer registry informed a decision-analytic model for England comparing patients by cirrhosis compensation status and those on palliative or curative treatment pathways. Potential cost drivers were investigated by undertaking a series of one-way sensitivity analyses.
Between 1 January 2010 and 31 December 2016, 15 684 patients were diagnosed with HCC. The median cost per patient over 2 years was £9065 (IQR: £1965 to £20 491); 66% did not receive active therapy. The cost of HCC treatment for England over 5 years was estimated to be £245 million.
The National Cancer Registration Dataset and linked data sets have enabled a comprehensive analysis of the resource use and costs of secondary and tertiary healthcare for HCC, providing an overview of the economic impact to the NHS England of treating HCC.
To infer potential mechanisms driving disease subtypes among patients with inflammatory bowel disease (IBD), we profiled the transcriptome of purified circulating monocytes and CD4 T-cells.
RNA extracted from purified monocytes and CD4 T-cells derived from the peripheral blood of 125 endoscopically active patients with IBD was sequenced using Illumina HiSeq 4000 NGS. We used complementary supervised and unsupervised analytical methods to infer gene expression signatures associated with demographic/clinical features. Expression differences and specificity were validated by comparison with publicly available single-cell datasets, tissue-specific expression and meta-analyses. Drug target information, druggability and adverse reaction records were used to prioritise disease subtype-specific therapeutic targets.
Unsupervised/supervised methods identified significant differences in the expression profiles of CD4 T-cells between patients with ileal Crohn’s disease (CD) and ulcerative colitis (UC). Following a pathway-based classification (area under the receiver operating characteristic curve (AUROC)=86%) between patients with ileal CD and UC, we identified the MAPK and FOXO pathways as downregulated in UC. Coexpression module/regulatory network analysis using systems-biology approaches revealed mediatory core transcription factors. We independently confirmed that a subset of the disease location-associated signature is characterised by T-cell-specific and location-specific expression. Integration of drug-target information resulted in the discovery of several new (BCL6, GPR183, TNFAIP3) and repurposable (TUBB2A, PRKCQ) drug targets for ileal CD, as well as novel targets (NAPEPLD, SLC35A1) for UC.
Transcriptomic profiling of circulating CD4 T-cells in patients with IBD demonstrated marked molecular differences between the IBD-spectrum extremities (UC and predominantly ileal CD, sandwiching colonic CD), which could help in prioritising particular drug targets for IBD subtypes.
Acute upper gastrointestinal bleeding (AUGIB) is a common medical emergency, which takes up considerable healthcare resources. However, only approximately 20%–30% of bleeds require urgent haemostatic intervention. Current standard of care is for all patients admitted to hospital to undergo endoscopy within 24 hours for risk stratification, but this is difficult to achieve in practice, invasive and costly.
To develop a novel non-endoscopic risk stratification tool for AUGIB to predict the need for haemostatic intervention by endoscopic, radiological or surgical treatments. We compared this with the Glasgow-Blatchford Score (GBS).
Model development was carried out using a derivation (n=466) and prospectively collected validation cohort (n=404) of patients who were admitted with AUGIB to three large hospitals in London, UK (2015–2020). Univariable and multivariable logistic regression analysis was used to identify variables that were associated with increased or decreased chances of requiring haemostatic intervention. This model was converted into a risk scoring system, the London Haemostat Score (LHS).
The LHS was more accurate at predicting need for haemostatic intervention than the GBS, in the derivation cohort (area under the receiver operating curve (AUROC) 0.82; 95% CI 0.78 to 0.86 vs 0.72; 95% CI 0.67 to 0.77; p<0.001) and validation cohort (AUROC 0.80; 95% CI 0.75 to 0.85 vs 0.72; 95% CI 0.67 to 0.78; p<0.001). At cut-off scores at which LHS and GBS identified patients who required haemostatic intervention with 98% sensitivity, the specificity of the LHS was 41% vs 18% with the GBS (p<0.001). This could translate to 32% of inpatient endoscopies for AUGIB being avoided at a cost of only a 0.5% false negative rate.
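The trade-off described above can be made concrete: fix a target sensitivity, take the highest score cut-off that still meets it, and read off the resulting specificity. The sketch below is a generic, stdlib-only illustration with made-up scores; it is not the LHS itself, and the function name and data are hypothetical.

```python
def specificity_at_sensitivity(scores, labels, target_sensitivity):
    """Among cut-offs keeping sensitivity >= target, pick the highest
    usable cut-off (which also maximises specificity).

    scores : risk score per patient (higher = more likely to need intervention)
    labels : 1 if the patient required haemostatic intervention, else 0
    A patient is flagged positive when score >= cut-off.
    """
    positives = [s for s, y in zip(scores, labels) if y == 1]
    negatives = [s for s, y in zip(scores, labels) if y == 0]
    n_pos, n_neg = len(positives), len(negatives)
    for cutoff in sorted(set(scores), reverse=True):
        tp = sum(1 for s in positives if s >= cutoff)
        if tp / n_pos >= target_sensitivity:
            fp = sum(1 for s in negatives if s >= cutoff)
            # First (highest) cut-off meeting the target: lowering it
            # further can only add false positives.
            return cutoff, tp / n_pos, 1 - fp / n_neg
    return None

# Toy data: scores for 4 patients needing intervention and 6 who did not
cutoff, sens, spec = specificity_at_sensitivity(
    [9, 8, 7, 6, 5, 5, 4, 3, 2, 1],
    [1, 1, 1, 1, 0, 0, 0, 0, 0, 0],
    target_sensitivity=1.0,
)
```

In this toy example every patient needing intervention scores above every patient who does not, so a cut-off of 6 achieves 100% sensitivity and 100% specificity; real data, as in the abstract, forces a compromise.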
The LHS is accurate at predicting the need for haemostatic intervention in AUGIB and could be used to identify a proportion of low-risk patients who can undergo delayed or outpatient endoscopy. Validation in other geographical settings is required before routine clinical use.
Mathematical models have gained traction for estimating cases of foodborne illness. Model structures vary owing to differences in data availability. This raises the question of whether differences in foodborne illness rates internationally are real or due to differences in modelling approaches.
Difficulties in comparing illness rates have come into focus with COVID-19 infection rates being contrasted between countries. Furthermore, with post-EU Exit trade talks ongoing, being able to understand and compare foodborne illness rates internationally is a vital part of risk assessments related to trade in food commodities.
We compared foodborne illness estimates for the United Kingdom (UK) with those from Australia, Canada and the USA. We then undertook sensitivity analysis, by recreating the mathematical models used in each country, to understand the impact of some of the key differences in approach and to enable more like-for-like comparisons.
Published estimates of overall foodborne illness rates in the UK were lower than the other countries. However, when UK estimates were adjusted to a more like-for-like approach to the other countries, differences were smaller and often had overlapping credible intervals. When comparing rates by specific pathogens, there were fewer differences between countries. The few large differences found, such as virus rates in Canada, could at least partly be traced to methodological differences.
Foodborne illness estimation models are country specific, making international comparisons problematic. Some of the disparities in estimated rates between countries can be attributed to differences in methodology rather than real differences in risk.
Investigation of gastro-oesophageal reflux disease is usually performed off proton pump inhibitors (PPIs). This can exacerbate symptoms, potentially impacting investigation accuracy if patients circumvent the preinvestigation instructions. There are no standard recommendations on how to manage PPI withdrawal. We aimed to assess the impact of structured alginate use on symptom burden.
Participants were already established on ≥4 weeks of PPI therapy and being referred for manometry and 24-hour pH/impedance testing. Preinvestigation instructions involved stopping PPIs and H2 receptor antagonists for 1 week, but antacids and alginates were allowed until the night before. Participants were randomised to follow these standard instructions (control group), or the same instructions with the provision of Gaviscon Advance to be taken four times daily (treatment group). The primary outcome assessed change in Gastro-Oesophageal Reflux Disease Health-Related Quality of Life Score.
Data for 48 patients were available for primary outcome assessment. While patients in the control group had a significant increase in symptoms (median difference 6.5, 95% CI 1 to 7, p=0.04), no change occurred in the treatment arm (median difference −1.5, 95% CI −2 to 3.5, p=0.54). There were no serious adverse events.
Structured alginate use prevents symptom exacerbation during preinvestigation PPI wash-out. These findings are limited to the 1-week wash-out period but can benefit thousands of patients undergoing investigation for gastro-oesophageal reflux each year. Further research is required to assess this effect in other settings, such as sustained PPI deprescription. The trial was funded by Reckitt Benckiser.
EudraCT registration 2019-004561-41
The incidence of acute pancreatitis (AP) is increasing in the UK. Patients with severe AP require a significant amount of resources to support them during their admission. The ability to predict which patients will develop multiorgan dysfunction remains poor, leading to delays in identifying these patients, and a window of opportunity for early intervention is missed. Social deprivation has been linked with increased mortality across surgical specialties. Its role in predicting mortality in patients with AP remains unclear, but clarifying it would allow high-risk patients to be identified early and resources to be focused on high-risk populations.
A prospectively collected single-centre database was analysed. The English Index of Multiple Deprivation (IMD) was calculated based on postcode. Patients were grouped according to their English IMD quintile. Outcomes measured included all-cause mortality, intensive care unit (ITU) admission, overall length of stay (LOS) and local pancreatitis-specific complications.
398 patients with AP between 2018 and 2021 were identified. There were significantly more patients with AP in Q1 (IMD 1–2) compared with Q5 (IMD 9–10) (156 vs 38, p<0.001). Patients resident in the most deprived areas were significantly younger (52.4 years in Q1 vs 65.2 in Q5, p<0.001) and more often smokers (39.1% in Q1 vs 23.7% in Q5, p=0.044) with IHD (95.0% vs 92.1% in Q5, p<0.001). In multivariate modelling, there was no significant difference in pancreatitis-related complications, number of ITU visits, number of organs supported and overall LOS by IMD quintile.
Although there was a significantly higher number of patients admitted to our unit with AP from the most socially deprived quintiles, there was no correlation between socioeconomic deprivation and mortality following AP.
To estimate the risk of non-Hodgkin’s lymphoma (NHL) and Hodgkin’s lymphoma (HL) in patients with inflammatory bowel disease (IBD).
We undertook a two-country population cohort study with all patients diagnosed with IBD in Norway and Sweden from 1987 and 1993 through 2015 and 2016, respectively, and analysed the risk of NHL and HL. In Sweden, we also analysed prescriptions of thiopurines and anti-tumour necrosis factor (TNF)-α therapy from 2005. We calculated standardised incidence ratios (SIRs) with 95% CIs using the general populations as reference.
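The standardised incidence ratio described above is the number of observed cases divided by the number expected from general-population rates, with the observed count treated as Poisson-distributed. A minimal sketch using Byar's approximation to the exact Poisson limits (the function name and inputs are illustrative, not taken from the study):

```python
import math

def sir_with_ci(observed: int, expected: float, z: float = 1.96):
    """Standardised incidence ratio with an approximate 95% CI.

    Uses Byar's approximation to the exact Poisson confidence
    limits for the observed count, then divides by the expected
    count derived from general-population rates.
    """
    sir = observed / expected
    # Byar's approximation: the lower limit uses the observed count,
    # the upper limit uses observed + 1.
    low_count = observed * (1 - 1 / (9 * observed) - z / (3 * math.sqrt(observed))) ** 3
    o1 = observed + 1
    high_count = o1 * (1 - 1 / (9 * o1) + z / (3 * math.sqrt(o1))) ** 3
    return sir, low_count / expected, high_count / expected

# Illustrative numbers only: 369 observed lymphomas vs ~284 expected
sir, lo, hi = sir_with_ci(369, 284.0)
```

With large observed counts, as here, Byar's approximation is very close to the exact Poisson interval.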
Among 131 492 patients with IBD with a median follow-up of 9.6 years, we identified 369 cases of NHL and 44 cases of HL. The SIR of NHL was 1.3 (95% CI 1.1 to 1.5) in ulcerative colitis and 1.4 (95% CI 1.2 to 1.7) in Crohn’s disease. We found no compelling heterogeneity in analyses stratified by patient characteristics. We found a similar pattern and magnitude of excess risks for HL. At 10 years, cumulative incidence was 0.26% (95% CI 0.23% to 0.30%) and 0.06% (95% CI 0.04% to 0.08%) for NHL and HL, respectively. Higher excess risks were found among patients with NHL with concomitant primary sclerosing cholangitis (SIR 3.4; 95% CI 2.1 to 5.2) and in those prescribed thiopurines alone (SIR 2.8; 95% CI 1.4 to 5.7) or with anti-TNF-α agents (SIR 5.7; 95% CI 2.7 to 11.9).
Patients with IBD have a statistically significant increased risk of malignant lymphomas compared with the general population, but the absolute risk remains low.
While linked to obesity and associated with increased cardiovascular morbidity, non-alcoholic fatty liver disease (NAFLD) is an often-asymptomatic cause of chronic liver disease in children. Early detection provides an opportunity for interventions to curb progression. Childhood obesity is on the rise in low/middle-income countries, but cause-specific mortality data associated with liver disease are scarce. Establishing the prevalence of NAFLD in overweight and obese Kenyan children would guide public health policies aimed at early screening and intervention.
To investigate prevalence of NAFLD in overweight and obese children aged 6–18 years using liver ultrasonography.
This was a cross-sectional survey. After informed consent was obtained, a questionnaire was administered and blood pressure (BP) was measured. Liver ultrasonography was performed to assess fatty changes. Categorical variables were analysed using frequencies and percentages. The χ² test and a multiple logistic regression model were used to determine the relationship between exposure and outcome variables.
Prevalence of NAFLD was 26.2% (27/103, 95% CI 18.0% to 35.8%). There was no association between sex and NAFLD (OR 1.13, 95% CI 0.4 to 3.2, p=0.82). Obese children were four times more likely to have NAFLD compared with overweight children (OR 4.52, 95% CI 1.4 to 19.0, p=0.02). About 40.8% (n=41) had elevated BP, but there was no association with NAFLD (OR 2.06, 95% CI 0.6 to 7.6, p=0.27). Older children (13–18 years) were more likely to have NAFLD (OR 4.42, 95% CI 1.2 to 17.9, p=0.03).
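Odds ratios like those above come from a 2×2 exposure-outcome table, with confidence limits usually obtained by Woolf's logit method. A minimal sketch (the counts below are made up for illustration, not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table with Woolf's (logit) 95% CI.

        exposed     a (with outcome)  b (without outcome)
        unexposed   c (with outcome)  d (without outcome)
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) under Woolf's method
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper

# Illustrative counts only (hypothetical 2x2 table)
or_, lo, hi = odds_ratio_ci(20, 30, 7, 46)
```

Note how small cell counts (such as c=7 here) inflate the standard error, which is why paediatric studies of this size report wide intervals.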
Prevalence of NAFLD was high in overweight and obese school children in Nairobi. Further studies are needed to identify modifiable risk factors to arrest progression and prevent sequelae.
Although appendiceal cancer remains a rare gastrointestinal malignancy compared with colorectal cancer, incidence rates of appendiceal cancer have increased in the last two decades. Appendiceal and cecal adenocarcinomas have distinct genomic profiles, but chemotherapy protocols for these malignancies are the same and survival outcomes between them have not been compared extensively. To this end, we conducted a comparative survival analysis of appendiceal and cecal adenocarcinomas.
Using the Surveillance, Epidemiology and End Results (SEER) database, we identified individuals ≥30 years of age with appendiceal or cecal adenocarcinoma from 1975 to 2016. Demographic, clinical and county-level socioeconomic data were extracted using SEER*Stat software. Survival was compared by Mantel-Haenszel log-rank test, and survival curves were generated using the Kaplan-Meier method. Relative HRs for death in the 5-year period following diagnosis were calculated using multivariable Cox regression analysis, adjusted for all other covariates. The significance level was set at p<0.05 for two-tailed tests. Data were analysed using SAS V.9.4 and R software.
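The Kaplan-Meier method named above builds the survival curve by multiplying, at each event time, the conditional probability of surviving past that time, with censored subjects leaving the risk set without contributing an event. A stdlib-only sketch on toy data (not SEER records):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit survival estimates.

    times  : follow-up time per subject
    events : 1 if the event (e.g. death) occurred, 0 if censored
    Returns a list of (time, survival probability) at event times.
    """
    # Sort by follow-up time; censored subjects simply drop out of
    # the risk set at their censoring time.
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    survival, curve, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        deaths = at_this_time = 0
        while i < len(data) and data[i][0] == t:
            at_this_time += 1
            deaths += data[i][1]
            i += 1
        if deaths:
            survival *= 1 - deaths / n_at_risk  # conditional survival at t
            curve.append((t, survival))
        n_at_risk -= at_this_time
    return curve

# Toy example: 5 subjects, censoring marked with event=0
curve = kaplan_meier([2, 3, 3, 5, 8], [1, 1, 0, 0, 1])
```

For the toy data the curve steps down at times 2, 3 and 8; the subjects censored at 3 and 5 shrink the risk set without producing a step of their own.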
We identified 6491 patients with appendiceal adenocarcinoma and 99 387 patients with cecal adenocarcinoma. Multivariable Cox regression analysis demonstrated significantly higher cancer-specific and overall survival in appendiceal adenocarcinoma compared with cecal adenocarcinoma. Male sex, older age, earlier year of diagnosis, black race, single marital status, non-Hispanic ethnicity and non-mucinous histology were associated with increased mortality rates. In addition, counties with a lower percentage of individuals below the poverty line and higher colorectal cancer screening rates had better survival.
This is the first study to show greater survival in appendiceal adenocarcinoma compared with cecal adenocarcinoma. We also highlighted novel associations of county-level socioeconomic factors with increased mortality in appendiceal adenocarcinoma. Future efforts to develop targeted molecular therapies and reduce socioeconomic barriers to diagnosis and treatment are warranted to improve survival.
We assessed whether the bicarbonate-rich mineral water Staatl. Fachingen STILL is superior to conventional mineral water in relieving heartburn.
Multicentre, double-blind, randomised, placebo-controlled trial (STOMACH STILL) in adult patients with frequent heartburn episodes for ≥6 months and without moderate/severe reflux oesophagitis. Patients drank 1.5 L/day of verum or placebo, spread over the course of the day, for 6 weeks. Primary endpoint was the percentage of patients with a reduction of ≥5 points in the Reflux Disease Questionnaire (RDQ) score for ‘heartburn’. Secondary endpoints included symptom reduction (RDQ), health-related quality of life (HRQOL, Quality of Life in Reflux and Dyspepsia (QOLRAD)), intake of rescue medication and safety/tolerability.
Of 148 randomised patients (verum: n=73, placebo: n=75), 143 completed the trial. Responder rates were 84.72% in the verum and 63.51% in the placebo group (p=0.0035, number needed to treat=5). Symptoms improved under verum compared with placebo for the dimension ‘heartburn’ (p=0.0003) and the RDQ total score (p=0.0050). HRQOL improvements under verum compared with placebo were reported for 3 of 5 QOLRAD domains, that is, ‘food/drink problems’ (p=0.0125), ‘emotional distress’ (p=0.0147) and ‘vitality’ (p=0.0393). Mean intake of rescue medication decreased from 0.73 tablets/day at baseline to 0.47 tablets/day in week 6 in the verum group, whereas in the placebo group it remained constant during the trial. Only three patients had treatment-related adverse events (verum: n=1, placebo: n=2).
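The number needed to treat reported above follows directly from the responder rates: NNT is the reciprocal of the absolute risk reduction, conventionally rounded up to a whole patient. A one-line check using the abstract's figures:

```python
import math

def number_needed_to_treat(p_treatment: float, p_control: float) -> int:
    """NNT = 1 / absolute risk reduction, rounded up to a whole patient."""
    arr = p_treatment - p_control  # absolute risk reduction
    return math.ceil(1 / arr)

# Responder rates from the abstract: 84.72% verum vs 63.51% placebo
nnt = number_needed_to_treat(0.8472, 0.6351)  # 1/0.2121 = 4.71, rounds up to 5
```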
STOMACH STILL is the first controlled clinical trial demonstrating superiority of a mineral water over placebo in relieving heartburn, accompanied by an improved HRQOL.
EudraCT 2017-001100-30.
The incidence of colorectal cancer (CRC) in people aged <50 years has been increasing dramatically in the past three decades and such patients are known to face difficulties in diagnosis. The objective of this study was to better understand the diagnostic experiences of patients with CRC and explore age-related differences in the proportion with positive experiences.
A secondary analysis of the English National Cancer Patient Experience Survey (CPES) 2017 was conducted on the responses of patients with CRC, restricted to those likely to have been diagnosed in the preceding 12 months via pathways other than routine screening. Ten diagnosis-related experience questions were identified, with responses to them categorised as positive, negative or uninformative. Age group-related differences in positive experiences were described and ORs estimated, both raw and adjusted for selected characteristics. Sensitivity analysis was performed by weighting survey responses to 2017 cancer registrations by strata defined by age group, sex and cancer site, to assess whether differential response patterns by these characteristics affected the estimated proportion of positive experiences.
The reported experiences of 3889 patients with CRC were analysed. There was a significant linear trend (p<0.0001) for 9 of 10 experience items, with older patients consistently displaying higher rates of positive experiences and patients aged 55–64 showing rates of positive experience intermediate between younger and older age groups. This was unaffected by differences in patient characteristics or CPES response rates.
The highest rates of positive diagnosis-related experiences were reported by patients aged 65–74 or 75 years and older, and this finding is robust to differences in patient characteristics and survey response patterns.
In the past 5 years, there have been several advances in the management of inflammatory bowel disease (IBD). We aim to produce a new guideline updating the most recent guideline, published in 2019. We present the prospective operating procedure and technical summary protocol in this manuscript.
The ‘Grading of Recommendations Assessment, Development and Evaluation’ (GRADE) approach, as laid out in the GRADE handbook and supported by the WHO, will be followed in the development of the guideline. The guideline development group is formed from a variety of disciplines across both primary and secondary care; it took part in an online Delphi process and was split into key areas. A final consensus list of thematic questions in a ‘patient, intervention, comparison, outcome’ format was produced and agreed in the final phase of the Delphi process.
There will be a detailed technical evidence review, with source data including systematic reviews appraised with the AMSTAR 2 tool (A MeaSurement Tool to Assess systematic Reviews), randomised controlled trial data judged for risk of bias with the Cochrane tool, and observational studies assessed for safety concerns with the ROBINS-I tool. Based on the available evidence, some of the recommendations will be based on GRADE while others will be best practice statements.
A full Delphi process will be used to make recommendations using online response systems.
This set of procedures has been approved by the Clinical Services and Standards Committee, the British Society of Gastroenterology executive board and aligned with IBD UK standards.
Poor bowel preparation is the leading cause of failed colonoscopies and increases costs significantly. Several split-preparation, 2-day regimens are available and, recently, Plenvu, a low-volume preparation that can be given over 1 day, has been introduced.
To assess the efficacy and tolerability of commonly used purgative regimens, including Plenvu.
In this service evaluation, patients undergoing screening colonoscopy at St Mark’s Hospital, London (February 2020–December 2021) were provided Plenvu (1 or 2 days), Moviprep (2 days) or Senna & Citramag (2 days).
Boston Bowel Preparation Scale (BBPS) score, fluid volumes and procedure times were recorded. A patient experience questionnaire evaluated taste, volume acceptability, completion and side effects.
563 patients were invited to participate and 553 were included: 218 Moviprep 2 days, 108 Senna & Citramag 2 days, 152 Plenvu 2 days and 75 Plenvu 1 day.
BBPS scores were higher with Plenvu 1 and 2 days vs Senna & Citramag (p=0.003 and 0.002, respectively) and vs Moviprep (p=0.003 and 0.001, respectively). No other significant pairwise BBPS differences and no difference in preparation adequacy were seen between the groups.
Patients rated taste as most pleasant with Senna & Citramag and this achieved significance versus Plenvu 1 day and 2 days (p=0.002 and p<0.001, respectively) and versus Moviprep (p=0.04).
BBPS score was higher for 1 day and 2 days Plenvu versus both Senna & Citramag and Moviprep. Taste was not highly rated for Plenvu but it appears to offer effective cleansing even when given as a same day preparation.
The global pandemic has diverted resources away from the management of chronic diseases, including cirrhosis. While there is increasing knowledge of COVID-19 infection in liver cirrhosis, little has been described about the impact of the pandemic on decompensated cirrhosis admissions and outcomes, which was the aim of this study.
A single-centre retrospective study evaluated decompensated cirrhosis admissions to a tertiary London hepatology and transplantation centre from October 2018 to February 2021. Patients were included if they had an admission with cirrhosis decompensation, defined as new-onset jaundice or ascites, infection, encephalopathy, portal hypertensive bleeding or renal dysfunction.
The average number of admissions remained constant between the pre-COVID-19 (October 2018–February 2020) and COVID-19 (March 2020–February 2021) periods. Patients transferred in from secondary centres had consistently higher severity scores during the COVID-19 period (UK Model for End-Stage Liver Disease 58 vs 54, p=0.007; Model for End-Stage Liver Disease-Sodium 22 vs 18, p=0.006; EF-CLIF Acute Decompensation (AD) score 55.0 vs 51.0, p=0.055). Of those admitted to intensive care without acute-on-chronic liver failure, there was a significant increase in AD scores during the COVID-19 period (58 vs 48, p=0.009). In addition, there was a trend towards increased hospital readmission rates during the COVID-19 period (29.5% vs 21.5%, p=0.067). When censored at 30 days, early mortality postdischarge was significantly higher during the COVID-19 period (p<0.001), with a median time to death of 35 days compared with 62 days pre-COVID-19.
This study provides a unique perspective on the impact that the global pandemic had on decompensated cirrhosis admissions. The findings of increased early mortality and readmissions, and higher AD scores on ICU admission, highlight the need to maintain resourcing for high-level hepatology care and follow-up, in spite of other disease pressures.
The incidence of alcohol-associated liver disease (ALD) is increasing, and weight loss surgery is more common due to the obesity epidemic. Roux-en-Y gastric bypass (RYGB) is associated with alcohol use disorder and ALD; however, its impact on outcomes in patients hospitalised for alcohol-associated hepatitis (AH) is unclear.
We performed a single-centre, retrospective study of patients with AH from June 2011 to December 2019. Primary exposure was the presence of RYGB. The primary outcome was inpatient mortality. Secondary outcomes included overall mortality, readmissions and cirrhosis progression.
2634 patients with AH met the inclusion criteria; 153 patients had RYGB. Median age of the entire cohort was 47.3 years; median Model for End-Stage Liver Disease-Sodium (MELD-Na) score was 15.1 in the study group versus 10.9 in the control group. There was no difference in inpatient mortality between the two groups. On logistic regression, increased age, elevated body mass index, MELD-Na >20 and haemodialysis were all associated with higher inpatient mortality. RYGB status was associated with increased 30-day readmission (20.3% vs 11.7%, p<0.01), development of cirrhosis (37.5% vs 20.9%, p<0.01) and overall mortality (31.4% vs 24%, p=0.03).
Patients with RYGB have higher rates of readmissions, cirrhosis and overall mortality after discharge from hospital for AH. Allocation of additional resources on discharge may improve clinical outcomes and reduce healthcare expenditure in this unique patient population.
Liver transplantation is a proven management method for end-stage cirrhosis and is estimated to have increased life expectancy by 15 years. The COVID-19 pandemic posed a challenge to patients who were candidates for a solid-organ transplant. It has been suggested that the outcomes of liver transplants could be adversely affected by the infection, as immunosuppression makes liver transplant candidates more susceptible to adverse effects while predisposing them to higher rates of thrombotic events.
In this retrospective study, patients who received liver transplants from January 2018 to March 2022 were assessed for early postoperative mortality and hepatic artery thrombosis (HAT) in relation to COVID-19 infection. The study included 614 patients, of whom 48 were infected.
This study shows that COVID-19-related early postoperative mortality rates substantially increased in the elective setting (OR: 2.697), whereas the results for acute liver failure were not significant. The average model for end-stage liver disease score increased significantly during the pandemic due to new regulations. Although mortality rates increased during the pandemic, data from the vaccination period show that mortality rates have equalised with those of the prepandemic era. Meanwhile, COVID-19 infection is estimated to have increased HAT by 1.6 times in the elective setting.
This study shows that COVID-19 infection in acute liver failure poses comparatively little risk; hence, transplantation should be considered in such cases. Meanwhile, the hypercoagulable state induced by the infection predisposes this group of patients to higher HAT rates.
Foreign body ingestion (FBI) occurs infrequently but can be associated with rare risks including perforation. There is limited understanding of the impact of adult FBI in Australia. We aim to evaluate patient characteristics, outcomes and hospital costs of FBI.
A retrospective cohort study of patients with FBI was performed at a non-prison referral centre in Melbourne, Australia. International Classification of Disease-10 coding identified patients with gastrointestinal FBI over financial years 2018–2021. Exclusion criteria were food bolus, medication foreign body, object in anus or rectum, or non-ingestion. Criteria for ‘emergent’ classification were oesophageal location, size >6 cm, disc batteries, airway compromise, peritonitis, sepsis and/or suspected viscus perforation.
Thirty-two admissions attributed to 26 patients were included. The median age was 36 years (IQR: 27–56), 58% were male and 35% had a prior psychiatric or autism spectrum disorder. There were no deaths, perforations or surgery. Gastroscopy was performed in 16 admissions and 1 was scheduled following discharge. Rat-tooth forceps were used in 31% and an overtube was used in 3 cases. The median time from presentation to gastroscopy was 673 minutes (IQR: 380–1013). Management was adherent to European Society of Gastrointestinal Endoscopy guidelines in 81%. After excluding admissions with FBI as a secondary diagnosis, the median admission cost was $A1989 (IQR: $A643–$A4976) and total admission costs over the 3 years were $A84 448.
FBI in an Australian, non-prison referral centre is infrequent, can often be safely managed expectantly, and has limited impact on healthcare utilisation. Early, outpatient endoscopy could be considered for non-urgent cases, which may reduce costs while maintaining safety.
Barrett’s oesophagus (BO) is a precursor lesion, via dysplastic phases, to oesophageal adenocarcinoma. Although overall risk from BO is low, it has been shown to adversely affect health-related quality of life (HRQOL). The aim was to compare dysplastic BO patients’ HRQOL pre-endoscopic therapy (pre-ET) and post-ET. The pre-ET BO group was also compared with other cohorts: non-dysplastic BO (NDBO), those with colonic polyps, gastro-oesophageal reflux disease (GORD) and healthy volunteers.
Participants in the pre-ET cohort were recruited prior to their endotherapy and HRQOL questionnaires provided pre-ET and post-ET. Wilcoxon rank test was used to compare the pre-ET and post-ET findings. The Pre-ET group was compared to the other cohorts’ HRQOL results using multiple linear regression analysis.
In the pre-ET group, 69 participants returned questionnaires prior to endotherapy and 42 returned them post-ET. The pre-ET and post-ET groups showed similar levels of cancer worry, despite the treatment. No statistically significant differences were found in symptom scores, anxiety and depression, or general health measured with the Short Form-36 (SF-36) score. Education for patients with BO was lacking overall, with many of the pre-ET group still reporting unanswered questions about their disease.
The pre-ET group was compared with the NDBO group (N=379), GORD patients (N=132), colonic polyp patients (N=152) and healthy volunteers (N=48). Cancer worry was similar between the NDBO group and the pre-ET group, despite the former’s lower risk of progression. GORD patients had worse symptom scores from a reflux and heartburn perspective. Only the healthy group had significantly better SF-36 scores and improved hospital anxiety and depression scores.
These findings suggest that there is a need to improve HRQOL for patients with BO. This should include improved education and the development of BO-specific patient-reported outcome measures to capture relevant areas of HRQOL in future studies.
Undiagnosed fatty liver disease is prevalent in the community, due to high rates of harmful alcohol consumption and/or obesity. Fatty liver disease can progress to cirrhosis and its complications. Early identification of liver disease and treatment may prevent progression to cirrhosis. Biomarkers including FIB-4, enhanced liver fibrosis (ELF), PRO-C3 and vibration controlled transient elastography (VCTE) can stage liver fibrosis, but it is not known how well they perform in a primary care population. Moreover, no assessment of long-term prognostic ability of these biomarkers has been conducted in primary care. We aim to evaluate the performance of fibrosis biomarkers in primary care to develop a pathway to detect advanced fibrosis.
This prospective, observational cohort study will recruit 3000 individuals with fatty liver disease risk factors (obesity, type 2 diabetes or hazardous alcohol consumption) at their primary care ‘annual chronic disease review’. Participants will have a ‘liver health check’. Two pathways will be evaluated: (1) all participants have FIB-4, ELF and VCTE performed, and (2) participants have an initial assessment with FIB-4 and ELF, followed by VCTE in only those with increased FIB-4 and/or ELF. Individuals with suspected significant/advanced liver fibrosis (liver stiffness measurement >8 kPa) will be reviewed in secondary care to confirm their fibrosis stage and institute treatment. The performance of FIB-4, ELF, PRO-C3, VCTE and novel biomarkers alone or in combination for advanced fibrosis/cirrhosis will be evaluated. Participants will be followed longitudinally via their electronic health records to assess long-term clinical outcomes.
Ethical approval was obtained from the London-Chelsea Research Ethics Committee (22/PR/0535; 27 June 2022). Recruitment began on 31 October 2022. Outcomes of this study will be published in peer-reviewed journals and presented at scientific meetings. A lay summary of the results will be available for study participants and will be disseminated widely by LIVErNORTH.
Acute pancreatitis (AP) is an infrequently reported manifestation of leptospirosis. It is more commonly seen in patients with acute respiratory distress syndrome. Despite novel modalities such as extracorporeal membrane oxygenation (ECMO), the mortality rate remains high and whether this is associated with the lung injury caused by the inflammation in AP remains unclear.
A descriptive study was conducted at a tertiary hospital in the Philippines. Primary outcome was defined as the presence or absence of AP. Secondary outcomes were defined as 28-day mortality rate, length of hospital stay, ECMO days, renal replacement therapy (RRT) days, days on mechanical ventilation, presence of local complications of AP and development of nosocomial infections.
A total of 27 patients were included in the study, and 88.89% (n=24) were men. The mean age for all patients was 33.59±10.22 years. Of the 27 patients, 19 (70.37%) were diagnosed with AP. Among these 19 patients, one (5.26%) had necrotising pancreatitis and two (10.52%) developed local complications of pancreatitis. Six patients (31.58%) died among those who developed AP, while one (12.50%) died among those who did not. The duration of hospital stay, ECMO, RRT and mechanical ventilation and the rate of nosocomial infections were also higher in the group who presented with AP.
AP is an under-reported complication of leptospirosis. Our study demonstrated a higher mortality and morbidity in patients with leptospirosis who developed AP.
Endoscopic therapy is the recommended primary treatment for most complex colorectal polyps, but high colonic resection rates are reported. The aim of this qualitative study was to understand, and compare between specialties, the clinical and non-clinical factors influencing decision making when planning management.
Semi-structured interviews were performed among colonoscopists across the UK. Interviews were conducted virtually and transcribed verbatim. Complex polyps were defined as lesions requiring further management planning rather than those treatable at the time of endoscopy. A thematic analysis was performed. Findings were coded to identify themes and reported narratively.
Twenty colonoscopists were interviewed. Four major themes were identified: gathering information regarding the patient and their polyp, aids to decision making, barriers to achieving optimal management, and improving services. Participants advocated endoscopic management where possible. Factors such as younger age, suspicion of malignancy, and right colon or otherwise difficult polyp location led towards surgical intervention; these factors were similar between surgical and medical specialties. Availability of expertise, timely endoscopy and challenges in referral pathways were reported barriers to optimal management. Experiences of team decision-making strategies were positive and advocated for improving complex polyp management. Recommendations based on these findings to improve complex polyp management are provided.
The increasing recognition of complex colorectal polyps requires consistency in decision making and access to a full range of treatment options. Colonoscopists advocated the availability of clinical expertise, timely treatment and education in avoiding surgical intervention and providing good patient outcomes. Team decision-making strategies for complex polyps may provide an opportunity to coordinate and improve these issues.
Transplantation in many Asian countries is moulded by socioeconomic, religious, cultural and health indicators. In most Asian countries, living-related donation is the most common form of organ donation. Because deceased organ donation is limited, live donor programmes have flourished in many Asian countries; another apparent reason for this tremendous growth of living-related programmes is the larger populations these countries serve. Several centres in Asia, including Pakistan and India in South Asia and Egypt in the Middle East, have recently emerged as leading living donor transplant programmes. On the other hand, a few Asian countries, including Iran and China, have established some of the world’s largest deceased donor programmes.
In Pakistan, thousands of patients die from end-stage organ failure annually while awaiting organ transplants for survival. Exact statistics are not available, but over 50 000 people are estimated to die each year from end-stage organ failure without receiving a transplant, including about 15 000–18 000 from kidney failure and 10 000 from liver failure, and the National Centre for Health Statistics has labelled organ failure a leading cause of death. Despite these figures, awareness of organ donation among Pakistani people has been determined to be around 60%. In Pakistan, the lack of deceased organ donation programmes and people’s unwillingness to donate organs after death contribute to an increased demand for living organ donation, and patients continue to rely on living donors. We discuss the obstacles to deceased organ donation, which form a unique combination of religious, economic, social, demographic and political factors.
Conclusion: Every effort should be made to initiate and establish multiple deceased donor programmes in Pakistan. Developing deceased donor programmes in the country will be vital to counter the countrywide increasing organ shortage. Mainstay transplant activities, such as organ procurement and distribution systems, need to be adequately developed. This will help achieve national self-sufficiency and decrease the burden on living donors. With education, the behaviour of healthcare professionals and the public can be changed and a positive attitude towards deceased organ donation can be fostered. As healthcare professionals, we should come forward and take responsibility by registering ourselves as deceased donors. Public awareness, medical community interest and government support are essential in initiating and establishing deceased donor programmes in Pakistan.
We aim to compare the real-life direct and indirect costs of switching patients from intravenous to subcutaneous (SC) CT-P13, an infliximab biosimilar, in a tertiary UK Inflammatory Bowel Disease (IBD) centre.
All adult patients with IBD on standard dosing CT-P13 (5 mg/kg 8 weekly) were eligible to switch. Of 169 patients eligible to switch to SC CT-P13, 98 (58%) switched within 3 months and one moved out of area.
Total annual intravenous cost for 168 patients was £689 507.04 (direct=£653 671.20, indirect=£35 835.84). After the switch, as-treated analysis demonstrated a total annual cost for 168 patients (70 intravenous and 98 SC) of £674 922.83 (direct=£654 563.00, indirect=£20 359.83), a £891.80 higher direct cost to healthcare providers. Intention-to-treat analysis showed a total annual cost of £665 961.01 (direct=£655 200.00, indirect=£10 761.01), a £1528.80 higher direct cost to healthcare providers. However, in each scenario, the significant decrease in indirect costs resulted in lower total costs after switching to SC CT-P13.
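As a sanity check, the reported figures decompose consistently into direct and indirect components; a minimal arithmetic sketch using only the numbers quoted above:

```python
# Consistency check of the reported annual costs (GBP, 168 patients).
# All figures are taken directly from the results above.
iv_direct, iv_indirect = 653_671.20, 35_835.84    # pre-switch, all intravenous
at_direct, at_indirect = 654_563.00, 20_359.83    # as-treated after switch
itt_direct, itt_indirect = 655_200.00, 10_761.01  # intention-to-treat

iv_total = iv_direct + iv_indirect     # 689507.04
at_total = at_direct + at_indirect     # 674922.83
itt_total = itt_direct + itt_indirect  # 665961.01

# Direct costs rise slightly, but falling indirect costs lower the totals.
direct_increase_at = at_direct - iv_direct    # 891.80
direct_increase_itt = itt_direct - iv_direct  # 1528.80
total_saving_at = iv_total - at_total         # 14584.21
```

This makes explicit why the switch is "broadly cost neutral": the modest direct-cost increase is outweighed by the indirect-cost reduction in both analyses.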
Our real-world analysis demonstrates that switching from intravenous to SC CT-P13 is broadly cost neutral to healthcare providers. Although SC preparations have marginally higher direct costs, switching allows for efficient use of intravenous infusion units and reduces costs to patients.
To determine the effectiveness of a mobile application (app) in improving the quality of bowel preparation for colonoscopy.
An endoscopist-blinded randomised controlled trial enrolled patients who were undergoing a colonoscopy on the same day of bowel preparation. The intervention used a Vietnamese mobile app that provides instructions on bowel preparation while patients in the comparison group received conventional instructions. Outcomes included the Boston Bowel Preparation Scale (BBPS) to assess the quality of bowel preparation and the polyp detection rate (PDR) and adenoma detection rate (ADR).
The study recruited 515 patients (256 in the intervention group). The median age was 42 years; 50.9% were female, 69.1% had completed high school or higher education, and 45.2% were from urban areas. Patients in the intervention group had higher adherence to instructions (60.9% vs 52.4%, p=0.05) and a longer duration of laxative intake (mean difference 0.17 hours, 95% CI 0.06 to 0.27). The intervention did not reduce the risk of poor bowel cleansing (total BBPS<6) in either the overall analysis (7.4% vs 7.7%; risk ratio 0.96, 95% CI 0.53 to 1.76) or subgroup analyses. PDR and ADR were similar between the two groups.
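The reported risk ratio for poor bowel cleansing follows directly from the two group risks; a minimal sketch using the figures from the results above (the confidence interval computation is omitted):

```python
# Risk ratio for poor bowel cleansing (total BBPS < 6), app vs control.
risk_app = 0.074      # 7.4% in the intervention (mobile app) group
risk_control = 0.077  # 7.7% in the conventional-instruction group

risk_ratio = risk_app / risk_control
print(round(risk_ratio, 2))  # 0.96, matching the reported RR
```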
The mobile app providing instructions on proper bowel preparation improved the practice during bowel preparation but did not improve the quality of bowel cleansing or PDR.
To estimate the risk of interval colorectal cancer (CRC) in faecal immunochemical test (FIT) negative screening participants according to socioeconomic status.
In this register-based study, first round FIT negative (<20 µg hb/g faeces) screening participants (biennial FIT, citizens aged 50–74) were followed to estimate interval CRC risk. Multivariate Cox proportional hazard regression models estimated HRs based on socioeconomic status defined by educational level and income. Models were adjusted for age, sex and FIT concentration.
We identified 829 interval CRCs (0.7 per 1000) among 1 160 902 individuals. Interval CRC was more common in lower socioeconomic strata: 0.7 per 1000 for medium-long higher education compared with 1.0 for elementary school, and 0.4 in the highest income quartile compared with 1.2 in the lowest. These differences did not translate into significant differences in HR in the multivariate analysis, as they were explained by FIT concentration and age. The HR for interval CRC was 7.09 (95% CI) for FIT concentrations of 11.9–19.8 µg hb/g faeces and 3.37 (95% CI) for FIT between 7.2 and 11.8, compared with those <7.2. The HR rose with increasing age, ranging from 2.06 (95% CI 1.45 to 2.93) to 7.60 (95% CI 5.63 to 10.25) compared with those under 55 years.
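A Cox model estimates effects on the log-hazard scale: each hazard ratio is the exponentiated coefficient, and its 95% CI is exp(beta ± 1.96·SE). A minimal sketch, with the coefficient and standard error back-calculated from the oldest age group's reported HR (7.60, 95% CI 5.63 to 10.25) purely for illustration:

```python
import math

def hazard_ratio_ci(beta, se, z=1.96):
    """Convert a Cox regression coefficient (log hazard ratio) and its
    standard error into a hazard ratio with a 95% confidence interval."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Coefficient and SE back-calculated for illustration from the reported
# HR 7.60 (95% CI 5.63 to 10.25) for the oldest age group.
beta = math.log(7.60)
se = (math.log(10.25) - math.log(5.63)) / (2 * 1.96)

hr, lo, hi = hazard_ratio_ci(beta, se)
print(round(hr, 2), round(lo, 2), round(hi, 2))  # 7.6 5.63 10.25
```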
Interval CRC risk increased with decreasing income, heavily influenced by lower-income individuals more often being older and having higher FIT concentrations. Individualising the screening interval based on age and FIT result may decrease interval CRC rates, reduce the social gradient and thereby increase screening efficiency.
Mirikizumab, a monoclonal antibody targeting the interleukin-23 p19 subunit, was effective in a Phase 2 study (NCT02589665) of moderately-to-severely active ulcerative colitis (UC). We studied mirikizumab’s impact on health-related quality of life (HRQoL).
HRQoL was evaluated using the Inflammatory Bowel Disease Questionnaire (IBDQ) and 36-Item Short Form Health Survey (SF-36) Physical Component Score (PCS) and Mental Component Score (MCS). Mixed effects models for repeated measures compared score changes between mirikizumab and placebo groups. Additional analyses evaluated associations between HRQoL score changes and achievement of efficacy endpoints at weeks 12 and 52.
At week 12, IBDQ improved compared with placebo for all mirikizumab groups except mirikizumab 50 mg (50 mg, p=0.073; 200 mg, p<0.001; 600 mg, p<0.001). SF-36 PCS was significantly higher in all mirikizumab groups at week 12 (50 mg, p=0.011; 200 mg, p=0.022; 600 mg, p=0.002); MCS was significantly higher in the mirikizumab 200 and 600 mg groups compared with placebo (50 mg, p=0.429; 200 mg, p=0.028; 600 mg, p<0.001). Achievement of clinical response and remission was associated with greater HRQoL improvements at week 12. Improvements in HRQoL scores were sustained through week 52. Of the clinical symptoms evaluated, reduction in rectal bleeding was associated with greater improvements in IBDQ and SF-36 scores.
Mirikizumab improved HRQoL in patients with moderately-to-severely active UC.
De novo percutaneous placement of radiologically inserted low-profile or ‘button-type’ gastrostomy catheters (LPG) is infrequently reported in adults. This study compares the safety and clinical outcomes of primary percutaneous placement of LPG catheters and traditional balloon-retention gastrostomy catheters (TG) using image guidance at a single institution.
This was a retrospective, single-institution review comparing initial LPG and TG radiologically inserted catheter placements in a 36-month time period. The age, gender, indication, catheter type and method of anaesthesia of 139 consecutive initial gastrostomy placement procedures were recorded. Total catheter days without intervention, major and minor complications, reasons for reintervention, and procedure fluoroscopy times were compared.
During the 36-month study period, 61 LPG and 78 TG catheters were placed. Mean total catheter days prior to intervention was 137 days in the LPG group and 128 days in the TG group (p=0.70). Minor complications including cellulitis, pericatheter leakage and early catheter occlusion occurred in 4.9% (3/61) in the LPG group and 9% (7/78) in the TG group (p=0.5). Major complications including early catheter dislodgement and bleeding requiring transfusion (in one patient) occurred in 4.9% (3/61) in the LPG group and 7.7% (6/78) in the TG group (p=0.4). Procedure fluoroscopy time was lower in the LPG group (2.56 min) compared with the TG group (4.21 min) (p<0.005).
Primary placement of low-profile or ‘button-type’ gastrostomy catheters is technically feasible with a low complication rate similar to that of traditional radiologically inserted gastrostomy catheters.
Ascites in patients with decompensated cirrhosis can lead to abdominal distention and decrease quality of life. Tolvaptan, a vasopressin V2 receptor antagonist, is an effective agent in the treatment of ascites, whereas some patients are refractory to tolvaptan. The efficacy of transjugular intrahepatic portosystemic shunt (TIPS) for these patients is not known. In this study, we performed TIPS for tolvaptan-refractory cirrhotic patients and analysed its efficacy and safety in these patients.
This retrospective analysis included patients with liver cirrhosis who received TIPS for ascites or hydrothorax refractory to tolvaptan therapy along with conventional diuretics between January 2015 and May 2018 at Tokai University Hospital. We evaluated the efficacy and safety of TIPS.
This study included four patients. All patients presented with Child-Pugh class B liver cirrhosis, and model for end-stage liver disease-sodium scores were 10, 12, 14 and 16. TIPS was created successfully without any major complications in all patients. Body weight decreased by a mean of 4.7 (SD=1.0) kg and estimated glomerular filtration rate improved from a mean of 38.2 (SD=10.3) to 59.5 (SD=25.0) mL/min/1.73 m² within a month after the TIPS procedure.
TIPS is a potentially effective treatment for ascites in patients refractory to tolvaptan. In appropriate patients who can tolerate TIPS, the treatment may also lead to improvement in renal function.
Hepatic damage is one of the common forms of extrapulmonary organ damage among patients with COVID-19 infection.
To evaluate the prognosis of liver damage among COVID-19 patients based on their liver enzymes profile.
A retrospective study was conducted to evaluate the records of patients admitted to hospital with COVID-19 infection.
Retrieved data included clinical presentation and investigations, both imaging and laboratory, with particular attention to liver function tests.
We reviewed 442 patients who were diagnosed with COVID-19 infection.
Of these, 64.5% were female and 35.5% were male. Their mean age was 54.5 years, most were Saudi (76.7%) and overall mortality reached 20.4%.
This large cohort of 442 patients has shown that liver damage may be an independent prognostic factor for morbidity and mortality among COVID-19 patients. It also showed the importance of screening liver function enzymes as a predictor of outcome in these patients.
Non-pharmacological interventions to improve patient-reported outcomes of colonoscopy may be effective at mitigating negative experiences and perceptions of the procedure, but research to characterise the extent and features of studies of these interventions is limited.
We conducted a scoping review searching multiple databases for peer-reviewed publications of randomised controlled trials conducted in adults investigating a non-pharmacological intervention to improve patient-reported outcomes of colonoscopy. Study characteristics were tabulated and summarised narratively and graphically.
We screened 5939 citations and 962 full texts, and included 245 publications from 39 countries published between 1992 and 2022. Of these, 80.8% were full publications and 19.2% were abstracts. Of the 41.9% of studies reporting funding sources, 11.4% were unfunded. The most common interventions were carbon dioxide and/or water insufflation methods (33.9%), complementary and alternative medicines (eg, acupuncture) (20.0%), and colonoscope technology (eg, magnetic scope guide) (21.6%). Pain was an outcome in 82.0% of studies. Studies most often used a patient-reported outcome examining patient experience during the procedure (60.0%), but 42.9% of studies included an outcome without specifying the time that the patient experienced the outcome. Most intraprocedural patient-reported outcomes were measured retrospectively rather than contemporaneously, although studies varied in terms of when outcomes were assessed.
Research on non-pharmacological interventions to improve patient-reported outcomes of colonoscopy is unevenly distributed across types of intervention and features high variation in study design and reporting, in particular around outcomes. Future research efforts into non-pharmacological interventions to improve patient-reported outcomes of colonoscopy should be directed at underinvestigated interventions and developing consensus-based guidelines for study design, with particular attention to how and when outcomes are experienced and measured.
42020173906.
Screening for early oesophageal adenocarcinoma (OAC), including its precursor Barrett’s oesophagus (BO), can potentially reduce OAC-related morbidity and mortality. This study explores Dutch at-risk individuals’ views of screening an at-risk population for BO/OAC.
We invited 372 individuals with risk factors for OAC from primary care practices, 73 individuals with surveillance experience, and 221 participants of previous studies (BO/OAC screening trial or survey) to participate in focus groups. Transcripts were inductively and thematically analysed by two independent researchers.
A total of 50 individuals (42% with gastro-oesophageal reflux symptoms) aged 50–75 years participated. Themes that were raised included: theme 1 ‘screening intentions’ describing participants’ motivation to be screened (eg, early diagnosis, potential reassurance, physician recommendation, and knowing someone with cancer) or decline screening (eg, anticipated discomfort or suboptimal accuracy of the test); theme 2 ‘risk-based eligibility’ describing the tension between effectiveness (eg, targeting high-risk individuals) and inclusivity (eg, making screening available for everyone); theme 3 ‘distributive justice’, in which the pressure of a potential new screening programme on healthcare resources was discussed; and theme 4 ‘information needs’ describing the perceived lack of information access and individuals’ preference to discuss screening with their general practitioner.
Individuals not only expressed high willingness to be screened but also voiced the concern that a new screening programme may pressure limited healthcare resources. If implemented, it is crucial to develop educational materials that meet the public’s information needs and explain the test procedures and eligibility criteria while avoiding stigmatising language.
Refractory ulcerative proctitis presents a huge clinical challenge not only for the patients living with this chronic, progressive condition but also for the professionals who care for them. Currently, there is limited research and evidence-based guidance, resulting in many patients living with the symptomatic burden of disease and reduced quality of life. The aim of this study was to establish a consensus on the thoughts and opinions related to refractory proctitis disease burden and best practice for management.
A three-round Delphi consensus survey was conducted among patients living with refractory proctitis and healthcare experts from the UK with knowledge of this disease. First, a brainstorming stage was completed, in which a focus group generated an initial list of statements. This was followed by three rounds of Delphi surveys, in which participants ranked the importance of the statements and provided additional comments or clarifications. Mean scores were calculated, comments analysed and revisions made to produce a final list of statements.
In total, 14 statements were suggested by the focus group at the initial brainstorming stage. Following completion of three Delphi survey rounds, all 14 statements reached consensus following appropriate revision.
We established consensus on the thoughts and opinions related to refractory proctitis from both the experts who manage this disease and the patients living with it. This represents the first step towards developing clinical research data and ultimately the evidence needed for best practice management guidance of this condition.
Forty distinct primary sclerosing cholangitis (PSC) genomic loci have been identified through multiancestry meta-analyses. The polygenic risk score (PRS) could serve as a promising tool to discover unique disease behaviour, such as PSC, underlying inflammatory bowel disease (IBD).
To test whether PRS indicates PSC risk in patients with IBD.
Mayo Clinic and Washington University at St Louis IBD cohorts were used to test our hypothesis. PRS was modelled through the published PSC loci and weighted with their corresponding effect size. Logistic regression was applied to predict the PSC risk.
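The PRS construction described above is, in essence, a weighted sum: each patient's risk-allele dosage at the published PSC loci, weighted by the corresponding effect size (log odds ratio). A minimal sketch, where the three loci, weights and dosages are hypothetical (the real model uses the 40 published PSC loci):

```python
# Minimal polygenic risk score sketch. Loci, weights (log ORs) and dosages
# below are hypothetical illustrations, not the published PSC effect sizes.

def polygenic_risk_score(dosages, weights):
    """Sum of per-locus risk-allele dosages (0, 1 or 2) times effect sizes."""
    assert len(dosages) == len(weights)
    return sum(d * w for d, w in zip(dosages, weights))

weights = [0.35, 0.22, 0.18]  # hypothetical per-locus effect sizes (log OR)
patient = [2, 1, 0]           # risk-allele counts for one hypothetical patient

score = polygenic_risk_score(patient, weights)
print(round(score, 2))  # 0.92
```

Patients are then ranked by this score (eg, into quartiles, as in the results below) and the score is entered into a logistic regression to predict PSC risk.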
In total, 63 (5.6%) of 1130 patients with IBD of European ancestry had PSC. Among 381 patients with ulcerative colitis (UC), 12% had PSC, in contrast to 1.4% of 761 patients with Crohn’s disease (CD). Compared with IBD alone, IBD-PSC was associated with a significantly higher PRS (PSC risk: 3.0% in the lowest PRS quartile vs 7.2% in the highest PRS quartile, p-trend=0.03). In subgroup analysis of IBD subphenotypes, multivariate analysis showed that UC-PSC was associated with more extensive UC disease (OR 5.60; p=0.002) and younger age at diagnosis (p=0.02). In CD, multivariate analysis suggested that CD-PSC was associated with colorectal cancer (OR 50; p=0.005).
We found evidence that patients with IBD and PSC present a clinical course different from that of patients with IBD alone. PRS was associated with PSC risk in patients with IBD. Once validated in an independent cohort, this may help identify patients with the highest likelihood of developing PSC.
The documented variation in gastric cancer (GC) detection among endoscopists has often been dismissed as a coincidental artefact of the low incidence of gastric neoplasms, rather than attributed to differences in physicians’ performance of the esophagogastroduodenoscopy procedure. This study aimed to confirm whether significant variation among endoscopists in early GC detection reflects individual performance of upper endoscopy.
A retrospective observational study at a single centre in Japan assessed the results of 218 early GCs detected during 25 688 routine esophagogastroduodenoscopies by 12 endoscopists. The main outcome was the rate of early GC detection for each endoscopist under the same circumstances. Other measures included the major diameters and locations of the lesions, Helicobacter pylori infection status, and baseline patient characteristics that could affect the prevalence of GC.
The early GC detection rates exhibited wide variation among endoscopists (0.09%–2.87%) despite performing routine esophagogastroduodenoscopies in a population with a similar background. Endoscopists were assigned to a low-detection group (n=6; detection rate: 0.47% (range: 0.09%–0.55%)) and a high-detection group (n=5; detection rate: 0.83% (range: 0.63%–1.12%)), with the single highest detector analysed separately due to his distinct detection rate (2.87%). Endoscopists in the high-detection group had better detection rates for minute (major diameter ≤5 mm) and small (major diameter 6–10 mm) GCs than the low-detection group (0.19%/0.23% vs 0.085%/0.098%). These differences were significant (p<0.01), although there were no significant differences in detection of larger tumours (major diameter ≥11 mm; 0.40% vs 0.28%; p=0.13). The tumour location and H. pylori status were similar in the low-detection group, high-detection group and for the highest detector.
Significant variation in the detection of hard-to-find, smaller GCs may reflect individual performance of the examination.
Stellate cells are responsible for liver and pancreas fibrosis and are closely linked with tumourigenesis. Although their activation is reversible, exacerbated signalling triggers chronic fibrosis. Toll-like receptors (TLRs) modulate stellate cell transition. TLR5 transduces the signal arising from the binding of bacterial flagellin from invading motile bacteria.
Human hepatic and pancreatic stellate cells were activated by administration of transforming growth factor-beta (TGF-β). TLR5 was transiently knocked down by short interfering RNA (siRNA) transfection. Reverse transcription-quantitative PCR and western blotting were performed to analyse transcript and protein levels of TLR5 and the transition players. Fluorescence microscopy was performed to identify these targets in spheroids and in sections of murine fibrotic liver.
TGF-β-activated human hepatic and pancreatic stellate cells showed increased TLR5 expression. TLR5 knockdown blocked the activation of these stellate cells. Furthermore, TLR5 increased during murine liver fibrosis and co-localised with inducible Collagen I. Flagellin suppressed TLR5, COL1A1 and ACTA2 expression after the administration of TGF-β, whereas a TLR5 antagonist did not block the effect of TGF-β. Wortmannin, a specific AKT inhibitor, induced TLR5 but not COL1A1 and ACTA2 transcript and protein levels.
TGF-β-mediated activation of hepatic and pancreatic stellate cells requires over-expression of TLR5. By contrast, autonomous TLR5 signalling inhibits the activation of the stellate cells, suggesting signalling through distinct regulatory pathways.
Combination therapy with infliximab and a thiopurine has been shown to be more effective than monotherapy in patients with inflammatory bowel disease (IBD). The therapeutic efficacy of thiopurines correlates with 6-thioguanine nucleotide (6-TGN) levels between 235 and 450 pmol/8×10⁸ erythrocytes. The primary aim of this study was to investigate the association between 6-TGN levels and prevention of the production of antibodies to infliximab (ATI).
We performed a retrospective review of the medical records of patients treated with infliximab for IBD at University Hospitals Bristol NHS Foundation Trust. Demographic and biochemical data were extracted, alongside thiopurine metabolite levels, trough levels of infliximab and the presence of ATI. Chi-squared (χ²) tests were used to investigate the association between 6-TGN levels and prevention of ATI. Logistic regression was used to compare the odds of prevented ATI between those with a 6-TGN level between 235 and 450 pmol/8×10⁸ erythrocytes, those with a 6-TGN level outside this range, and the baseline group on infliximab monotherapy.
Data were extracted for 100 patients. Six of 32 patients with a 6-TGN level between 235 and 450 pmol/8×10⁸ erythrocytes developed ATI (18.8%) compared with 14 out of 22 (63.6%) patients with a 6-TGN level outside this range and 32 out of 46 (69.6%) patients on monotherapy (p=0.001). The OR (95% CI) for prevented ATI in those with a 6-TGN between 235 and 450 pmol/8×10⁸ erythrocytes compared with a 6-TGN outside this range was 7.6 (2.2 to 26.3) (p=0.001) and compared with monotherapy was 9.9 (3.3 to 29.4) (p=0.001).
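The reported odds ratios can be reproduced directly from the raw counts above. The following is an illustrative sketch only, not the study's analysis code (which used logistic regression and also produced the confidence intervals):

```python
def odds_ratio(events_a, total_a, events_b, total_b):
    """Odds of the event in group A divided by the odds in group B."""
    odds_a = events_a / (total_a - events_a)
    odds_b = events_b / (total_b - events_b)
    return odds_a / odds_b

# "Prevented ATI" = patients who did NOT develop antibodies to infliximab
no_ati_therapeutic = 32 - 6    # 6-TGN within 235-450 pmol/8x10^8 erythrocytes
no_ati_outside = 22 - 14       # 6-TGN outside that range
no_ati_monotherapy = 46 - 32   # infliximab monotherapy

print(round(odds_ratio(no_ati_therapeutic, 32, no_ati_outside, 22), 1))      # 7.6
print(round(odds_ratio(no_ati_therapeutic, 32, no_ati_monotherapy, 46), 1))  # 9.9
```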
6-TGN levels between 235 and 450 pmol/8×10⁸ erythrocytes prevented production of ATI. This supports therapeutic drug monitoring to help guide treatment and maximise the beneficial effects of combination therapy for patients with IBD.
Haemorrhoids are one of the most common gastrointestinal and anal diseases. Flavonoids, found in olive oil and honey propolis, have beneficial effects on vascular function and decrease vascular resistance. In this study, we aimed to produce a combination of these two substances as a lotion and assess its healing and side effects in comparison with routine treatment, anti-haemorrhoid ointment (containing hydrocortisone and lidocaine).
In this randomised clinical trial, 86 patients with grade 2 or higher haemorrhoids, diagnosed by colonoscopy, were divided into two groups: case (n=44) and control (n=42). The case group was treated with flavonoid lotion and the control group with anti-haemorrhoid ointment, twice daily for 1 month. Patients were followed weekly with history and physical examination. Data from the two groups were collected before and after the intervention and statistically analysed.
Post-treatment reduction in haemorrhoid grade was significant in the case group (p=0.02) but not in the control group (p=0.139). Flavonoid lotion reduced the signs and symptoms of haemorrhoids significantly more than anti-haemorrhoid ointment (p<0.05).
According to these results, flavonoid lotion can be an excellent alternative to topical chemical drugs, such as anti-haemorrhoid ointment, in treating haemorrhoid disease. Besides its effectiveness and safety, it can be easily manufactured and made widely available to patients.
To evaluate the impact of British Society of Gastroenterology/Association of Coloproctology of Great Britain and Ireland/Public Health England (BSG/ACPGBI/PHE) 2019 polypectomy surveillance guidelines within a national faecal immunochemical test-based bowel cancer screening (BS) cohort on surveillance activity and detection of pathology by retrospective virtual application.
A retrospective review of BS colonoscopies performed in 2015–2016, with 5 years of prospective follow-up, in a single institution. Index colonoscopies were selected; incomplete colonoscopies were excluded. Histology of all resected polyps was reviewed. Surveillance intervals were calculated according to BSG/ACPGBI/PHE 2019 guidelines and compared with the pre-existing ‘European Guidelines for Quality Assurance in Colorectal Cancer Screening and Diagnosis’ (EUQA 2013). The total number of colonoscopies deferred by virtual implementation of BSG/ACPGBI/PHE 2019 guidelines was calculated, and pathology identified on procedures that would have been deferred was reviewed.
The total number of index BS colonoscopies performed in 2015–2016 inclusive was 890, of which 115 were excluded (22 no caecal intubation, 51 inadequate bowel preparation, 56 incomplete polyp clearance). 509 colonoscopies were scheduled within a 5-year interval following the index colonoscopy surveillance round based on EUQA guidelines. Overall, the volume of surveillance was significantly reduced with retrospective application of BSG/ACPGBI/PHE 2019 guidelines (n=221, p<0.0001). No cancers were detected within the ‘potentially deferred’ procedures among patients who attended for follow-up (n=330), with high-risk findings in <10% (n=30) of colonoscopies within the BSG/ACPGBI/PHE cohort.
BSG/ACPGBI/PHE 2019 guidelines safely reduce the burden of colonoscopy demand with acceptable pathology findings on deferred colonoscopies.
For acute cholecystitis, the treatment of choice is laparoscopic cholecystectomy. In mild-to-moderate cases, the use of antibiotic prophylaxis for the prevention of postoperative infectious complications (POICs) lacks evidence regarding its cost-effectiveness compared with no prophylaxis. In the context of rising antimicrobial resistance, there is a clear rationale for a cost-effectiveness analysis (CEA) to determine the most efficient use of National Health Service resources and routine antibiotic usage.
A CEA was carried out using health outcome data from the Perioperative antibiotic prophylaxis in the treatment of acute cholecystitis (PEANUTS II) multicentre, randomised, open-label, non-inferiority clinical trial. Costs were measured in pound sterling, and effectiveness was expressed as POICs avoided within the first 30 days after cholecystectomy. Sixteen of 226 patients (7.1%) in the single-dose prophylaxis group and 29 of 231 (12.6%) in the non-prophylaxis group developed POICs.
This CEA produced an incremental cost-effectiveness ratio of −£792.70, suggesting modest cost-effectiveness, with antibiotic prophylaxis being marginally less costly and more effective than no prophylaxis. Three sensitivity analyses were performed, considering full adherence to the antibiotic, POICs of increased complexity and a break-point analysis; these suggest caution in recommending systematic use of antibiotic prophylaxis for the prevention of POICs.
The results of this CEA support greater consensus in UK-based guidelines on the provision of antibiotic prophylaxis for mild-to-moderate cases of acute cholecystitis.
Patients with cystic fibrosis (pwCF) have a high incidence of early colorectal cancer (CRC). In the absence of a UK CRC screening programme for pwCF, we evaluated the utility and outcomes of colonoscopy and CRC at a large UK CF centre.
In a retrospective study of colonoscopy and CRC outcomes between 2010 and 2020 in pwCF aged ≥30 years at a large CF centre, data were collected on colonoscopy indications and findings, polyp detection rates, bowel preparation regimens and outcomes, colonoscopy completion rates, and patient outcomes.
We identified 361 pwCF aged ≥30 years, of whom 135 were ≥40 years old. In the absence of a UK CRC screening guideline, only 33 of 361 (9%) pwCF aged ≥30 years (mean age: 44.8±11.0 years) had a colonoscopy between 2010 and 2020. The colonoscopy completion rate was 94.9%, with a 33% polyp detection rate; 93.8% of retrieved polyps were premalignant. During the study period no patients developed postcolonoscopy CRC. However, of the patients aged ≥40 years who did not have a colonoscopy (111/135, 82.2%), four (3.6%) developed CRC and three died from complications of CRC.
In this 10-year experience from a large CF centre, colonoscopy uptake for symptomatic indications was low, yet of high yield for premalignant lesions in pwCF >40 years. These data highlight the risk of potentially preventable, early CRC, and therefore support the need for prospective, large-scale nationwide studies which may inform the need for UK CRC screening guidelines for pwCF.
Alveolar echinococcosis (AE) is a parasitic liver disease with infiltrative growth similar to solid organ malignancies. Major vascular damage is frequent and often remains untreated until catastrophic events precipitate. Detailed clinical and radiological assessment is required to guide individualised treatment decisions. Standardised radiological reporting templates of malignancies with profiles resembling AE are candidates for adaptation. Our objectives are to describe vascular pathology in AE and establish a framework for structured evaluation as the basis for treatment decisions and monitoring.
Retrospective case series.
69 patients (37.1%) had vascular involvement: portal vein (PV) 24.7%, hepatic vein (HV) 22.6% and inferior vena cava (IVC) 13.4%. Significant stenosis/occlusion was present in 15.1% of PV, 13.4% of HV and 7.5% of IVC involvement. Vascular pathology needing specific treatment or monitoring was present in 8.6% of patients. The most frequent clinical presentation was high-grade IVC stenosis or occlusion, seen in 11 patients of the cohort.
Advanced AE requires early multidisciplinary assessment to prevent progressive impairment of liver function due to vascular damage. The focus at first presentation is on complete evaluation of vascular (and biliary) involvement. The focus in non-resectable AE is on prevention of vascular (and biliary) complications while suppressing growth of AE lesions by benzimidazole treatment to improve the quality of life of patients. We developed a framework for standardised vascular assessment and follow-up of patients with AE to recognise and treat complications early.
Inflammatory bowel diseases (IBD) are immune-mediated conditions that are increasing in incidence and prevalence worldwide. Their assessment and monitoring are becoming increasingly important, though complex. The best disease control is achieved through tight monitoring of objective inflammatory parameters (such as serum and stool inflammatory markers), cross-sectional imaging and endoscopic assessment. Considering the complexity of the information obtained throughout a patient’s journey, artificial intelligence (AI) provides an ideal adjunct to existing tools to help diagnose, monitor and predict the course of disease of patients with IBD. Therefore, we propose a scoping review assessing AI’s role in diagnosis, monitoring and prognostication tools in patients with IBD. We aim to detect gaps in the literature and address them in future research endeavours.
We will search electronic databases, including Medline, Embase, Cochrane CENTRAL, CINAHL Complete, Web of Science and IEEE Xplore. Two reviewers will independently screen the abstracts and titles first and then perform the full-text review. A third reviewer will resolve any conflict. We will include both observational studies and clinical trials. Study characteristics will be extracted using a data extraction form. The extracted data will be summarised in a tabular format, following the imaging modality theme and the study outcome assessed. The results will have an accompanying narrative review.
Considering the nature of the project, ethical review by an institutional review board is not required. The data will be presented at academic conferences, and the final product will be published in a peer-reviewed journal.
Endoscopic ultrasound-guided through-the-needle microbiopsy (EUS-TTNB) forceps is a recent development that facilitates sampling of the walls of pancreatic cystic lesions (PCL) for histological analysis. We aimed to assess the impact of EUS-TTNB and its influence on patient management in a tertiary pancreas centre.
A prospective database of consecutive patients who underwent EUS-TTNB from March 2020 to August 2022 at a tertiary referral centre was retrospectively analysed.
Thirty-four patients (22 women) were identified. Technical success was achieved in all cases. Adequate specimens for histological diagnosis were obtained in 25 (74%) cases. Overall, EUS-TTNB led to a change in management in 24 (71%) cases. Sixteen (47%) patients were downstaged, with 5 (15%) discharged from surveillance. Eight (24%) were upstaged, with 5 (15%) referred for surgical resection. In the 10 (29%) cases without change in management, 7 (21%) had confirmation of diagnosis with no change in surveillance, and 3 (9%) had insufficient biopsies on EUS-TTNB. Two (6%) patients developed post-procedural pancreatitis, and 1 (3%) developed peri-procedural intracystic bleeding with no subsequent clinical sequelae.
EUS-TTNB permits histological confirmation of the nature of PCL, which can alter management outcomes. Care should be taken in patient selection, and patients should be appropriately consented given the adverse event rate.
The association between the severity of COVID-19 and gastrointestinal (GI) bleeding is unknown. This study aimed to determine whether the severity of COVID-19 is a risk factor for GI bleeding.
A multicentre, retrospective cohort study was conducted on hospitalised patients with COVID-19 between January 2020 and December 2021. The severity of COVID-19 was classified according to the National Institute of Health severity classification. The primary outcome was the occurrence of GI bleeding during hospitalisation. The main analysis compared the relationship between the severity of COVID-19 and the occurrence of GI bleeding. Multivariable logistic regression analysis was performed to evaluate the association between the severity of COVID-19 and the occurrence of GI bleeding.
12 044 patients were included. 4165 (34.6%) and 1257 (10.4%) patients had severe and critical COVID-19, respectively, and 55 (0.5%) experienced GI bleeding. Multivariable analysis showed that patients with severe COVID-19 had a significantly higher risk of GI bleeding than patients with non-severe COVID-19 (OR: 3.013, 95% CI: 1.222 to 7.427). Patients with critical COVID-19 also had a significantly higher risk of GI bleeding (OR: 15.632, 95% CI: 6.581 to 37.130). Patients with severe COVID-19 had a significantly increased risk of lower GI bleeding (OR: 10.349, 95% CI: 1.253 to 85.463), but the risk of upper GI bleeding was unchanged (OR: 1.875, 95% CI: 0.658 to 5.342).
The severity of COVID-19 is associated with GI bleeding, particularly lower GI bleeding. Patients with severe or critical COVID-19 should be managed with caution as they are at higher risk of GI bleeding.
Historical paired liver biopsy studies are likely to underestimate current progression of disease in patients with chronic hepatitis C virus (HCV) infection. We aimed to assess liver disease progression according to the non-invasive Fibrosis-4 (FIB-4) index in patients with chronic HCV and early disease.
Patients diagnosed with chronic HCV and FIB-4 <3.25 from four international liver clinics were included in a retrospective cohort study. Follow-up ended at start of antiviral therapy resulting in sustained virological response, at time of liver transplantation or death. Primary outcome of advanced liver disease was defined as FIB-4 >3.25 during follow-up. Survival analyses were used to assess time to FIB-4 >3.25.
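FIB-4 itself is a simple function of routine laboratory values: (age × AST) / (platelet count × √ALT). The formula is standard but not restated in the abstract; the sketch below uses purely illustrative inputs, not patient data from the study:

```python
import math

def fib4(age_years, ast_u_l, alt_u_l, platelets_1e9_l):
    """Standard FIB-4 index: (age x AST) / (platelets x sqrt(ALT))."""
    return (age_years * ast_u_l) / (platelets_1e9_l * math.sqrt(alt_u_l))

# Purely illustrative values (not from the study)
print(fib4(47, 40, 36, 210) > 3.25)   # False: below the advanced-disease cut-off
print(fib4(61, 80, 64, 120) > 3.25)   # True: would meet the FIB-4 > 3.25 outcome
```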
In total, 4286 patients were followed for a median of 5.0 (IQR 1.7–9.4) years, during which 41 071 FIB-4 measurements were collected. At baseline, median age was 47 (IQR 39–55) years, 2529 (59.0%) were male, and 2787 (65.0%) patients had a FIB-4 <1.45. Advanced liver disease developed in 821 patients. Overall, the 10-year cumulative incidence of advanced disease was 32.1% (95% CI 29.9% to 34.3%). Patients who developed advanced disease showed an exponential FIB-4 increase. Among patients with a presumed date of HCV infection, the cumulative incidence of advanced disease was 7.7-fold higher between 20 and 40 years after infection than during the first 20 years.
The rate of advanced liver disease is high among chronic HCV-infected patients with early disease at time of diagnosis, among whom liver disease progression accelerated over time. These results emphasise the need to overcome any limitations with respect to diagnosing and treating all patients with chronic HCV across the globe.
Wilson’s disease (WD) is a copper metabolism disorder characterised by progressive accumulation of this metal, mainly in the liver and the brain. Treatment is based on copper removal by chelators, among which D-penicillamine (DP) is prescribed as a first-line treatment in most situations. There is some evidence linking the use of DP with a risk of vitamin B6 deficiency; therefore, vitamin supplementation is sometimes recommended, although without consensus. The objective of our study was to evaluate vitamin B6 levels in WD patients treated with DP, with and without associated supplementation.
All WD patients followed at the National Reference Centre for WD in Lyon between January 2019 and December 2020 treated with DP for more than 1 year were included and separated in two groups according to vitamin B6 supplementation. The level of vitamin B6 was measured by the determination of pyridoxal phosphate (PLP).
A total of 37 patients were included, with an average age of 23.3±14.8 years; 15 patients were aged <18 years. Median duration of treatment was 51 (55.8) months. Fifteen patients were receiving vitamin B6 supplementation and 22 had interrupted it for more than 1 year. The median PLP level was significantly higher in the supplemented group: 137.2 (86.7) nmol/L vs 64.9 (30.8) nmol/L (p<0.01). No patient had a PLP level <35 nmol/L.
Long-term stable WD patients under DP treatment probably do not need vitamin B6 supplementation.
The aim of this study was to investigate whether ethnicity affects the outcomes of surgery for diverticulitis in the USA.
The American College of Surgeons National Surgical Quality Improvement Programme database from 2008 to 2017 was used to identify patients undergoing colectomy for diverticulitis. Patient demographics, comorbidities, procedural details and outcomes were captured and compared by ethnicity status.
A total of 375 311 surgeries for diverticulitis were included in the final analysis. The average age of patients undergoing surgery for diverticulitis remained consistent over the time frame of the study (62 years), although the percentage of younger patients (age 18–39 years) rose slightly from 7.8% in 2008 to 8.6% in 2017. The percentage of surgical patients with Hispanic ethnicity increased from 3.7% in 2008 to 6.6% of patients in 2017. Hispanic patients were younger than their non-Hispanic counterparts (57 years vs 62 years, p<0.01) at time of surgery. There were statistically significant differences in the proportion of laparoscopic cases (51% vs 49%, p<0.01), elective cases (62% vs 66%, p<0.01) and the unadjusted rate of postoperative mortality (2.8% vs 3.4%, p<0.01) between Hispanic patients compared with non-Hispanic patients, respectively. Multivariable logistic regression models did not identify Hispanic ethnicity as a significant predictor for increased morbidity (p=0.13) or mortality (p=0.80).
Despite a significantly younger population undergoing surgery for diverticulitis, Hispanic ethnicity was not associated with increased rates of emergent surgery, open surgery or postoperative complications compared with a similar non-Hispanic population.
Alcohol-related liver disease (ALD) is the most common cause of liver-related ill health and liver-related deaths in the UK, and deaths from ALD have doubled in the last decade. The management of ALD requires treatment of both liver disease and alcohol use; this necessitates effective and constructive multidisciplinary working. To support this, we have developed quality standard recommendations for the management of ALD, based on evidence and consensus expert opinion, with the aim of improving patient care.
A multidisciplinary group of experts from the British Association for the Study of the Liver and British Society of Gastroenterology ALD Special Interest Group developed the quality standards, with input from the British Liver Trust and patient representatives.
The standards cover three broad themes: the recognition and diagnosis of people with ALD in primary care and the liver outpatient clinic; the management of acutely decompensated ALD, including acute alcohol-related hepatitis; and the posthospital care of people with advanced liver disease due to ALD. Draft quality standards were initially developed by smaller working groups, and an anonymous modified Delphi voting process was then conducted by the entire group to assess the level of agreement with each statement. Statements were included when agreement was 85% or greater. Twenty-four quality standards supporting best practice were produced from this process. From the final list of statements, a smaller number of auditable key performance indicators were selected to allow services to benchmark their practice, and an audit tool is provided.
It is hoped that services will review their practice against these recommendations and key performance indicators and institute service development where needed to improve the care of patients with ALD.
The COVID-19 pandemic had an undoubted impact on the provision of elective and emergency cancer care, including the diagnosis and management of patients with hepatocellular carcinoma (HCC). Our aim was to determine the effects of the COVID-19 pandemic on patients with HCC in the West of Scotland.
This was a retrospective audit of a prospectively collated database of patients presented to the West of Scotland Multidisciplinary Team (MDT) between April and October 2020 (during the pandemic), comparing baseline demographics, characteristics of disease at presentation, diagnostic workup, treatment and outcomes with patients from April to October 2019 (pre pandemic).
There was a 36.5% reduction in new cases referred to the MDT during the pandemic. Patients presented at a significantly later Barcelona Clinic Liver Cancer stage (24% stage D during the pandemic vs 9.5% pre pandemic, p<0.001) and with a significantly higher Child-Pugh score (46% Child-Pugh B/C during the pandemic vs 27% pre pandemic, p<0.001). We observed a reduction in overall survival (OS): median OS was 6 months during the pandemic versus 17 months pre pandemic (p=0.048).
The impact of the COVID-19 pandemic is likely to have contributed to a reduction in the presentation of new cases and survival among patients with HCC in the West of Scotland. The reason for this is likely multifactorial, but disruption of standard care is likely to have played a significant role. Resources should be provided to address the backlog and ensure there are robust investigation and management pathways going forward.
To evaluate the diagnostic performance of the faecal immunochemical test (FIT), identify risk factors for FIT-interval colorectal cancers (FIT-IC) and describe long-term outcomes of participants with colorectal cancer (CRC) in the New Zealand Bowel Screening Pilot (BSP).
From 2012 to 2017, the BSP offered eligible individuals, aged 50–74 years, biennial screening using a quantitative FIT with positivity threshold of 15 µg haemoglobin (Hb)/g faeces. Retrospective review of prospectively maintained data extracted from the BSP Register and New Zealand Cancer Registry identified any CRC reported in participants who returned a definitive FIT result. Further details were obtained from hospital records. FIT-ICs were primary CRC diagnosed within 24 months of a negative FIT. Factors associated with FIT-ICs were identified using logistic regression.
Of 387 215 individuals invited, 57.4% participated, with 6.1% returning positive FIT results. The final analysis included 520 CRCs, of which 111 (21.3%) met the FIT-IC definition. Overall FIT sensitivity for CRC was 78.7% (95% CI 74.9% to 82.1%) and specificity was 94.1% (95% CI 94.0% to 94.2%). In 78 (70.3%) participants with FIT-IC, faecal Hb was reported as undetectable. There were no significant associations between FIT-IC and age, sex, ethnicity or deprivation. FIT-ICs were significantly associated with proximal tumour location, late stage at diagnosis, high-grade tumour differentiation and subsequent-round screens. Median follow-up time was 74 (range 2–124) months. Participants with FIT-IC had significantly poorer overall survival.
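The reported sensitivity follows directly from the counts given: FIT-ICs are the cancers missed by a negative FIT, so the detected cancers are the remainder. An illustrative sketch (not the study's code) reproduces the 78.7% figure:

```python
def sensitivity(true_positives, false_negatives):
    """Proportion of true cases picked up by the test."""
    return true_positives / (true_positives + false_negatives)

total_crc = 520        # all CRCs in participants with a definitive FIT result
fit_interval = 111     # CRCs diagnosed within 24 months of a negative FIT
detected = total_crc - fit_interval

print(round(100 * sensitivity(detected, fit_interval), 1))  # 78.7
```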
FIT sensitivity in BSP compared favourably to published data. FIT-ICs were more likely to be proximal tumours with poor long-term outcomes. Further lowering of FIT threshold would have minimal impact on FIT-IC.
Cirrhosis describes the end-stage of chronic liver disease. Irreversible changes in the liver cause portal hypertension, which can progress to serious complications and death. Only a few studies with small sample sizes have investigated the prognosis of cirrhosis with portal hypertension. We used electronic healthcare records to examine liver-related outcomes in patients with diagnosed/suspected portal hypertension.
This retrospective observational cohort study used secondary health data between 1 January 2017 and 3 December 2020 from the TriNetX Network, a federated electronic healthcare records platform. Three patient groups with cirrhosis and diagnosed/suspected portal hypertension were identified (‘most severe’, ‘moderate severity’ and ‘least severe’). Outcomes studied individually and as a composite were variceal haemorrhage, hepatic encephalopathy, complications of ascites and recorded mortality up to 24 months.
There were 13 444, 23 299, and 23 836 patients in the most severe, moderate severity and least severe groups, respectively. Mean age was similar across groups; most participants were white. The most common individual outcomes at 24 months were variceal haemorrhage in the most severe group, recorded mortality and hepatic encephalopathy in the moderate severity group, and recorded mortality in the least severe group. Recorded mortality rate was similar across groups. For the composite outcome, cumulative incidence was 59% in the most severe group at 6 months. Alcohol-associated liver disease and metabolic-associated steatohepatitis were significantly associated with the composite outcome across groups.
Our analysis of a large dataset from electronic healthcare records illustrates the poor prognosis of patients with diagnosed/suspected portal hypertension.
The association between colorectal cancer (CRC) and nutrients has been studied frequently. However, the association of nutrient density of diets with the risk of CRC has been less studied. This study aimed to investigate the association between CRC and naturally nutrient rich (NNR) score in Iranian adults.
This case-control study included 160 patients with colorectal cancer and 320 controls aged 35–70 years in Tehran, Iran. Dietary intake was assessed using a 168-item food frequency questionnaire. The NNR score was obtained by calculating the average daily value of 14 nutrients including protein, vitamins A, C, D, E, B1, B2, B12, calcium, zinc, iron, folate, potassium and unsaturated fatty acids.
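As a rough illustration of how such a nutrient-density score can be computed, the sketch below assumes a simple mean of percent-of-daily-value across the component nutrients; the study's exact NNR method (e.g. capping at 100% or standardising to energy intake) may differ, and the nutrient names and reference values are hypothetical:

```python
def nnr_score(intakes, daily_values):
    """Mean percent-of-daily-value across the nutrients considered.

    Simplified sketch only: the study describes the NNR score as the
    average daily value of its component nutrients; details such as
    capping or energy standardisation may differ.
    """
    pct_dv = [100 * intakes[n] / daily_values[n] for n in daily_values]
    return sum(pct_dv) / len(pct_dv)

# Hypothetical two-nutrient illustration; names and reference values
# are assumptions for the example, not taken from the study.
dv = {"vitamin_c_mg": 90, "folate_ug": 400}
intake = {"vitamin_c_mg": 45, "folate_ug": 400}
print(nnr_score(intake, dv))  # 75.0
```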
Regarding dietary intake of the components of the NNR score, the case group had lower intakes of polyunsaturated fat (15.41±4.44 vs 16.54±4.20 g/day, p=0.01), vitamin E (10.15±4.16 vs 13.1±5.33; p=0.001), vitamin B1 (2±0.86 vs 2.19±0.84 mg/day, p=0.03) and folate (516.45±96.59 vs 571.05±80.31; p=0.001), and a higher intake of oleic acid (8.21±5.46 vs 5.59±3.17 g/day, p=0.01) compared with the control group. Colorectal cancer risk was inversely associated with the NNR score after adjusting for confounders (OR 0.92; 95% CI 0.88 to 0.97; p=0.03).
Low NNR scores may be linked to CRC. If confirmed by future longitudinal research, this result may help prevent CRC by recommending nutrient-rich diets.
Short-term exercise prehabilitation programmes have demonstrated promising results in improving aerobic capacity of unfit patients prior to major abdominal surgery. However, little is known about the cardiac and skeletal muscle adaptations explaining the improvement in aerobic capacity following short-term exercise prehabilitation.
In this single-centre study with a pretest–post-test design, 12 unfit patients with a preoperative oxygen uptake (VO2) at the ventilatory anaerobic threshold ≤13 mL/kg/min and/or VO2 at peak exercise ≤18 mL/kg/min, who are scheduled to undergo hepatopancreatobiliary surgery at the University Medical Center Groningen (UMCG), the Netherlands, will be recruited. As part of standard care, unfit patients are advised to participate in a home-based exercise prehabilitation programme, comprising high-intensity interval training and functional exercises three times per week, combined with nutritional support, over a 4-week period. Pre-intervention and post-intervention, patients will complete a cardiopulmonary exercise test. In addition, study participants will undergo in-vivo exercise cardiac magnetic resonance (MR) imaging and phosphorus-31 MR spectroscopy of the quadriceps femoris muscle before and after the intervention to assess effects on cardiac and skeletal muscle function, respectively.
This study was approved in May 2023 by the Medical Research Ethics Committee of the UMCG (registration number NL83611.042.23, March 2023) and is registered in the ClinicalTrials.gov register. Results of this study will be submitted for presentation at (inter)national congresses and publication in peer-reviewed journals.
Poor sleep is common in inflammatory bowel disease (IBD) and may be associated with overall worse disease outcomes. While the sleep/IBD literature is growing, the data are often self-reported. Further, much of the research using objective measures of sleep architecture, or the overall pattern of sleep depth, rely on single-night assessments, which can be of questionable validity.
Participants with IBD and healthy controls were recruited from Dartmouth-Hitchcock Medical Center as part of a two-phase clinical trial. Sleep architecture was assessed using three nights of in-home electroencephalographic monitoring and scored according to the American Academy of Sleep Medicine guidelines.
Our sample included 15 participants with IBD and 8 healthy controls. Participants with IBD were more psychiatrically complex, with more self-reported insomnia, anxiety and depression. Participants with IBD evidenced more microarousals than healthy controls. In participants with IBD, microarousals were associated with lower insomnia and greater depression scores. Within IBD, participants with clinically significant insomnia evidenced a trend towards lower sleep efficiency, while self-reported disease activity did not significantly impact findings.
The methodology of past research may have impacted findings, including the reliance on single-night assessments and limited generalisability. Future research that uses robust, multinight assessments of sleep architecture in large, diverse samples is clearly warranted, as is research exploring the impact of cognitive and behavioural factors on sleep architecture and arousal.
High rectal sensory thresholds (RSTs) are associated with chronic constipation (CC), especially in older patients. Bile acids (BAs) affect the RSTs of healthy individuals. Here, we aimed to investigate the effects of the BA transporter inhibitor elobixibat in patients with CC aged ≥60 years.
We prospectively compared the RSTs of 17 patients with CC aged ≥60 years with those of 9 healthy individuals of the same age range. We next performed a prospective, randomised, parallel-group, double-blind, placebo-controlled clinical trial of 17 patients with CC who received elobixibat or placebo daily for 1 week. Using barostat methodology, their first constant sensation volume (FCSV), defaecatory desire volume (DDV) and maximum tolerable volume (MTV) thresholds; their rectal compliance; and their faecal BA concentrations were measured before and after treatment.
There were no significant differences in the RSTs between healthy individuals and patients with CC, but all of these tended to be higher in the latter group. Elobixibat increased the desire to defaecate, significantly reduced the threshold for FCSV (p=0.0018) and tended to reduce the threshold for DDV (p=0.0899) versus placebo. However, there were no differences in the MTV or rectal compliance of the two groups. The total faecal BA concentration, and particularly that of secondary BAs, increased in the elobixibat group. Elobixibat was most efficacious in participants with a longer duration of CC and a history of treatment for CC.
Elobixibat reduces the RSTs of patients with CC aged ≥60 years, which may be important for its therapeutic effects.
jRCTs061200030.
Many prescribed and over-the-counter medications, for example, non-steroidal anti-inflammatory drugs (NSAIDs) are associated with upper gastrointestinal bleeding (UGIB). Recently, a decrease in prescribing of NSAIDs was observed in the Netherlands, but whether a similar decreasing trend could be observed in the incidence of severe UGIB (either fatal or requiring hospitalisation), contingent on medication prescription, is unknown.
We conducted a cohort study using Dutch national statistics on pharmacy claims, hospitalisation and mortality between 2013 and 2018. We explored the incidence of sex-specific and age-specific severe UGIB in four (sub)populations: (A) total population, (B) without a filled prescription for NSAIDs, (C) without filled prescriptions for NSAIDs and antithrombotic agents, (D) without any risk factors for UGIB.
The cumulative incidence of severe UGIB did not decrease throughout the study period, regardless of the subgroup analysis. In the total population, it was 199 per 100 000 inhabitants (95% CI 197 to 201) in 2013–2014 and 260 (95% CI 258 to 263) in 2017–2018. The absolute risk of severe UGIB was 50% lower in subgroup B than in the full cohort. It decreased by a further 50% in subgroup D when compared with subgroup B. The risk of severe UGIB was 1.5- to 1.9-fold higher in young women than in young men, an indication that over-the-counter NSAID use is more prevalent in women than in men in this age group.
We found no evidence to support a relationship between reduced prescribing of NSAIDs and the incidence of severe UGIB in the Netherlands since 2013. The relationship was also not observed when we removed the effect of risk factors.
The aim of the study was to determine the prevalence of coeliac disease (CD) and to characterise human leukocyte antigen (HLA)-associated hereditary susceptibility in Sudanese patients with CD and type 1 diabetes mellitus (DM1).
Antitissue transglutaminase IgA (anti-TG IgA) was measured in the serum of 373 children affected with DM1 aged 1–19 years old and in 100 serum samples from non-diabetic control children. Histological examination was performed in 19 children seropositive for anti-TG IgA (17 DM1 and 2 controls). Additionally, PCR-based analysis of major histocompatibility complex, class II, DQ beta 1 (HLA-DQB1) genotyping was implemented in three study population groups as follows: group 1 (n=25) (+ve DM1 and +ve CD), group 2 (n=63) (-ve DM1 and +ve CD) and control group 3 (n=2) (+ve CD).
Twenty-six Sudanese children with DM1 out of 373 (6.97%) were seropositive for anti-TG IgA. Duodenal biopsy revealed Marsh 2 and 3 in 13 out of 17 (76.47%) seropositive anti-TG IgA patients with DM1. Significant association (p<0.05) was detected between the level of anti-TG IgA autoantibodies (IU/mL) and Marsh stage. HLA DQ2 and DQ8 were found in 88% (22/25) and 8% (2/25) of examined patients with CD with DM1, respectively.
An anti-TG IgA titre of at least 10 times the upper limit of normal (≥10× ULN) can be useful for detecting CD in children with type 1 diabetes without duodenal biopsy. HLA testing in children with DM1 appears to provide little added benefit given the high prevalence (96%) of HLA DQ2/DQ8 in children with CD and DM1.
The transjugular intrahepatic portosystemic shunt (TIPS) procedure is an important intervention for management of complications of portal hypertension. The objective of this study was to identify predictors of mortality from the TIPS procedure with a focus on race and ethnicity.
TIPS procedures from 2012 to 2014 in the National Inpatient Sample were identified. Weighting was applied to generate nationally representative results. In-hospital mortality was the primary outcome of interest. Chi-squared and Student’s t-tests were performed for categorical and continuous variables, respectively. Predictors of mortality following TIPS were assessed by survey-weighted logistic regression.
17 175 (95% CI 16 254 to 18 096) TIPS cases were identified. Approximately 71% were non-Hispanic (NH) white, 6% were NH black, 16% were Hispanic and 7% were other. NH black patients undergoing TIPS had an in-hospital mortality rate of 20.1%, nearly double the in-hospital mortality of any other racial or ethnic group. NH black patients also had significantly longer median postprocedure and total lengths of stay (p=0.03 and p<0.001, respectively). The interaction of race by clinical indication was a significant predictor of in-hospital mortality (p<0.001). NH black patients had increased mortality compared with other racial/ethnic groups when presenting with bleeding oesophageal varices (OR 3.85, 95% CI 2.14 to 6.95).
This cohort study presents important findings in end-stage liver disease care, with clear racial disparities in in-hospital outcomes following the TIPS procedure. Specifically, black patients had significantly higher in-hospital mortality and longer lengths of stay. Further research is needed to understand how we can better care for black patients with liver disease.
Acute severe ulcerative colitis (ASUC) traditionally requires inpatient hospital management for intravenous therapies and/or colectomy. Ambulatory ASUC care has not yet been evaluated in large cohorts.
We used data from PROTECT, a UK multicentre observational COVID-19 inflammatory bowel disease study, to report the extent, safety and effectiveness of ASUC ambulatory pathways.
Adults (≥18 years old) meeting Truelove and Witts criteria between 1 January 2019–1 June 2019 and 1 March 2020–30 June 2020 were recruited to PROTECT. We used demographic, disease phenotype, treatment outcomes and 3-month follow-up data. Primary outcome was rate of colectomy during the index ASUC episode. Secondary outcomes included corticosteroid response, time to and rate of rescue or primary induction therapy, response to rescue or primary induction therapy, time to colectomy, mortality, duration of inpatient treatment and hospital readmission and colectomy within 3 months of index flare. We compared outcomes in three cohorts: (1) patients treated entirely in an inpatient setting; and ambulatory patients, subdivided into (2) patients managed as ambulatory from diagnosis and (3) patients hospitalised and subsequently discharged to ambulatory care for continued intravenous steroids.
37% (22/60) of participating hospitals used ambulatory pathways. Of 764 eligible patients, 695 (91%) patients received entirely inpatient care, 15 (2%) patients were managed as ambulatory from diagnosis and 54 (7%) patients were discharged to ambulatory pathways. Aside from younger age in patients treated as ambulatory from diagnosis, no significant differences in disease or patient phenotype were observed. The rate of colectomy (15.0% (104/695) vs 13.3% (2/15) vs 13.0% (7/54), respectively, p=0.96) and secondary outcomes were similar among all three cohorts. Stool culture and flexible sigmoidoscopy were less frequently performed in ambulatory cohorts. Forty per cent of patients treated as ambulatory from diagnosis required subsequent hospital admission.
In a post hoc analysis of one of the largest ASUC cohorts collected to date, we report an emerging UK ambulatory practice which challenges treatment paradigms. However, our analysis remains underpowered to detect key outcome measures and further studies exploring clinical and cost-effectiveness as well as patient and physician acceptability are needed.
Cholecystectomy is a standard treatment in the management of symptomatic gallstone disease. Current literature has contradicting views on the cost-effectiveness of different cholecystectomy treatments. We conducted a systematic reappraisal of the literature concerning the cost-effectiveness of cholecystectomy in the management of gallstone disease.
We systematically searched for economic evaluation studies from PubMed, Embase and Scopus for eligible studies from inception up to July 2020. We pooled the incremental net benefit (INB) with a 95% CI using a random-effects model. We assessed heterogeneity using Cochran's Q test and the I2 statistic. We used the modified economic evaluation bias (ECOBIAS) checklist for quality assessment of the selected studies. We assessed the possibility of publication bias using a funnel plot and Egger’s test.
We selected 28 studies for systematic review from a search that retrieved 8710 studies. Among them, seven studies were eligible for meta-analysis, all from high-income countries (HICs). Studies mainly reported comparisons between surgical treatments, while studies of non-surgical gallstone disease management were limited. Early laparoscopic cholecystectomy (ELC) was significantly more cost-effective than delayed laparoscopic cholecystectomy (DLC), with an INB of US$1221 (US$187 to US$2255) but with high heterogeneity (I2=73.32%). The subgroup and sensitivity analyses also supported ELC as the most cost-effective option for managing gallstone disease or cholecystitis.
ELC is more cost-effective than DLC in the treatment of gallstone disease or cholecystitis in HICs. There was insufficient literature on comparisons with other treatment options, such as conservative management, and limited evidence from other economies.
CRD42020194052.
Chronic rejection (CR) of the small intestinal allograft includes mucosal fibrosis, bowel thickening and arteriopathy in the outer wall layers and the mesentery. CR lacks non-invasive markers and reliable diagnostic methods. We evaluated endoscopic ultrasound (EUS) as a novel approach for monitoring of the intestinal allograft with respect to CR.
In intestinal graft recipients, EUS and enteroscopy with ileal mucosal biopsy were performed via the ileostomy. At EUS, the wall thickness of the intestinal graft was measured in standard mode, whereas the resistive index (RI) of the supplying artery was assessed in pulsed Doppler mode. At enteroscopy, the intestinal mucosa was assessed. Findings were compared with histopathology and clinical follow-up.
EUS was successfully performed in all 11 patients (adequate clinical course (AC) n=9; CR n=2) after a median interval of 1537 days (range: 170–5204) post-transplantation. The total diameter of the wall (layers I–V) was comparable in all patients. However, the diameter of the outermost part (layers IV–V; that is, muscularis propria–serosa) was at the upper end of measurements in the two CR patients (range: 1.3–1.4 mm) compared with the nine AC patients (range: 0.5–1.4 mm). The RI was >0.9 in both CR patients, while the RI was ≤0.8 in all AC patients. Both CR patients had abnormal findings at enteroscopy and histopathology and died during follow-up.
EUS is a promising tool providing detailed information on the intestinal graft morphology and rheology, which may be used for assessment of potential CR in long-term follow-up of intestinal allograft recipients.
Northern England has been experiencing a persistent rise in the number of primary liver cancers, largely driven by an increasing incidence of hepatocellular carcinoma (HCC) secondary to alcohol-related liver disease and non-alcoholic fatty liver disease. Here we review the effect of the COVID-19 pandemic on primary liver cancer services and patients in our region.
To assess the impact of the COVID-19 pandemic on patients with newly diagnosed liver cancer in our region.
We prospectively audited our service for the first year of the pandemic (March 2020–February 2021), comparing mode of presentation, disease stage, treatments and outcomes to a retrospective observational consecutive cohort immediately prepandemic (March 2019–February 2020).
We observed a marked decrease in HCC referrals compared with previous years, falling from 190 confirmed new cases to 120 (a 37% reduction). Symptomatic presentation became the most common mode of presentation, with fewer tumours detected by surveillance or incidentally (% surveillance/incidental/symptomatic: 34/42/24 prepandemic vs 27/33/40 in the pandemic, p=0.013). HCC tumour size was larger in the pandemic year (60±4.6 mm vs 48±2.6 mm, p=0.017), with a higher incidence of spontaneous tumour haemorrhage. The number of new cases of intrahepatic cholangiocarcinoma (ICC) fell only slightly, with symptomatic presentation typical. Patients received treatment appropriate for their cancer stage, with waiting times shorter for patients with HCC and unchanged for patients with ICC. Survival was associated with stage both before and during the pandemic. 9% acquired COVID-19 infection.
The pandemic-associated reduction in referred patients in our region was attributed to the disruption of routine healthcare. For those referred, treatments and survival were appropriate for their stage at presentation. Non-referred or missing patients are expected to present with more advanced disease, with poorer outcomes. While protective measures are necessary during the pandemic, we recommend routine healthcare services continue, with patients encouraged to engage.
SARS-CoV-2 and the consequent pandemic have presented unique challenges. Beyond the direct COVID-related mortality in those with liver disease, we sought to determine the effect of lockdown on people with liver disease in Scotland. Of particular interest was the effect of lockdown on those with alcohol-related disease, and whether there were associated changes in alcohol intake and consequent presentations with decompensated disease.
We performed a retrospective analysis of patients admitted to seven Scottish hospitals with a history of liver disease between 1 April and 30 April 2020 and compared across the same time in 2017, 2018 and 2019. We also repeated an intermediate assessment based on a single centre to examine for delayed effects between 1 April and 31 July 2020.
We found that results and outcomes for patients admitted in 2020 were similar to those in previous years in terms of morbidity, mortality and length of stay. In the Scotland-wide cohort: admission MELD (Model for End-stage Liver Disease) score (16 (12–22) vs 15 (12–19); p=0.141), inpatient mortality (10.9% vs 8.6%; p=0.499) and length of stay (8 days (4–15) vs 7 days (4–13); p=0.140). In the Edinburgh cohort: admission MELD score (17 (12–23) vs 17 (13–21); p=0.805), inpatient mortality (13.7% vs 10.1%; p=0.373) and length of stay (7 days (4–14) vs 7 days (3.5–14); p=0.525).
This assessment of immediate and medium-term lockdown impacts on those with chronic liver disease suggested a minimal effect on the presentation of decompensated liver disease to secondary care.
In non-alcoholic fatty liver disease (NAFLD), fibrosis determines the risk of liver complications. Non-invasive tests (NITs), such as FIB-4, NAFLD Fibrosis Score (NFS) and Hepamet, have been proposed as tools to triage patients with NAFLD in primary care (PC). These NITs include AST±ALT in their calculations. Many patients with NAFLD take statins, which can affect AST/ALT, but it is unknown whether statins affect NIT fibrosis prediction.
We included 856 patients referred through a standardised pathway from PC with a final diagnosis of NAFLD. 832 had reliable vibration controlled transient elastography (VCTE) measurements. We assessed the effects of statins on the association between NITs and VCTE at different fibrosis thresholds.
129 out of 832 patients were taking a statin and 138 additional patients had an indication for a statin. For any given FIB-4 value, patients on a statin had higher probabilities of high VCTE than patients not on a statin. Adjusting for body mass index, diabetes and age almost completely abrogated these differences, suggesting that these were related to patient profile rather than to a specific effect of statins. Negative predictive values (NPVs) of FIB-4 <1.3 for VCTE >8, 10, 12 and 16 were, respectively, 89%, 94%, 96% and 100% in patients on a statin and 92%, 95%, 98% and 99% in patients not on a statin. Statins had a similar impact on Hepamet predictions but did not modify NFS predictions.
In patients with NAFLD referred from PC, those on statins had higher chances of a high VCTE for a given FIB-4 value, but this had a negligible impact on the NPV of the commonly used FIB-4 threshold (<1.3).
Persistent cholestasis may follow acute liver failure (ALF), but its course remains unknown. We aimed to describe the prevalence, onset, severity, duration and resolution of post-ALF cholestasis.
Cohort of 127 adult patients with ALF at a liver transplantation centre identified using electronic databases. We obtained laboratory data every 6 hours for the first week, daily until day 30 and weekly, when documented, until day 180.
Median age was 40.7 (IQR 31.0–52.4) years, median peak alanine aminotransferase level was 5494 (2521–8819) U/L and 87 (68.5%) cases had paracetamol toxicity. Overall, 12.6% underwent transplantation (3.4% for paracetamol vs 32.5% for non-paracetamol; p<0.001). Ninety-day mortality was 20.7% for paracetamol versus 30.0% for non-paracetamol patients. All non-transplanted survivors reached a bilirubin level >50 µmol/L, which peaked 3.5 (1.0–10.1) days after admission at 169.0 (80.0–302.0) µmol/L. At hospital discharge, 18.8% of patients had normal bilirubin levels and, at a median follow-up time from admission to last measurement of 16 (10–30) days, 46.9% had normal levels. Similarly, there was an increase in alkaline phosphatase (ALP) (207.0 (148.0–292.5) U/L) and gamma-glutamyl transferase (GGT) (336.0 (209.5–554.5) U/L) peaking at 4.5 days, with normalised values in 40.3% and 8.3% at hospital discharge.
Post-ALF cholestasis is ubiquitous. Bilirubin, ALP and GGT peak at 3 to 5 days and return to baseline in only a minority of patients by a median follow-up of 16 days. These data inform clinical expectations of the natural course of this condition.
Biliodigestive leaks are typically caused by an insufficiency at the surgical anastomosis. Biliodigestive anastomosis (BDA) insufficiencies can lead to bilomas, abscesses and vascular erosion in chronic conditions.
We performed a retrospective analysis of the medical and radiological records of all patients with biliodigestive insufficiency who received interventional treatment between July 2015 and February 2021. Nine patients (three with unilateral drainage and six with bilateral drainage) were treated with a modified percutaneous transhepatic cholangiodrainage (PTCD). Clinical success was defined as complete resolution of the peribiliary collections, absence of bile within the surgical drains, radiological patency of the BDA (contrast medium flowing properly through the BDA and no signs of leakage) and haemodynamic stability of the patient without signs of sepsis.
Clinical success was achieved in all nine patients. No patients required revision surgery to repair their BDA. The mean indwelling drainage time was 34.8±16.5 days. The mean number of interventional procedures performed per patient was 6.6±2.0.
Patients who present with BDA insufficiency may benefit from interventional radiological techniques. Our modified PTCD resolved the BDA leak in all nine cases and should be considered a valuable option for the treatment of patients with this complication. Our technique proved feasible and effective.
Evidence indicates that multistrain probiotics benefit preterm infants more than single-strain (SS) probiotics. We assessed the effects of SS versus triple-strain (TS) probiotic supplementation (PS) in extremely preterm (EP) infants.
EP infants (gestational age (GA) <28 weeks) were randomly allocated to TS or SS probiotic, assuring blinding. The reference (REF) group was EP infants in the placebo arm of our previous probiotic trial. PS was commenced with feeds and continued until 37 weeks’ corrected GA. Primary outcome was time to full feed (TFF: 150 mL/kg/day). Secondary outcomes included faecal short-chain fatty acids and faecal microbiota, assessed in samples collected at T1 (first week) and T2 (after 3 weeks of PS) using 16S ribosomal RNA gene sequencing.
173 EP (SS: 86, TS: 87) neonates with similar GA and birth weight (BW) were randomised. Median TFF was comparable (11 (IQR 8–16) vs 10 (IQR 8–16) days, p=0.92). Faecal propionate (SS, p<0.001, and TS, p=0.0009) and butyrate levels (TS, p=0.029) were significantly raised in T2 versus T1 samples. Secondary clinical outcomes were comparable. At T2, alpha diversity was comparable (p>0.05) between groups, whereas beta-diversity analysis revealed significant differences between the PS and REF groups (both p=0.001). Actinobacteria were higher (both p<0.01), and Proteobacteria, Firmicutes and Bacteroidetes were lower in PS versus REF. Gammaproteobacteria, Clostridia and Negativicutes were lower in both PS groups versus REF.
TFF in EP infants was similar between SS and TS probiotics. Both probiotics were effective in reducing dysbiosis (higher bifidobacteria and lower Gammaproteobacteria). Long-term significance of increased propionate and butyrate needs further studies.
ACTRN 12615000940572.
The aim of this study was to investigate the association between obesity, diabetes and metabolic related liver dysfunction and the incidence of cancer.
This study was conducted with health record data available from the National Health Service in Tayside and Fife. Genetics of Diabetes Audit and Research Tayside, Scotland (GoDARTS), Scottish Health Research Register (SHARE) and Tayside and Fife diabetics, three Scottish cohorts of 13 695, 62 438 and 16 312 patients, respectively, were analysed in this study. Participants in GoDARTS were a volunteer sample, with half having type 2 diabetes mellitus (T2DM). SHARE was a volunteer sample. Tayside and Fife diabetics was a population-level cohort. Metabolic dysfunction-related liver disease (MDLD) was defined using alanine transaminase measurements, and individuals with alternative causes of liver disease (alcohol abuse, viruses, etc) were excluded from the analysis.
MDLD was associated with increased cancer incidence, with an HR of 1.31 (95% CI 1.27 to 1.35; p<0.0001) in a Cox proportional hazards model adjusted for sex, type 2 diabetes, body mass index (BMI) and smoking status. This was replicated in two further cohorts, and similar associations with cancer incidence were found for Fatty Liver Index (FLI), Fibrosis-4 Index (FIB-4) and non-alcoholic steatohepatitis (NASH). Homozygous carriers of the common non-alcoholic fatty liver disease (NAFLD) risk variant PNPLA3 rs738409 had an increased risk of cancer (HR 1.27, 95% CI 1.02 to 1.58; p=0.031). BMI was not independently associated with cancer incidence when MDLD was included as a covariate.
MDLD, FLI, FIB-4 and NASH were associated with an increased risk of cancer incidence and death. NAFLD may be a major component of the relationship between obesity and cancer incidence.
Chronic liver disease continues to be a significant cause of morbidity and mortality yet remains challenging to prognosticate. This has been one of the barriers to implementing palliative care, particularly at an early stage. The Bristol Prognostic Score (BPS) was developed to identify patients with life expectancy less than 12 months and to act as a trigger for referral to palliative care services. This study retrospectively evaluated the BPS in a cohort of patients admitted to three Scottish hospitals.
Routinely collated healthcare data were used to obtain demographics, BPS and analyse 1-year mortality for patients with decompensated liver disease admitted to three gastroenterology units over two 90-day periods. Statistical analysis was undertaken to assess performance of BPS in predicting mortality.
276 patients were included in the final analysis. Participants tended to be late middle-aged men, socioeconomically deprived and have alcohol-related liver disease. A similar proportion was BPS+ve (>3) in this study compared with the original Bristol cohort, though they had more hospital admissions, higher ongoing alcohol use and poorer performance status. BPS performed more poorly in this non-Bristol group, with sensitivity 54.9% (72.2% in original study), specificity 58% (83.8%) and positive predictive value (PPV) 43.4% (81.3%).
BPS was unable to accurately predict mortality in this Scottish cohort. This highlights the ongoing challenge of prognostication in patients with chronic liver disease, furthering the call for more work in this field.
To describe a conceptual framework that provides understanding of the challenges encountered and the adaptive approaches taken by organised colorectal cancer (CRC) screening programmes during the initial phase of the COVID-19 pandemic.
This was a qualitative case study of international CRC screening programmes. Semi-structured interviews were conducted with programme managers/leaders and programme experts, researchers and clinical leaders of large, population-based screening programmes. Data analysis, using elements of grounded theory, as well as cross-case analysis, was conducted by two experienced qualitative researchers.
19 participants were interviewed from seven programmes in North America, Europe and Australasia. A conceptual framework (‘Nimble Approach’) was the key outcome of the analysis. Four concepts constitute this approach to managing CRC screening programmes during COVID-19: Fast (meeting the need to make decisions and communicate quickly), Adapting (flexibly and creatively managing testing/colonoscopy capacity, access and backlogs), Calculating (modelling and actively monitoring programmes to inform decision-making and support programme quality) and Ethically Mindful (considering ethical conundrums emerging from programme responses). Programmes that were highly integrated, had highly integrated communication networks and managed greater portions of the screening process seemed best positioned to respond to the crisis.
The Nimble Approach has potentially broad applications; it can be deployed to effectively respond to programme-specific challenges or manage CRC programmes during future pandemics, other health crises or emergencies.
Barrett’s oesophagus (BO) is common and is a precursor to oesophageal adenocarcinoma with a 0.33% per annum risk of progression. Surveillance and follow-up services for BO have been shown to be lacking, with studies showing inadequate adherence to guidelines and patients reporting a need for greater disease-specific knowledge. This review explores the emerging role of dedicated services for patients with BO.
A literature search of PubMed, MEDLINE, Embase, Emcare, HMIC, BNI, CiNAHL, AMED and PsycINFO in regard to dedicated BO care pathways was undertaken.
Prospective multicentre and randomised trials were lacking. Published cohort data are encouraging, with improvements in guideline adherence under dedicated services, and one published study showing significant improvements in dysplasia detection rates. Accurate allocation to surveillance endoscopy has been shown to yield cost savings, and a study of a dedicated clinic showed increased discharges from unnecessary surveillance. Training modalities for BO surveillance and dysplasia detection exist, which could be used to educate a BO workforce. Qualitative and quantitative studies have shown patients report high levels of cancer worry and poor disease-specific knowledge, but few studies have explored follow-up care models despite this being a patient and clinician priority for research.
Cost–benefit analysis for dedicated services, considering both financial and environmental impacts, and more robust clinical data must be obtained to support this model of care in the wider health service. Greater understanding is needed of the root causes for poor guideline adherence, and disease-specific models of care should be designed around clinical and patient-reported outcomes to address the unmet needs of patients with BO.
Non-alcoholic fatty liver disease (NAFLD) is increasing globally with a mounting body of evidence on various adverse effects on pregnancy. Yet, prospective studies, especially from low-income and middle-income countries, are lacking in examining the impact of NAFLD in pregnancy. In this study, we explored the effect of NAFLD on the development of gestational diabetes mellitus (GDM) and early pregnancy miscarriages.
A population-based prospective cohort study was conducted among first-trimester pregnant women who registered in the national pregnancy care programme during July–September 2019 in Anuradhapura district, Sri Lanka. Baseline clinical and biochemical parameters were measured, and an ultrasound scan (USS) of the liver was performed to assess fatty liver. Those who were normoglycaemic based on WHO criteria were followed up, and a repeat oral glucose tolerance test was performed between 24 and 28 weeks of gestation.
Of the 632 pregnant women studied, 90 (14%) and 234 (37%) were diagnosed as having fatty liver grade (FLG) II and I, respectively. The cumulative incidence of GDM in FLG 0, I and II was 11, 44 and 162 per 1000 pregnancies, respectively. After adjusting for age and other known risk factors, women with FLG II had a relative risk (RR) of 12.5 (95% CI 2.2 to 66.4) for developing GDM compared with FLG 0. In addition, FLG I (RR 2.1, 95% CI 1.01 to 4.64) and FLG II (RR 4.5, 95% CI 2.1 to 9.9) were significant risk factors for early pregnancy miscarriage, and FLG II remained the only independent predictor of miscarriage after adjusting for age, parity, body mass index, blood sugar, blood pressure and haemoglobin level (adjusted OR 4.2, 95% CI 1.9 to 9.1).
In this rural south Asian community, NAFLD is shown to be a major risk factor for GDM and early pregnancy miscarriages. Therefore, routine identification of NAFLD through a simple USS may help in the early identification of high-risk mothers.
Human papillomavirus (HPV) is strongly associated with Barrett’s dysplasia and oesophageal cancer, suggesting a role in carcinogenesis. HPV persistence predicts treatment failure after endotherapy for Barrett’s dysplasia. This pilot study applies a novel HPV screening tool (previously only used in the oropharynx) to detect HPV DNA directly and determine the prevalence rates in Barrett’s oesophagus (BO).
DNA was extracted from 20 formalin-fixed BO samples. HPV DNA was detected using real-time PCR and gel electrophoresis.
Five of the 20 patients with BO (25%) tested positive for HPV.
This method can be used in BO tissue to determine HPV infection. Adoption of this as a screening test could potentially revolutionise future research in this area. If a clear link between HPV and Barrett’s dysplasia can be confirmed, this qPCR method has the potential to aid in monitoring and/or dysplasia detection by stratifying those most at risk and aid in the development of new therapies.
Dietary patterns that might induce remission in patients with active Crohn’s disease (CD) are of interest to patients, but studies are limited in the published literature. We aim to explore the efficacy of the CD therapeutic dietary intervention (CD-TDI), a novel dietary approach developed from best practices and current evidence, to induce clinical and biomarker remission in adult patients with active CD.
This study is a 13-week, multicentre, randomised controlled trial in patients with mild-to-moderate active CD at baseline. One hundred and two patients will be block randomised, by sex, 2:1 to the intervention (CD-TDI) or conventional management. Coprimary outcomes are clinical and biomarker remission, defined as a Harvey Bradshaw Index of <5 and a faecal calprotectin of <250 µg/g, respectively.
Secondary outcomes include gut microbiota diversity and composition, faecal short-chain fatty acids, regulatory macrophage function, serum and faecal metabolomics, C reactive protein, peripheral blood mononuclear cell gene expression profiles, quality of life, sedentary time and physical activity at 7 and/or 13 weeks. Predictive models of clinical response to a CD-TDI will be investigated.
The research protocol was approved by the Conjoint Health Research Ethics Board at the University of Calgary (REB19-0402) and the Health Research Ethics Board—Biomedical Panel at the University of Alberta (Pro00090772). Study findings will be presented at national and international conferences, submitted for publication in abstracts and manuscripts, shared on social media and disseminated through patient-education materials.
End-ischaemic preservation of a donor liver by dual hypothermic oxygenated machine perfusion (DHOPE) for 2 hours prior to transplantation is sufficient to mitigate ischaemia-reperfusion damage and fully restore cellular energy levels. Clinical studies have shown beneficial outcomes after transplantation of liver grafts preserved by DHOPE compared with static cold storage. In addition to graft reconditioning, DHOPE may also be used to prolong preservation time, which could facilitate logistics for allocation and transplantation globally.
This is a prospective, pseudo-randomised, dual-arm, IDEAL-D (Idea, Development, Exploration, Assessment, Long term study-Framework for Devices) stage 2 clinical device trial designed to determine the safety and feasibility of prolonged DHOPE (DHOPE-PRO). The end time of the donor hepatectomy will determine whether the graft is assigned to the intervention arm (16:00–03:59) or the control arm (04:00–15:59). In total, 36 livers will be included in the study. Livers in the intervention group (n=18) will undergo DHOPE-PRO (≥4 hours) until implantation the following morning, whereas livers in the control group (n=18) will undergo regular DHOPE (2 hours) prior to implantation. The primary endpoint of this study is a composite of the occurrence of all (serious) adverse events during DHOPE and up to 30 days after liver transplantation.
The protocol was approved by the Medical Ethical Committee of Groningen, METc2020.126 in June 2020, and the study was registered in the Netherlands National Trial Registry (https://www.trialregister.nl/) prior to initiation.
NL8740.
This rapid priority setting exercise aimed to identify, expand, prioritise and explore stakeholder (patients, carers and healthcare practitioners) topic uncertainties on faecal incontinence (FI).
An evidence gap map (EGM) was produced to give a visual overview of emerging trial evidence; existing systematic review-level evidence and FI stakeholder topic uncertainties derived from a survey. This EGM was used in a knowledge exchange workshop that promoted group discussions leading to the prioritisation and exploration of FI stakeholder identified topic uncertainties.
Overall, a mismatch between the existing and emerging evidence and key FI stakeholder topic uncertainties was found. The prioritised topic uncertainties identified in the workshop were as follows: psychological support; lifestyle interventions; long-term effects of living with FI; education; constipation and the cultural impact of FI. When these six prioritised topic uncertainties were explored in more depth, the following themes were identified: education; impact and burden of living with FI; psychological support; healthcare service improvements and inconsistencies; the stigma of FI; treatments and management; culturally appropriate management and technology and its accessibility.
Topic uncertainties identified were broad and wide ranging even after prioritisation. More research is required to unpick the themes emerging from the in-depth discussion and explore these further to achieve a consensus on deliverable research questions.
Many patients are assessed for chronic symptoms including dysphonia, ‘globus’, throat clearing, postnasal secretions and cough, which are commonly grouped together and attributed to ‘laryngopharyngeal reflux’. This study aimed to explore a clinical trial’s baseline dataset for patterns of presenting symptoms, which might provide a more rational basis for treatment.
Baseline data were analysed for participants entering the Trial Of Proton-Pump Inhibitors in Throat Symptoms: age, body mass index, Reflux Symptom Index, Comprehensive Reflux Symptom Score, Laryngopharyngeal Reflux-Health-related Quality of Life questionnaire and Reflux Finding Score (RFS-endoscopic examination). The relationships between the questionnaires and demographic factors were assessed. Exploratory factor analysis (EFA) was conducted on individual symptom items in the combined questionnaires. The EFA factors were applied to a cluster analysis of participants to explore the presence of identifiable patient subgroups.
Throat clearing and globus were the highest ranked scores in the 344 participants. Increasing age was inversely associated with symptom severity (p<0.01). There was no relationship between the RFS and any of the three questionnaires. EFA resulted in a seven-factor model with clinically meaningful labels: voice, cough, gastrointestinal symptoms, airway symptoms and dysphagia, throat clearing, lump in throat, and life events. Cluster analysis failed to demonstrate any clinically meaningful clusters of patients.
This study offers a framework for future research and demonstrates that individual symptoms cannot be used to group patients. The analysis supports the use of a broad ‘umbrella’ term such as persistent throat symptoms.
Inflammatory bowel disease clinical nurse specialists (IBD-CNSs) face increasing pressures due to rising clinical and patient demands, advanced complexity of work role, and minimal specialist management training and support. Stress and burn-out could undermine the stability of this workforce, disrupting clinical provision. We reviewed the literature on stress and burn-out to demonstrate the lack of evidence pertinent to IBD-CNSs and make the case for further research.
Following Levac et al’s scoping review framework, relevant databases were searched for publications reporting work-related stress and burn-out among specialist nurses. Following screening and consensus on selection of the final articles for review, all authors contributed to data charting. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses Scoping Review extension guided reporting of the review.
Of 194 retrieved articles, eight were eligible for review. None focused on IBD-CNSs, none was qualitative and none was UK-based. Three core themes were identified: Rates of Burn-out, Mitigating and Alleviating Factors, and Preventing and Resolving Burn-out. Risk of burn-out is greatest in novice and mid-career CNSs. Age and duration in role appear protective. Personal achievement is also protective and can mitigate earlier episodes of burn-out; opportunities for career progression are limited. Promoting personal well-being is beneficial. Senior managers have poor understanding of the role and provide inadequate support. Commitment to patients remains high.
Burn-out arises in CNSs across clinical specialisms in the international literature and has a significant negative effect on the workforce. Further research is needed to address the dearth of evidence on burn-out in IBD-CNSs in the UK.
5-aminosalicylate (mesalazine; 5-ASA) is an established first-line treatment for mild-to-moderate ulcerative colitis (UC). This study aimed to model the benefits of optimising 5-ASA therapy.
A decision tree model followed 10 000 newly diagnosed patients with mild-to-moderately active UC through induction and 1 year of maintenance treatment. Optimised treatment (maximising dose of 5-ASA and use of combined oral and rectal therapy before treatment escalation) was compared with standard treatment (standard doses of 5-ASA without optimisation). Modelled data were derived from published meta-analyses. The primary outcomes were patient numbers achieving and maintaining remission, with an analysis of treatment costs for each strategy conducted as a secondary outcome (using UK reference costs).
During induction, there was a 39% increase in patients achieving remission through the optimised pathway without requiring systemic steroids and/or biologics (6565 vs 4725 for standard). Potential steroidal/biological adverse events avoided included seven venous thromboembolisms and eight serious infections. Of the 6565 patients entering maintenance following successful induction on 5-ASA, there was a 21% reduction in relapses with optimisation (1830 vs 2311 for standard). This translated into 297 patients avoiding further systemic steroids and 214 avoiding biologics. Optimisation led to an average net saving of £272 per patient entering the model for the induction and maintenance of remission over 1 year.
Modelling suggests that optimising 5-ASA therapy (both the inclusion of rectal 5-ASA in a combined oral and rectal regimen and maximisation of the 5-ASA dose) has clinical and cost benefits that support wider adoption in clinical practice.
In order to identify areas of unmet need in patients with primary biliary cholangitis (PBC), this study sought to use real-world observational healthcare data to characterise the burden in patients with PBC and in PBC patients with a recorded diagnosis of pruritus.
This retrospective, cross-sectional database study compared prevalence of prespecified comorbidities and medications in the PBC population and PBC-pruritus subpopulation with non-cases using an indirect standardisation approach. The PBC population was identified from the US IBM MarketScan Commercial Claims and Medicare Supplemental Database during 2016 using International Classification of Diseases 10th Revision, Clinical Modification codes (≥2 claims for PBC); the PBC-pruritus subpopulation additionally had ≥1 claim for pruritus during this period. Non-cases had no claims for PBC. Indirect age-sex standardised prevalence ratios (iSPR) and 95% confidence intervals (CIs) were calculated for prespecified comorbidities and medications recorded during 2017.
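The indirect age-sex standardisation described here amounts to comparing observed case counts against the counts expected if reference (non-case) prevalences applied to the study population's age-sex structure. A minimal sketch follows; the strata, counts and reference prevalences are hypothetical, not the study's data.

```python
# Sketch of indirect standardisation: the expected count is the sum over
# age-sex strata of the reference (non-case) prevalence times the study
# population's stratum size; the indirectly standardised prevalence ratio
# (iSPR) is observed/expected. All numbers below are hypothetical.
def ispr(strata):
    """strata: iterable of (stratum_n, observed_cases, reference_prevalence)."""
    observed = sum(cases for _, cases, _ in strata)
    expected = sum(n * ref for n, _, ref in strata)
    return observed / expected

example = [
    (500, 90, 0.12),  # hypothetical stratum: n, comorbidity cases, reference rate
    (300, 70, 0.15),
    (200, 30, 0.10),
]
print(round(ispr(example), 2))  # iSPR > 1 means more cases than expected
```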
The PBC population (N=1963) and PBC-pruritus subpopulation (N=139) had significantly higher prevalence of fatigue (19.9%, iSPR (95% CI): 1.51 (1.36 to 1.66); 26.6%, 2.10 (1.48 to 2.90)), depression/anxiety (21.3%, 1.09 (0.99 to 1.20); 28.1%, 1.46 (1.04 to 2.00)) and sleep-related issues (6.9%, 1.18 (0.99 to 1.40); 14.4%, 2.58 (1.58 to 3.99)) compared with non-cases. Bile acid sequestrants were prescribed in 5.8% and 18.0% of the PBC and PBC-pruritus populations, respectively. In general, a higher prevalence of comorbidities and medication use was observed in the PBC-pruritus subpopulation compared with the PBC population and non-cases.
Despite the availability of treatments for PBC, the PBC population had a higher burden of comorbidities than non-cases. This burden was even greater in the PBC-pruritus subpopulation, with a particularly high prevalence of sleep disorders and depression/anxiety. Despite this, pruritus remains undertreated, highlighting a need for treatments specifically indicated for cholestatic pruritus.
End-stage chronic liver disease is associated with accelerated ageing and increased frailty. Frailty measures have provided clinical utility in identifying patients at increased risk of poor health outcomes, including those awaiting liver transplantation. However, there are limited data on the prevalence and severity of frailty in patients with non-cirrhotic non-alcoholic fatty liver disease (NAFLD). The aim of this study was to evaluate the prevalence of frailty and prefrailty in patients with non-cirrhotic NAFLD and correlate with severity of liver disease.
A cross-sectional analysis of functional and laboratory frailty assessments, including the Fried frailty index (FFI), a self-reported frailty index (SRFI) and a lab-based frailty index (FI-LAB), was performed in a cohort of 109 patients with NAFLD, and results compared with fibrosis staging based on transient elastography.
Patients with NAFLD had a high prevalence of prefrailty and frailty, with a median SRFI score of 0.18 (IQR: 0.18), FFI of 1 (IQR: 1) and FI-LAB of 0.18 (IQR: 0.12). Using the SRFI, 45% of F0/F1 patients were classified as prefrail and 20% were classified as frail, while in F2/F3 patients this increased to 36% and 41%, respectively. SRFI, 30 s sit-to-stand and FI-LAB scores increased with increasing liver fibrosis stages (p=0.001, 0.006 and <0.001, respectively). On multivariate linear regression, female gender was identified as a significant predictor of elevated frailty scores.
This study identifies a high prevalence of frailty in individuals with non-cirrhotic NAFLD. Addressing frailty through early rehabilitation interventions may reduce overall morbidity and mortality in this population.
The diagnostic performance of endoscopic ultrasound (EUS) for stratification of head of pancreas and periampullary tumours into resectable, borderline resectable and locally advanced tumours is unclear as is the effect of endobiliary stents. The primary aim of the study was to assess the diagnostic performance of EUS for resectability according to stent status.
A retrospective study was performed. All patients presenting with a solid head of pancreas mass who underwent EUS and surgery with curative intent during an 8-year period were included. Factors with possible impact on diagnostic performance of EUS were analysed using logistic regression.
Ninety patients met the inclusion criteria and formed the study group. A total of 49 (54%) patients had an indwelling biliary stent at the time of EUS, of which 36 were plastic and 13 were self-expanding metal stents (SEMS). Twenty patients underwent venous resection and reconstruction (VRR). Staging was successfully performed in 100% of unstented cases, 97% of those with plastic stents and 54% of those with SEMS (p<0.0001). In successfully staged patients, sensitivity, specificity, accuracy, positive predictive value (PPV) and negative predictive value (NPV) for classification of resectability were 70%, 70%, 70%, 42% and 88%, respectively. For vascular involvement (VI), sensitivity, specificity, accuracy, PPV and NPV were 80%, 68%, 69%, 26% and 96%, respectively. Increasing tumour size (OR 0.53, 95% CI 0.30 to 0.95) was associated with decreased accuracy of VI classification.
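The performance measures reported above all derive from a standard 2×2 confusion matrix. As a minimal sketch, with hypothetical counts rather than the study's data:

```python
# Standard diagnostic-performance metrics from confusion-matrix counts.
# The counts in the example are hypothetical, not the study's data.
def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

m = diagnostic_metrics(tp=8, fp=11, fn=2, tn=59)  # hypothetical 80-patient example
print({k: round(v, 2) for k, v in m.items()})
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on how common the condition is in the cohort.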
EUS has modest diagnostic performance for stratification of staging. Staging was less likely to be completed when a SEMS was in situ. Staging EUS should ideally be performed before endoscopic retrograde cholangiopancreatography and biliary drainage.
Heyde’s syndrome (HS), a rare condition characterised by a unique relationship between severe aortic stenosis and angiodysplasia, is often diagnosed late, increasing the risk of a prolonged hospital course and mortality in the elderly. The leading hypothesis for the aetiology of HS is acquired von Willebrand syndrome (AVWS), but not all studies support this claim. While individual cases of HS have been reported, here we present the first systematic review of case reports, with a focus on the prevalence of AVWS.
A systematic search was conducted through PubMed/MEDLINE, CINAHL-EBSCO, Web of Science and Google Scholar since inception. The resulting articles were screened by two independent reviewers based on inclusion criteria that the article must be a case report/series or a letter to the editor in English describing HS in an adult patient.
Seventy-four articles encompassing 77 cases met the inclusion criteria. The average age was 74.3±9.3 years, with a slight female predominance. The small intestine, especially the jejunum, was the most common location of bleeding origin. Capsule endoscopy and double balloon enteroscopy were superior to colonoscopy (p=0.0027 and p=0.0095, respectively) and oesophagogastroduodenoscopy (p=0.0006 and p=0.0036, respectively) at identifying bleeding sources. The mean duration from symptom onset to diagnosis/treatment of HS was 23.8±39 months. Only 27/77 cases provided evidence for AVWS. Surgical and transcatheter aortic valve replacement (AVR) were superior to non-AVR modalities at preventing rebleeding (p<0.0001).
Further research is warranted for a stronger understanding and increased awareness of HS, which may hasten diagnosis and optimal management.
Alanine aminotransferase (ALT) is a marker of hepatic damage, and its level can be affected by viral hepatitis, alcoholic hepatitis and non-alcoholic fatty liver disease. We aimed to study the factors associated with a higher ALT level and update the upper limit of normal (ULN) in the Vietnamese population.
This cross-sectional study enrolled 8383 adults, aged 18 years and older who visited the Medical Center at Ho Chi Minh City for a health check-up. Following the exclusion criteria, 6677 subjects were included in the analysis.
Age ≤40 years, male gender, body mass index >23 kg/m2, diastolic blood pressure >85 mm Hg, cholesterol >5.2 mmol/L, triglyceride >1.7 mmol/L, hepatitis B surface antigen positivity, anti-hepatitis C virus positivity and fatty liver were associated with a higher ALT level (>40 U/L) (p<0.05). Without considering age and gender, the healthy group was defined by excluding participants with any of these contributing factors. The median ALT level in the healthy group was 18 U/L in men and 13 U/L in women. The ULN at the 95th percentile of the healthy group was 40 U/L in men and 28 U/L in women.
The ULN for ALT in healthy women was lower than in healthy men. Updated ULN for ALT level can promote the identification of unhealthy subjects. More studies that involve ethnicity and lifestyle factors are needed to confirm the new ULN in the Vietnamese population.
The study objective was to compare gut microbiome diversity and composition in SARS-CoV-2 PCR-positive patients whose symptoms ranged from asymptomatic to severe versus PCR-negative exposed controls.
Using a cross-sectional design, we performed shotgun next-generation sequencing on stool samples to evaluate gut microbiome composition and diversity in patients with SARS-CoV-2 PCR-confirmed infection who had presented to Ventura Clinical Trials for care from March 2020 through October 2021, and in SARS-CoV-2 PCR-negative exposed controls. Patients were classified as being asymptomatic or having mild, moderate or severe symptoms based on National Institutes of Health criteria. Exposed controls were individuals with prolonged or repeated close contact with patients with SARS-CoV-2 infection or their samples, for example, household members of patients or frontline healthcare workers. Microbiome diversity and composition were compared between patients and exposed controls at all taxonomic levels.
Compared with controls (n=20), severely symptomatic SARS-CoV-2-infected patients (n=28) had significantly less bacterial diversity (Shannon Index, p=0.0499; Simpson Index, p=0.0581), and positive patients overall had lower relative abundances of Bifidobacterium (p<0.0001), Faecalibacterium (p=0.0077) and Roseburia (p=0.0327), while having increased Bacteroides (p=0.0075). Interestingly, there was an inverse association between disease severity and the abundance of the same bacteria.
We hypothesise that low bacterial diversity and depletion of the genus Bifidobacterium either before or after infection led to reduced proimmune function, thereby allowing SARS-CoV-2 infection to become symptomatic. This particular dysbiosis pattern may be a susceptibility marker for symptomatic severity from SARS-CoV-2 infection and may be amenable to preinfection, intrainfection or postinfection intervention.
NCT04031469 (PCR–) and NCT04359836 (PCR+).
Transjugular intrahepatic portosystemic shunt (TIPS) placement is used to treat the sequelae of portal hypertension, including refractory variceal bleeding, ascites and hepatic hydrothorax. However, hernia-related complications such as incarceration and small bowel obstruction can occur after TIPS placement in patients with pre-existing hernias. The aim of this study was to determine the incidence of hernia complications in the first year after TIPS placement and to identify patient characteristics leading to an increased risk of these complications.
This retrospective analysis included patients with pre-existing abdominal hernias who underwent primary TIPS placement with covered stents at our institution between 2004 and 2018. The 1-year hernia complication rate and the average time to complications were documented. Using a Wilcoxon rank-sum test, the characteristics of patients who developed hernia-related complications versus the characteristics of those without complications were compared.
A total of 167 patients with pre-existing asymptomatic abdominal hernias were included in the analysis. The most common reason for TIPS placement was refractory ascites (80.6%). A total of 36 patients (21.6%) developed hernia-related complications after TIPS placement, including 20 patients with acute complications and 16 with non-acute complications. The mean time to presentation of hernia-related complications was 66 days. Patients who developed hernia-related complications were more likely than those without complications to have liver cirrhosis secondary to alcohol consumption (p=0.049), although this association was no longer significant after multivariate analysis.
Within 1 year after TIPS placement, approximately 20% of patients with pre-existing hernias develop hernia-related complications, typically within the first 2 months after the procedure. Patients with pre-existing hernia undergoing TIPS placement should be educated regarding the signs and symptoms of hernia-related complications, including incarceration and small bowel obstruction.
There is a paucity of studies in the literature evaluating short-term outcomes following endoscopic retrograde cholangiopancreatography (ERCP) in patients with inoperable malignant hilar biliary obstruction (MHBO). We aimed primarily to evaluate 30-day mortality in these patients and, secondarily, to conduct a systematic review of studies reporting 30-day mortality.
We conducted a retrospective analysis of all patients with inoperable MHBO who underwent ERCP at Leeds Teaching Hospitals NHS Trust between February 2015 and September 2020. Logistic regression models constructed from baseline patient data, the modified Glasgow Prognostic Score (mGPS) and Charlson Comorbidity Index (CCI) were evaluated as predictors of 30-day mortality.
Eighty-seven patients (49 males) with a mean age of 70.4 years (SD ±12.3) were included. Cholangiocarcinoma was the most common aetiology of MHBO, affecting 35/87 (40.2%). Technical success was achieved in 72/87 (82.8%). The 30-day mortality rate was 25.3% (22/87), of which 16 deaths were due to progression of underlying malignant disease. On multivariate analysis, only leucocytosis (OR 4.12, 95% CI 2.70 to 7.41, p=0.02) was an independent predictor of 30-day mortality. Neither the mGPS (p=0.47) nor the CCI with a cut-off value of ≥7 (p=0.06) was a significant predictor of 30-day mortality.
We demonstrated that 30-day mortality following ERCP for inoperable MHBO remains high despite technical success. Further studies are warranted to identify patients most appropriate for intervention.
MicroRNAs (miRNAs) are implicated in the pathogenesis of autoimmune diseases and could be biomarkers of disease activity. This study aimed to identify highly expressed circulating miRNAs in patients with autoimmune hepatitis (AIH) and to evaluate their association with clinical characteristics.
Microarray-based miRNA expression profiling was performed on serum from patients with AIH, primary biliary cholangitis (PBC) and overlap syndrome (OS) and from healthy individuals. Samples were divided into discovery and test sets to identify candidate miRNAs that could discriminate AIH from PBC; the former included 21 AIH and 23 PBC samples, while the latter included five AIH and eight PBC samples.
Among 11 candidate miRNAs extracted in the discovery set, 4 (miR-3196, miR-6125, miR-4725-3p and miR-4634) were specifically and highly expressed in patients with AIH in the test set. These four miRNAs discriminated AIH from PBC with high sensitivity (0.80–1.00) and specificity (0.88–1.00). In situ hybridisation analysis revealed that these miRNAs were expressed in the cytoplasm of hepatocytes in patients with AIH. Their expression levels were highest in untreated patients with AIH, followed by those in untreated patients with OS. They decreased markedly or moderately after prednisolone treatment. Histological analysis demonstrated that the expression levels of miR-3196, miR-6125 and miR-4634 in patients with AIH and OS were correlated with severe hepatic necroinflammatory activity.
These circulating miRNAs are suggested to reflect hepatic necroinflammatory activity and serve as AIH-related and treatment-responsive biomarkers. These miRNAs could be beneficial in developing new therapeutic strategies for AIH.
Percutaneous gastrostomy (PG) is a common procedure that enables long-term enteral nutrition. However, data on the durability of individual tube types are insufficient. We conducted this study to compare the longevities and features of different PG tube types.
We performed a 5-year retrospective analysis of patients who underwent endoscopic and radiologic PG-related feeding tube procedures. The primary and secondary outcomes were tube exchange intervals and revenue costs, respectively. Demographic factors, underlying diseases, operator expertise, materials used, and complication profiles were assessed.
A total of 599 PG-related procedures for inserting pull-type PG (PGP), balloon-type PG (PGB), PG jejunal MIC* (PGJM; gastrojejunostomy type) and PG jejunal Levin (PGJL) tubes were assessed. On univariate Kaplan-Meier analysis, PGP tubes showed longer median exchange intervals than PGB tubes (405 days (95% CI: 315 to 537) vs 210 days (95% CI: 188 to 238); p<0.001). Larger PGB tube diameters were associated with longer durations than smaller ones (24 Fr: 262 days (95% CI: 201 to NA), 20 Fr: 216 days (95% CI: 189 to 239) and 18 Fr: 148 days (95% CI: 100 to 245)). PGJL tubes lasted longer than PGJM tubes (median durations: 168 days (95% CI: 72 to 372) vs 13 days (95% CI: 23 to 65); p<0.001). Multivariate Cox proportional hazards regression analysis revealed that PGJM tubes had significantly higher failure rates than PGJL tubes (OR 2.97 (95% CI: 1.17 to 7.53); p=0.022). PGB tube insertion by general practitioners was the least costly, PGP tube insertion by endoscopists was 2.9-fold more expensive, and endoscopic PGJM tubes were the most expensive at twice the cost of PGJL tubes.
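The median exchange intervals above come from Kaplan-Meier analysis, in which tubes still in place at last follow-up are censored rather than discarded. A minimal sketch of how a KM median is read off the survival curve, using hypothetical exchange data:

```python
# Minimal Kaplan-Meier estimator (no ties handled, for brevity) showing
# how a median exchange interval is read off the survival curve.
# The data below are hypothetical: days until exchange, with event=1
# meaning the tube was exchanged and event=0 meaning censored.
def km_median(times, events):
    """Smallest time at which the KM survival estimate drops to <= 0.5."""
    s = 1.0
    at_risk = len(times)
    for t, e in sorted(zip(times, events)):
        if e:                 # an exchange (event) occurred at time t
            s *= 1 - 1 / at_risk
            if s <= 0.5:
                return t
        at_risk -= 1          # events and censored both leave the risk set
    return None               # median not reached within follow-up

times = [100, 150, 200, 210, 250, 300, 400, 420]
events = [1, 1, 0, 1, 1, 1, 1, 0]
print(km_median(times, events))
```

Censoring is why the KM median can differ from the naive median of observed times.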
PGP tubes require replacement less often than PGB tubes, but the latter are more cost-effective. Moreover, PGJL tubes last longer than PGJM counterparts and, owing to lower failure rates, may be more suitable for high-risk patients.
If non-invasive markers of liver fibrosis were recorded frequently enough in clinical practice, it might be feasible to use them for opportunistic community screening for liver disease. We aimed to determine their current pattern of usage in the national primary care population in Wales.
Using the Secure Anonymised Information Linkage (SAIL) Databank at Swansea University (2000–2017), we quantified the frequency of common liver blood tests (aspartate aminotransferase (AST), alanine aminotransferase (ALT), platelet count and albumin) used in fibrosis marker algorithms. We examined measurement variation by age and sex.
During the 18-year study period, there were 2 145 178 adult patients with at least one blood test available for analysis. Over the study period, the percentage of SAIL patients receiving an ALT test in each year increased from 2% to 33%, with platelet count and albumin measurement increasing by a similar factor. AST testing, although initially rising, had decreased to 1% by the end of the study. AST and ALT values varied by age and sex, particularly in males with the upper normal range of ALT values decreasing rapidly from 90 U/L at age 30 to 45 U/L by age 80.
The reduction in AST testing to only 1% of the adult population limits the use of many non-invasive liver marker algorithms. To enable widespread screening, alternative algorithms for liver fibrosis that do not depend on AST should be developed. Liver fibrosis markers should be modified to include age-specific and sex-specific normal ranges.
To evaluate the feasibility and diagnostic performance of acoustic radiation force impulse (ARFI) elastography in different omental masses (OM).
This was a retrospective analysis of 106 patients with OM, defined as omental thickness ≥1 cm, who underwent abdominal B-mode ultrasound (US) and standardised ARFI examinations of the OM between September 2018 and June 2021 at our university hospital. Cytohistological confirmation was available in 91/106 (85.8%) of all OM, including all 65/65 (100%) malignant OM (mOM) and 26/41 (63.4%) of benign OM (bOM). In the remaining 15/41 (36.6%) of bOM, cross-sectional imaging and/or US follow-up with a mean duration of 19.8±3.1 months was performed. To examine mean ARFI velocities (MAV) for potential cut-off values between bOM and mOM, a receiver operating characteristic analysis was performed.
The MAV in the mOM group (2.71±1.04 m/s) was significantly higher than that of bOM group (1.27±0.87 m/s) (p<0.001). Using 1.97 m/s as a cut-off yielded a sensitivity and specificity of 76.9% and 85.4%, respectively, in diagnosing mOM (area under the curve=0.851, 95% CI=0.774 to 0.928).
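The cut-off evaluation described above (sensitivity and specificity at a velocity threshold, with AUC from the receiver operating characteristic analysis) can be sketched as follows; the MAV values are hypothetical readings in m/s, not the study's data.

```python
# Sensitivity/specificity at a velocity cut-off, and AUC via the
# Mann-Whitney formulation (probability a random malignant MAV exceeds
# a random benign MAV, ties counting half). Values are hypothetical.
def sens_spec(malignant, benign, cutoff):
    sens = sum(v >= cutoff for v in malignant) / len(malignant)
    spec = sum(v < cutoff for v in benign) / len(benign)
    return sens, spec

def auc(malignant, benign):
    wins = sum((m > b) + 0.5 * (m == b) for m in malignant for b in benign)
    return wins / (len(malignant) * len(benign))

mOM = [2.1, 2.5, 2.8, 3.4, 1.8]  # hypothetical malignant MAVs (m/s)
bOM = [0.9, 1.2, 1.4, 2.0, 1.1]  # hypothetical benign MAVs (m/s)
print(sens_spec(mOM, bOM, 1.97), round(auc(mOM, bOM), 2))
```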
ARFI elastography is feasible in the omentum and may represent a good non-invasive additional tool in differentiating bOM from mOM.
The effects of food sensitivity in ostomates are unknown and can easily be masked by other digestive symptoms. We investigated the presence of food-specific-IgG in ostomates relative to participants affected by other digestive diseases.
Food-specific-IgG was evaluated for 198 participants with a panel of 109 foods. Immunocompetency status was also tested. Jejunostomates, ileostomates and colostomates were compared with individuals with digestive tract diseases with inflammatory components (periodontitis, eosinophilic esophagitis, duodenitis, ulcerative colitis, Crohn’s disease and appendicitis), as well as food malabsorption due to intolerance. A logistic regression model with covariates was used to estimate the effect of the experimental data and demographic characteristics on the likelihood of the immune response.
Jejunostomates and ileostomates had a significant risk of presenting circulating food-specific-IgG in contrast to colostomates (OR 12.70 (p=0.002), 6.19 (p=0.011) and 2.69 (p=0.22), respectively). Crohn’s disease, eosinophilic esophagitis and food malabsorption groups also showed significantly elevated risks (OR 4.67 (p=0.048), 8.16 (p=0.016) and 18.00 (p=0.003), respectively), but not the ulcerative colitis group (OR 2.05 (p=0.36)). Individuals with profoundly or significantly reduced, and mild to moderately reduced, levels of total IgG were protected from the formation of food-specific IgG (OR 0.09 (p<0.001) and 0.33 (p=0.005), respectively). Males were at higher risk than females.
The strength of a subject’s immunocompetence plays a role in the intensity to which the humoral system responds via food-specific-IgG. An element of biogeography emerges in which the maintenance of a colonic space might influence the risk of having circulating food-specific-IgG in ostomates.
Oncology surgeons use animals and cadavers in training because of a lack of alternatives. The aim of this work was to develop a design methodology to create synthetic liver models familiar to surgeons, and to help plan, teach and rehearse patient-specific cancerous liver resection surgery.
Synthetic gels were selected and processed to recreate accurate anthropomorphic qualities. Organic and synthetic materials were mechanically tested with the same equipment and standards to determine physical properties like hardness, elastic modulus and viscoelasticity. Collected data were compared with published data on the human liver. Patient-specific CT data were segmented and reconstructed and additive manufactured models were made of the liver vasculature, parenchyma and lesion. Using toolmaking and dissolvable scaffolds, models were transformed into tactile duplicates that could mimic liver tissue behaviour.
Porcine liver tissue hardness was found to be 23 H00 (±0.1) and synthetic liver was 10 H00 (±2.3), while human parenchyma was reported as 15.06 H00 (±2.64). Average elastic Young’s modulus of human liver was reported as 0.012 MPa, and synthetic liver was 0.012 MPa, but warmed porcine parenchyma was 0.28 MPa. The final liver model demonstrated a time-dependent viscoelastic response to cyclic loading.
Synthetic liver was better than porcine liver at recreating the mechanical properties of living human liver. Warmed porcine liver was more brittle, less extensible and stiffer than both human and synthetic tissues. Qualitative surgical assessment of the model by a consultant liver surgeon showed vasculature was explorable and that bimanual palpation, organ delivery, transposition and organ slumping were analogous to human liver behaviour.
Despite the evidence for adverse pregnancy outcomes, non-alcoholic fatty liver disease (NAFLD) is not routinely addressed in early pregnancy. The Fatty Liver Index (FLI) has been proposed as a screening tool for NAFLD in the general population. We aim to develop mathematical models for predicting NAFLD in pregnancy and validate the FLI for first-trimester pregnant women.
Biochemical and biophysical parameters were analysed in pregnant women at a period of gestation <12 weeks from the Rajarata Pregnancy Cohort, Sri Lanka. Fatty liver was graded (FLG) as 0, I or II by ultrasound scan. Binary logistic regression models were employed to identify factors predicting FLG-II. Six FLIs were developed to predict FLG-II. The validity of the FLIs was compared using receiver operating characteristic curves.
The study sample consisted of 632 pregnant women with a mean age of 28.8 years (SD: 5.8 years). Age (OR: 1.6, 95% CI 1.1 to 2.3), body mass index (OR: 1.7, 95% CI 1.1 to 2.5) and gamma-glutamyl transferase levels (OR: 2.1, 95% CI 1.5 to 3.0) were the independent predictors of FLG-II. While the model with liver enzymes provided the best prediction of NAFLD (both FLG I and II) (area under the curve [AUC]: 0.734), the highest AUC (0.84) for predicting FLG-II was observed with the full model (the model with all parameters). The proposed budget model (AUC >0.81) is the best model for screening for fatty liver in a community health setting.
FLIs could be used as screening tools for NAFLD based on resource availability in different settings. External validation of the FLI and further investigation of the proposed FLI as a predictor of adverse pregnancy outcomes are recommended.
To investigate the prevalence and associated factors of persistent symptoms despite a strict gluten-free diet in adult patients with coeliac disease diagnosed in childhood.
Medical data on 239 currently adult patients diagnosed in childhood were collected from patient records. In addition, patients completed a structured study questionnaire. All variables were compared between those with and without persistent symptoms.
Altogether 180 patients reported adhering to a strict gluten-free diet. Of these, 18% experienced persistent symptoms, including various gastrointestinal symptoms (73%), arthralgia (39%), fatigue (39%), skin symptoms (12%) and depression (6%). Those reporting persistent symptoms more often had gastrointestinal comorbidities (19% vs 6%, p=0.023), health concerns (30% vs 12%, p=0.006) and experiences of restrictions on daily life (64% vs 43%, p=0.028) than the asymptomatic subjects. The patients with symptoms had poorer general health (median score 13 vs 14, p=0.040) and vitality (15 vs 18, p=0.015) based on the validated Psychological General Well-Being Questionnaire, and more severe symptoms on the Gastrointestinal Symptom Rating Scale (total score 2.1 vs 1.7, p<0.001). Except for general health, these differences remained significant after adjusting for comorbidities. The groups were comparable in current sociodemographic characteristics. Furthermore, none of the childhood features, including clinical, serological and histological presentation at diagnosis, and adherence and response to the diet after 6–24 months, predicted symptom persistence in adulthood.
Almost one-fifth of adult patients diagnosed in childhood reported persistent symptoms despite a strict gluten-free diet. The ongoing symptoms were associated with health concerns and impaired quality of life.
Investigations were performed in a 50-year-old, otherwise healthy male with recurrent hiccups, in whom contractions persisted for up to 4 hours. Hiccups were initiated by drinking carbonated soda. The aerodigestive tract was visualised by video fluoroscopy. Hiccups were terminated by drinking a non-viscous contrast agent through a forced inspiratory suction and swallow tool. This device requires significant suction pressure (–100 mm Hg) to draw fluid into the mouth and is effective in approximately 90% of cases. The images were analysed together with concurrent audio recordings to gain insight into what causes the ‘hic’ in hiccups and how this commonplace but annoying problem can be treated.
Appendectomy may modulate the risk of inflammatory bowel disease through an effect on the gut microbiota. This study investigated the associations between appendectomy and incidence of Crohn’s disease (CD) or ulcerative colitis (UC), with an emphasis on the influence of age and time post appendectomy.
This cohort study included 400 520 subjects born in Québec in 1970–1974 and followed until 2014. Administrative health data were used to ascertain appendectomy and cases of CD and UC. Cox proportional hazards models with time-dependent variables (appendectomy and time elapsed post appendectomy) allowed for the estimation of HRs and 95% CIs.
A total of 2545 (0.6%) CD cases and 1134 (0.3%) UC cases were identified during follow-up. Appendectomy increased the risk of CD (HR=2.02; 95% CI: 1.66 to 2.44), especially when performed at 18–29 years of age. The risk of CD was increased in the first 2 years, and decreased significantly after ≥15 years post appendectomy. Appendectomy appeared to protect against UC (HR=0.39; 95% CI: 0.22 to 0.71). The risk of UC was not associated with age at appendectomy, but decreased with time elapsed post appendectomy (HR=0.21; 95% CI: 0.06 to 0.72, comparing ≥5 with 0–4 years after appendectomy).
The increased risk of CD related to appendectomy in young adults may result from detection bias, but physicians should have a low threshold for suspicion of CD in young symptomatic adults with a history of appendectomy. A strong protective effect of appendectomy against UC was observed after 5 years.
There is substantial variation in colonoscopy use and evidence of long wait times for the procedure. Understanding the role of system-level resources in colonoscopy utilisation may point to a potential intervention target to improve colonoscopy use. This study characterises colonoscopy resource availability in Ontario, Canada and evaluates its relationship with colonoscopy utilisation.
We conducted a population-based study using administrative health data to describe regional variation in colonoscopy availability for Ontario residents (age 18–99) in 2013. We identified 43 colonoscopy networks in the province in which we described variations across three colonoscopy availability measures: colonoscopist density, private clinic access and distance to colonoscopy. We evaluated associations between colonoscopy resource availability and colonoscopy utilisation rates using Pearson correlation and log binomial regression, adjusting for age and sex.
There were 9.4 full-time equivalent colonoscopists per 100 000 Ontario residents (range across 43 networks 0.0 to 21.8); 29.5% of colonoscopies performed in the province were done in private clinics (range 1.2%–55.9%). The median distance to colonoscopy was 3.7 km, with 5.9% travelling at least 50 km. Lower colonoscopist density was correlated with lower colonoscopy utilisation rates (r=0.53, p<0.001). Colonoscopy utilisation rates were 4% lower in individuals travelling 50 to <200 km and 11% lower in individuals travelling ≥200 km to colonoscopy, compared to <10 km. There was no association between private clinic access and colonoscopy utilisation.
The substantial variation in colonoscopy resource availability, and the demonstrated relationship between availability and use, provide impetus for health service planners and decision-makers to address these potential inequalities in access and thereby support the use of this medically necessary procedure.
Management of erosive oesophagitis (EE) remains suboptimal, with many patients experiencing incomplete healing, ongoing symptoms and relapse despite proton pump inhibitor (PPI) treatment. The Study of Acid-Related Disorders investigated the burden experienced by individuals with EE in a real-world setting.
US gastroenterologists (GIs) or family physicians (FPs)/general practitioners (GPs) treating patients with EE completed a physician survey and enrolled up to four patients with EE for a patient survey, with prespecified data extracted from medical records.
102 GIs and 149 FPs/GPs completed the survey; data were available for 73 patients (mean age at diagnosis, 45.4 years). Omeprazole was the healthcare professional (HCP)-preferred first-line treatment (60.8% GIs; 56.4% FPs/GPs), and pantoprazole the preferred second-line treatment (29.4% and 32.9%, respectively). Price and insurance coverage (both 55.5% of HCPs) and familiarity (47.9%) were key drivers for omeprazole; insurance coverage (52.0%), price (50.0%), familiarity (48.0%), initial symptom relief (46.0%) and safety (44.0%) were key drivers for pantoprazole. Only 49.3% of patients took medication as instructed all the time; 56.8% independently increased medication frequency some of the time. Despite treatment, 57.5% of patients experienced heartburn and 30.1% regurgitation; heartburn was the most bothersome symptom. 58.9% of patients believed that their symptoms could be better controlled; only 28.3% of HCPs were very satisfied with current treatment options. 83.6% of patients wanted long-lasting treatment options. Fast symptom relief for patients was a top priority for 66.1% of HCPs, while 56.6% would welcome alternatives to PPIs.
This real-world multicentre study highlights the need for new, rapidly acting treatments in EE that reduce symptom burden, offer durable healing and provide symptom control.
The Dietary Inflammatory Index (DII) is a documented nutritional tool for assessing diet-induced inflammation that has been linked to various diseases/outcomes. The association between DII and gallstone disease (GSD) is yet to be explored. The objective of this study was to examine the association between DII and GSD.
This cross-sectional study was conducted using baseline data from the Dena PERSIAN cohort. The analysed data included demographic information, lifestyle variables, body mass index, history of diabetes and fatty liver, and laboratory test results. The 113-item Food Frequency Questionnaire was used to estimate participants’ dietary intake and quantify the inflammatory potential of each individual’s diet. The DII score was analysed both as a continuous variable and in quartiles. Univariable and multivariable logistic regressions were used to investigate the relationship between GSD and DII scores.
Out of 3626 individuals entering the study, 173 (4.77%) had GSD. The median DII was –0.08 (IQR=0.18). In the unadjusted model, the odds of having GSD were significantly higher in the first and second quartiles of DII (anti-inflammatory diet) than in higher quartiles (proinflammatory diet). In the adjusted model, the odds of having GSD in the third and fourth quartiles of DII scores compared with the first quartile were OR=0.59 (95% CI 0.36 to 0.95) and OR 0.51 (95% CI 0.30 to 0.84), respectively.
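Adjusted odds ratios like those above are derived from logistic-regression coefficients: OR = exp(β), with the 95% CI given by exp(β ± 1.96·SE). A minimal sketch of that conversion, using invented coefficient values rather than this study’s model output:

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """OR and 95% CI from a logistic-regression coefficient and its SE."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Invented example values, for illustration only
or_, lo, hi = odds_ratio_ci(beta=-0.527, se=0.243)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # → 0.59 0.37 0.95
```

A negative coefficient yields an OR below 1 (lower odds in that quartile relative to the reference), which is how the protective associations in the adjusted model are read.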
The results of this study suggest that a proinflammatory diet is associated with a reduced chance of GSD. However, longitudinal studies are needed to examine the causal association.
The growing importance of non-communicable diseases (NCDs) and high HIV prevalence in urban African settings may increase the burden of metabolic dysfunction-associated fatty liver disease (MAFLD). We assessed liver steatosis among HIV-positive and negative adults in urban Zambia.
Adults aged 30 years and older who were newly diagnosed with HIV, or tested HIV-negative, at two primary care clinics in Lusaka, Zambia, were assessed for liver steatosis. Cardiometabolic data were collected through comprehensive clinical and laboratory assessments. Transient elastography was performed to measure the controlled-attenuation parameter, with steatosis defined as ≥248 dB/m. We used multivariable logistic regression models to determine the factors associated with the presence of steatosis.
We enrolled 381 patients, including 154 (40%) antiretroviral therapy-naïve people living with HIV (PLWH) with a median CD4+ count of 247 cells/mm3 and a mean body mass index (BMI) of 23.8 kg/m2. Liver steatosis was observed in 10% of participants overall and was more common among HIV-negative adults than in PLWH (15% vs 3%). The proportion of patients with steatosis was 25% among obese (BMI ≥30 kg/m2) participants, 12% among those overweight (BMI 25–29.9 kg/m2), and 7% among those with a BMI <25 kg/m2. Among patients with a fasting glucose ≥7 mmol/L or confirmed diabetes, 57% had liver steatosis. In multivariable analyses, HIV status (adjusted odds ratio (aOR) 0.18, 95% CI 0.06 to 0.53), confirmed diabetes or elevated fasting glucose (aOR 3.92, 95% CI 1.57 to 9.78) and elevated blood pressure (aOR 2.95, 95% CI 1.34 to 6.48) were associated with steatosis. The association between BMI >25 kg/m2 and liver steatosis was attenuated after adjustment for potential confounders (aOR 1.96, 95% CI 0.88 to 4.40). Overall, 21 (9%) participants without HIV and 4 (3%) with HIV met the criteria for MAFLD. Among individuals with liver steatosis, 65% (95% CI 49% to 80%) fulfilled criteria of MAFLD, whereas 15 (39%) of them had elevated transaminases and 3 (8%) F2–F4 fibrosis.
The prevalence of liver steatosis in this urban cohort of HIV-positive and negative adults in Zambia was low, despite a large proportion of patients with high BMI and central obesity. Our study is among the first to report data on MAFLD among adults in Africa, demonstrating that metabolic risk factors are key drivers of liver steatosis and supporting the adoption of the criteria for MAFLD in African populations.
Iron deficiency anaemia (IDA) in women aged 20–49 years may be caused by menses or gastrointestinal cancer. Data are sparse on the yield of endoscopy/colonoscopy in this population. Our aim was to determine the association of IDA and symptoms with cancers.
Retrospective cohort study within Kaiser Permanente Northern California. Participants were women aged 20–49 years tested for iron stores and anaemia during 1998, 2004 and 2010 and followed for 5 years for outcomes of oesophageal, gastric and colon cancers. Symptoms from the three prior years were grouped into dysphagia, upper gastrointestinal (UGI), lower gastrointestinal (LGI), rectal bleeding and weight loss.
Among 9783 anaemic women aged 20–49 years, there were no oesophageal, 6 gastric and 26 colon cancers. Incidences per 1000 for gastric cancer with and without iron deficiency (ID) were 0.60 (95% CI 0.23 to 1.55) and 0.63 (95% CI 0.17 to 2.31), and for colon cancer, 2.72 (95% CI 1.72 to 4.29) and 2.53 (95% CI 1.29 to 4.99). Endoscopies for UGI or dysphagia symptoms, rather than bidirectional endoscopy for ID, yielded more gastric cancers (n=5 and n=4, respectively) with fewer procedures (3793 instead of 6627). Colonoscopies for LGI symptoms or rectal bleeding, instead of for ID, would have detected more colon cancers (n=19 and n=18) with about 40% of the procedures (2793/6627).
UGI and colon cancers were rare in women of menstruating age and when controlled for anaemia were as common without as with ID. Using symptoms rather than IDA as an indication for endoscopy found equal numbers of cancers with fewer procedures.
Although clinical guidelines exist, the diagnostic work-up for diagnosing inflammatory bowel disease (IBD) is complex and varies in clinical practice. This study used real-life data to characterise the current diagnostic procedures used to establish IBD diagnoses in a Danish nationwide setting.
Person-level data on patients diagnosed with IBD between 1 January 2014 and 30 June 2018 were linked between Danish health registers. Information on age, sex, registration of other gastrointestinal diseases, and diagnostic procedures (endoscopies, biopsies, and imaging) performed in relation to the first IBD hospital admission was analysed for the total study population and was stratified by IBD type, sex, and age.
Of the 12 871 included patients with IBD, the majority underwent endoscopy (84%), had a biopsy taken (84%) and/or underwent imaging procedures (44%). In total, 7.5% of the population (6% for Crohn’s disease and 8% for ulcerative colitis) were diagnosed with IBD despite not undergoing any of these diagnostic procedures. Patients with Crohn’s disease underwent more procedures than patients with ulcerative colitis (94% vs 92%, p<0.001). Children underwent slightly fewer diagnostic procedures than adults (92% vs 93%, p=0.004). The proportion undergoing at least one procedure also differed slightly between men and women (92% vs 94%, p<0.001).
For 7.5% of patients with IBD, this study did not detect any registrations of the recommended diagnostic procedures for establishing an IBD diagnosis. Further research is needed to examine whether these findings are mainly explained by limitations of the register data or also indicate shortcomings of the general approach to IBD.
Biologic and small-molecule therapies have revolutionised the treatment of moderate-to-severe inflammatory bowel disease (IBD). A significant proportion of patients experience early or delayed treatment failure. Patients with IBD with greater visceral obesity are less likely to respond to biologics. Sarcopenia has been identified as a predictor of disease severity and need for rescue therapy in acute severe ulcerative colitis. The aim of this study is to assess the feasibility of a physician-derived exercise programme in patients with IBD commencing biologic or small-molecule therapy in addition to the effect on physical fitness, body composition and objective measures of quality of life, fatigue scores and disease activity.
This is a randomised controlled feasibility study comparing the effects of a physician-derived exercise programme and standard medical care (biologic/small-molecule therapy) with standard care alone in patients with moderate to severe IBD. Patients with IBD in the intervention group will undergo a structured exercise programme for 20 weeks. Both IBD groups will carry out body composition, disease activity and quality-of-life assessments at baseline, week 12 and week 26. The primary objective is to assess the feasibility of the physician-derived exercise programme in patients with IBD commencing disease-modifying therapies. Secondary endpoints include a change in cardiorespiratory fitness, disease activity/inflammation, fatigue, health-related quality of life outcomes and body composition between the two IBD groups. Exploratory endpoints include validation of anterior thigh ultrasound for sarcopenia screening, assessment of proinflammatory cytokines and markers of immunometabolism.
This study has received ethical approval from the Beaumont Hospital Ethics committee on 22 October 2021 (reference number 21/21). Data generated or analysed during this study will be published as an article and supplementary appendix in relevant medical journals. The data will also be presented at national and international conferences.
Percutaneous endoscopic gastrostomy (PEG) was developed by Ponsky-Gauderer in the early 1980s. These tubes are placed through the abdominal wall mainly to administer fluids, drugs and/or enteral nutrition but can also be used for drainage or decompression. The tubes consist of an internal and external retention device. It is a generally safe technique but major or minor complications may arise during and after tube placement.
A narrative review of the literature investigating minor complications after PEG placement.
This review was written from a clinical viewpoint focusing on prevention and management of minor complications and documented with real cases from more than 21 years of clinical practice.
According to the literature, the incidence of minor complications after gastrostomy placement can be high. To decrease associated morbidity, prevention, early recognition and proper management of these complications are important.
We aimed to study the prevalence of achlorhydria (AC) in a large Asian population.
Medical records of patients who underwent oesophagogastroduodenoscopy (OGD) with the Congo red staining method at the Vichaiyut Hospital from January 2010 to December 2019 were retrospectively reviewed.
A total of 3597 patients were recruited; 223 were excluded due to concurrent use of proton pump inhibitors. Eighteen of 3374 patients (0.53%) had AC. Seven patients presented with permanent AC (5F, 2M; median age 69 years; range 58–92). All 11 patients with temporary AC (5M, 6F; mean age 73.4 years; SD 13.2 years) had Helicobacter pylori infection and were over 45 years old. After successful treatment for H. pylori, AC resolved in all patients with temporary AC. Counting only patients over 45 years of age, the prevalence of AC was 0.68% (18/2614). No adverse events arising from Congo red occurred.
AC is relatively rare. Permanent and temporary AC were found only in patients over 55 and 45 years of age, respectively. Congo red staining of the gastric mucosa can be safely and routinely incorporated into the OGD procedure for early detection of AC. We recommend a low-cost screening test, such as serum vitamin B levels, only in patients aged 50 and over.
Despite international guidelines recommendations to use mortality as a quality criterion for gastrointestinal (GI) procedures, recent studies reporting these data are lacking. Our objective was to report death causes and rate following GI endoscopies in a tertiary university hospital.
We retrospectively reviewed all GI procedures made between January 2017 and December 2019 in our tertiary hospital in Switzerland. Data from patients who died within 30 days of the procedure were recorded.
Of 18 233 procedures, 251 patients died within 30 days following 345 (1.89%) procedures (244/9180 gastroscopies, 53/5826 colonoscopies, 23/2119 endoscopic ultrasounds, 19/911 endoscopic retrograde cholangiopancreatographies, 6/197 percutaneous endoscopic gastrostomies). Median age was 70 years (IQR 61–79) and 173/251 (68.92%) were male. Median Charlson Comorbidity Index was 5 (IQR 3–7), and 305/345 procedures (88.4%) were undertaken on patients with an ASA score ≥3. The most frequent indications were suspected GI bleeding (162/345; 46.96%) and suspected cancer or tumour staging (50/345; 14.49%). Major causes of death were oncological progression (72/251; 28.68%), cardiopulmonary failure or cardiac arrest of unknown origin (62/251; 24.7%) and liver failure (20/251; 7.96%). No deaths were caused by procedural complications such as perforation or bleeding.
Progression of malignancies unrelated to the procedure was the leading cause of short-term death following a GI procedure. After improvements in periprocedural care over recent decades, we should focus on patient selection in this era of new oncological and intensive care therapies. Death rate as a quality criterion should be interpreted with caution, as it depends on indication, setting and risk–benefit ratio.
Gibraltar is a unique densely populated multicultural British Overseas Territory for which no population data on disorders of gut–brain interaction have existed.
We aimed to provide the first-ever assessment of prevalence of irritable bowel syndrome and functional dyspepsia in Gibraltar in relation to their diagnostic recognition and healthcare burden.
An internet survey was carried out in Gibraltar in 2019–2020. The study survey included demographic questions, the Rome IV diagnostic questions for functional dyspepsia and irritable bowel syndrome, relevant medical history, previous surgeries, medication use, healthcare visit frequency and a quality-of-life questionnaire.
888 individuals (3.5% of all Gibraltar adults) completed the survey anonymously. Irritable bowel syndrome prevalence was 5.2% (95% CI 3.7% to 6.6%) and functional dyspepsia prevalence was 9.9% (95% CI 7.9% to 11.9%); the two conditions overlapped substantially. Women had a higher prevalence of both disorders than men. People meeting criteria for either or both disorders were more prone to surgeries and had more frequent healthcare visits, higher medication use and lower quality-of-life scores compared with people without these disorders. Diagnostic recognition by healthcare providers was low, leaving 58.3% of individuals with irritable bowel syndrome and 96.9% with functional dyspepsia undiagnosed.
This first-ever population-based study of Rome IV defined irritable bowel syndrome and functional dyspepsia in Gibraltar indicates that the prevalence rates of these disorders are similar to the recently reported data for the UK and Spain, but they remain poorly recognised despite substantially affecting the quality of life of individuals who have them in the Gibraltar community.
The gut microbiome and its metabolites are influenced by age and stress and reflect the metabolism and health of the immune system. We assessed the gut microbiota and faecal metabolome in a static animal model of non-alcoholic steatohepatitis (NASH).
This model was subjected to the following treatments: reverse osmosis water, AFO-202, N-163, AFO-202+N-163 and telmisartan treatment. Faecal samples were collected at 6 and 9 weeks of age. The gut microbiome was analysed using 16S ribosomal RNA sequences acquired by next-generation sequencing, and the faecal metabolome was analysed using gas chromatography-mass spectrometry.
Gut microbial diversity increased greatly in the AFO-202+N-163 group. Postintervention, the abundance of Firmicutes decreased, whereas that of Bacteroides increased and was highest in the AFO-202+N-163 group. The decrease in the abundance of Enterobacteriaceae and other Firmicutes, and the abundance of Turicibacter and Bilophila, were greatest in the AFO-202 and N-163 groups, respectively. Lactobacillus abundance was highest in the AFO-202+N-163 group. The faecal metabolite spermidine, which is beneficial against inflammation and NASH, was significantly decreased (p=0.012) in the N-163 group. Succinic acid, which is beneficial in neurodevelopmental and neurodegenerative diseases, was increased in the AFO-202 group (p=0.06). The decrease in fructose was greatest in the N-163 group (p=0.0007). Isoleucine and leucine decreased significantly (p=0.004 and p=0.012, respectively), and tryptophan also decreased, although not significantly (p=0.99), whereas ornithine, which is beneficial against chronic immune-metabolic-inflammatory pathologies, increased in the AFO-202+N-163 group.
AFO-202 treatment in mice is beneficial against neurodevelopmental and neurodegenerative diseases, and has prophylactic potential against metabolic conditions. N-163 treatment exerts anti-inflammatory effects against organ fibrosis and neuroinflammation. In combination, these compounds exhibit anticancer activity.
Second transplant centre opinions (STCOs) for patients declined for liver transplantation are infrequent. We aimed to identify STCOs outcomes from a tertiary transplant centre.
Referrals between 2012 and 2020 to the Birmingham unit were reviewed. Incoming: all referrals from out-of-region centres were collated. Outgoing: patients not listed in Birmingham were reviewed to identify referrals for STCOs to the other UK centres (A–F).
2535 patients were assessed for liver transplantation during the study period. Incoming: among 1751 referrals, 23 STCOs (17 unit A, 3 unit B, 1 unit C, 1 unit D and 1 unit E) were provided by Birmingham. Of the STCOs, 13/23 (57%) patients remained unsuitable for transplantation. Therefore, 10/23 (43%) underwent a second liver transplant assessment, of whom five (50%) were still deemed unsuitable, three (30%) listed (one transplanted) and two (20%) died preassessment. Outgoing: among 426 patients not listed, eight (1.8%) patients were referred for STCO (4 unit E, 2 unit B, 1 unit D, 1 unit A). Three (38%) were listed, two (25%) were assessed and declined, two (25%) were unsuitable for assessment and one (12.5%) died while waiting. Combining incoming and outgoing Birmingham STCOs (n=31), six (19%) of STCOs were listed in a second centre.
Second transplant centre opinions are rare, and the majority of referred patients are still deemed unsuitable for liver transplantation. This highlights potential resource implications, especially when a full second formal assessment is undertaken. A streamlined STCO process, with sharing of investigations and use of telemedicine in appropriate patients, may allow greater transparency, quicker decision making and less use of labour-intensive resources.
Immune checkpoint inhibitors (ICIs) can induce a wide range of immune-related adverse events (irAEs), potentially affecting any organ. ICI-induced colitis is a frequently reported irAE, whereas enteritis is rare and not well documented.
We present a patient with metastatic melanoma who developed severe ICI-induced enterocolitis refractory to glucocorticoids, infliximab and vedolizumab, with a partial response to faecal microbiota transplantation and a final complete response to tofacitinib.
This case supports that tofacitinib may be an(other) effective agent in managing multirefractory ICI-induced diarrhoea caused by colitis and/or enteritis.
Physicians tend to focus on biomedical targets while little is known about issues important to patients. We aimed to identify critical concepts impacting patients with inflammatory bowel disease (IBD).
We performed a survey of patients with IBD on biologic therapy (n=172) and used a validated qualitative method called group concept mapping (GCM) in patient workshops. The survey included 13 questions on attitudes toward symptoms and issues related to IBD. In eight workshops, patients (n=26) generated statements that were later clustered into concepts identifying issues impacting a patient’s life. Patients then ranked the statements.
In the survey, patients’ mean age was 40 years (SD 13), 53% were women, and 38% had ulcerative colitis. They identified fatigue (57%) and stool frequency (46%) as the most critical symptoms impacting their daily lives, regardless of disease activity. In the GCM workshops with Crohn’s disease (n=13; median age 42 years (IQR 39–51), 62% women), 335 statements divided among 10 concepts were generated; the three most important concepts were ‘Positive attitudes’, ‘Accept and recognition’, and ‘Sharing knowledge and experiences in life with Crohn’s disease’. In the workshops with ulcerative colitis (n=13; median age 43 years (IQR 36–49), 69% women), 408 statements divided into 11 concepts were generated; the most important concepts were ‘Take responsibility and control over your life’, ‘Medication’, and ‘Everyday life with ulcerative colitis’.
Focusing solely on IBD symptoms, patients identified fatigue and stool frequency as having the greatest impact on daily life. However, when the disease burden was investigated in a broader perspective beyond classic IBD symptoms, patients identified concepts focused on emotional health as most important.
The Copenhagen University Hospital, Herlev and Gentofte approved the questionnaire and methodology (work-zone no: 18015429).
Socioeconomic status is a risk factor for worse outcomes in many diseases. However, evidence on the association between socioeconomic status and clinical outcome in patients with ulcerative colitis (UC) is limited. In the clinical setting, the therapeutic goal for UC is to achieve mucosal healing (MH). Thus, the aim of this study is to examine the association between socioeconomic status and MH in patients with UC.
The study population consisted of 298 patients with UC. Education status and household income were divided into three groups based on a self-administered questionnaire. MH and complete MH were defined as a Mayo endoscopic subscore of 0–1 and 0, respectively. The association of socioeconomic status with MH and complete MH was assessed by multivariate logistic regression analysis. Patients with UC were divided into a younger group (<51 years old) and an older group (≥51 years old) based on median age.
The percentages of MH and complete MH were 62.4% and 25.2%, respectively. In all patients, socioeconomic status was associated with neither MH nor complete MH. In the older group, education status but not household income was independently and positively associated with MH and complete MH. In contrast, in the younger group, no association between socioeconomic status and either MH or complete MH was found.
In older Japanese patients with UC, education status but not household income was independently positively associated with MH and complete MH.
This study aimed to evaluate the safety and tolerability of OP-724, a CREB-binding protein/β-catenin inhibitor, in patients with advanced primary biliary cholangitis (PBC).
An open-label, non-randomised, phase 1 trial was conducted at two hospitals in Japan. Patients with advanced PBC classified as stage III or higher according to the Scheuer classification by liver biopsy between 4 September 2019 and 21 September 2021 were enrolled. Seven patients received intravenous OP-724 infusions at escalating dosages of 280 and 380 mg/m2/4 hours two times weekly for 12 weeks. The primary endpoint was the incidence of serious adverse events (SAEs). The secondary endpoints were the incidence of AEs and the improvement in the modified Histological Activity Index (mHAI) score.
Seven patients (median age 68 years) were enrolled. Of these, five completed 12 cycles of treatment; one in the 280 mg/m2/4 hours cohort discontinued prematurely for personal reasons, and one in the 380 mg/m2/4 hours cohort was withdrawn from the study due to drug-induced liver injury (grade 2). Consequently, the recommended dosage was determined to be 280 mg/m2/4 hours. No SAEs occurred. The most common AEs were abnormal hepatic function (43%) and abdominal discomfort (29%). OP-724 treatment was associated with histological improvements in fibrosis stage (2/5 (40%)) and mHAI score (3/5 (60%)).
Administration of intravenous OP-724 infusion at a dosage of 280 mg/m2/4 hours two times weekly for 12 weeks was well tolerated by patients with advanced PBC. However, further evaluation of antifibrotic effects in patients with PBC is warranted.
Hepatocyte nuclear factor 1B (HNF1B) is a member of the homeodomain-containing family of transcription factors, encoded by a gene located on 17q12. HNF1B deficiency is associated with a clinical syndrome (kidney and urogenital malformations, maturity-onset diabetes of the young, exocrine pancreatic insufficiency) and with underdiagnosed liver involvement. In contrast to HNF1A, the correlation between hepatocellular carcinoma (HCC) and germline HNF1B deficiency has been poorly evaluated.
Here, we report a novel case of a syndromic HNF1B-deficient paediatric patient who developed HCC with unique histopathological features characterised by neoplastic syncytial giant cells, a pattern previously observed in only one additional case of paediatric cholestatic liver disease of unknown origin.
Our case highlights the influence of HNF1B deficiency on liver disease progression and its putative association with a rare yet specific HCC histotype. We hypothesised that the HCC could be secondary to the repressive effect of the HNF1B variant on HNF1A transcriptional activity.
To determine the variables associated with hospitalisation in patients with Crohn’s disease, as well as those associated with surgery, intestinal resection, hospital readmission, need for multiple operations and immunobiological agent use.
A cross-sectional study was conducted from 2019 to 2021, using two centres for inflammatory bowel diseases in the Brazilian Public Health System.
This study included 220 patients. Only perianal disease was associated with hospitalisation (31.6% vs 13.0%, p=0.012). Stricturing or penetrating behaviour (35.8% vs 12.6%, p<0.001) and perianal disease (45.9% vs 9.9%, p<0.001) were associated with surgery. Ileal or ileocolonic location (80.0% vs 46.5%, p=0.044) and stricturing or penetrating behaviour (68.0% vs 11.2%, p<0.001) were associated with intestinal resection. Steroid use at the first Crohn’s disease occurrence and postoperative complications were associated with hospital readmission and need for multiple operations, respectively. Age below 40 years at diagnosis (81.3% vs 62.0%, p=0.004), upper gastrointestinal tract involvement (21.8% vs 10.3%, p=0.040) and perianal disease (35.9% vs 16.3%, p<0.001) were associated with immunobiological agent use.
Perianal disease and stricturing or penetrating behaviour were associated with more than one significant outcome. Other variables related to Crohn’s disease progression were age below 40 years at diagnosis, ileal or ileocolonic disease location, upper gastrointestinal tract involvement, steroid use at the first Crohn’s disease occurrence and a history of postoperative complications. These findings are similar to those in countries with a high prevalence of Crohn’s disease.
The Bristol Stool Form Scale (BSFS) is the most widely used scale for stool form assessment. This study aimed to translate the BSFS into Persian and determine the content validity, face validity and reliability of the Persian version.
Following permission, a forward–backward translation procedure was applied to translate the scale from English into Persian. A cross-sectional study was conducted on a sample of 210 participants from the general and gastrointestinal clinics of a teaching hospital affiliated with the Tehran University of Medical Sciences, Tehran, Iran, from January 2020 to August 2020. Participants were selected using convenience sampling. A group of 10 experts and 10 adults assessed content and face validity, respectively. Reliability was evaluated using the kappa index.
Participants’ mean (±SD) age was 37.62 (±8.87) years, and most participants (65.7%) were women. Concordance was highest for stool type 7 (100%) and lowest for stool type 5 (78.1%). The overall kappa index was 0.79.
The Persian version of the BSFS is a valid and reliable measure for assessing stool form and can now be used in research and clinical practice.
Clinical delays may be important contributors to outcomes among younger adults (<50 years) with colorectal cancer (CRC). We aimed to describe delay intervals for younger adults with CRC using health administrative data to understand drivers of delay in this population.
This was a population-based study of adults aged <50 years diagnosed with CRC in Ontario, Canada, from 2003 to 2018. Using administrative code-based algorithms (including billing codes), we identified four time points along the pathway to treatment: first presentation with a CRC-related symptom, first investigation, diagnosis date and treatment start. Intervals between these time points were calculated. Multivariable quantile regression was performed to explore associations between patient and disease factors and the median length of each interval.
6853 patients aged 15–49 were diagnosed with CRC and met the inclusion criteria. Males comprised 52% of the cohort, the median age was 45 years (IQR 40–47), and 25% had stage IV disease. The median time from presentation to treatment start (overall interval) was 109 days (IQR 55–218). Time between presentation and first investigation was short (median 5 days), as was time between diagnosis and treatment start (median 23 days). The greatest component of delay occurred between first investigation and diagnosis (median 78 days). Women, patients with distal tumours, and patients with earlier stage disease had significantly longer overall intervals.
Some younger patients with CRC experience prolonged times from presentation to treatment, and the interval between first investigation and diagnosis was an important contributor. Access to endoscopy may be a target for intervention.
Gut-directed hypnotherapy (GDH) is an evidence-based treatment for irritable bowel syndrome (IBS). Adoption of remote GDH has been accelerated by the COVID-19 pandemic. We aimed to evaluate patient experience and satisfaction following remote GDH.
On completing 12 sessions of remote GDH via Skype using the Manchester protocol, patients with refractory IBS completed a feedback form on their experience. The proportions reporting positive outcomes (≥30% improvement in global IBS symptoms or abdominal pain, satisfaction, recommendation to family/friends) were compared by patient factors (age, gender, proximity, preferences).
Of 52 patients completing the feedback form, 27 (52%) indicated that they would have opted for remote over face-to-face GDH, regardless of the pandemic situation. On a five-point scale (5=easy), patients rated the platform easy to use (mean 4.5±0.8) without impairment of communication (mean rating 4.6±0.8). Following remote GDH, 30/52 (58%) reported ≥30% global IBS symptom improvement, and 24/52 (46%) reported ≥30% pain reduction. 90% would recommend remote GDH to others, and only 39% felt they would have benefited more from face-to-face sessions. Those who would have chosen remote GDH regardless of the pandemic were more likely to be satisfied (p=0.01). Age, gender and proximity did not influence outcomes, satisfaction or the likelihood of recommending remote GDH to others. Difficulties during remote sessions were infrequent, both in those who were satisfied and in those who would have preferred face-to-face treatment.
These data support the need to continue developing remote GDH in the post-COVID era but suggest that there is still a role for face-to-face GDH, with patient choice being an important factor.
To assess the inverse relationship between acute appendicitis and ulcerative colitis (UC) using a sibling comparison design to adjust for unmeasured familial genetic and environmental factors.
The cohort comprised 3.1 million individuals resident in Sweden between 1984 and 2018, identified through linkage of several Swedish national registers. Fitting Cox proportional hazards models, we calculated the risk of developing UC in individuals with and without acute appendicitis by the age of 20 years, adjusting for several potential confounding factors. Further, we performed sibling-stratified analyses to adjust for shared unmeasured familial confounding factors.
During 57.7 million person-years of follow-up, 20 848/3 125 232 individuals developed UC among those without appendicitis (3.63 (3.59–3.68) per 10 000 person-years), whereas only 59/35 848 developed UC among those with appendicitis before the age of 20 years (1.66 (1.28–2.14) per 10 000 person-years). We found a decreased risk of developing UC in those with acute appendicitis by the age of 20 years compared with individuals who did not have appendicitis by this age (HR=0.37 (95% CI 0.29 to 0.48)). When adjusting for shared familial confounders, we observed only a slight attenuation in this association (HR=0.46 (95% CI 0.32 to 0.66)).
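The rates quoted above follow the standard events-per-person-years calculation. A minimal sketch (the per-group person-years below are back-derived from the reported rates as an illustration; only the 57.7 million total is stated in the abstract):

```python
def rate_per_10000_py(events: int, person_years: float) -> float:
    """Incidence rate expressed per 10,000 person-years."""
    return events / person_years * 10_000

# Non-appendicitis group: 20,848 UC cases over roughly 57.4 million
# person-years (back-derived, not reported per group).
print(round(rate_per_10000_py(20_848, 57.43e6), 2))  # 3.63

# Appendicitis group: 59 UC cases over roughly 355,400 person-years
# (back-derived, not reported per group).
print(round(rate_per_10000_py(59, 0.3554e6), 2))  # 1.66
```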
Individuals who had acute appendicitis by late adolescence showed a decreased risk of developing UC compared with those who did not. Genetic and shared familial environmental factors appear to play only a small role in this relationship. Our results suggest an independent association of acute appendicitis, or its underlying causes, with UC risk.
Hepatic encephalopathy (HE) is a debilitating symptom of end-stage liver disease (ESLD), but there remains a paucity of evidence regarding its impact on nutritional status, nutritional intake, compliance with nutritional support and resultant muscle health and function. Malnutrition and sarcopenia are associated with increased morbidity and mortality in patients with ESLD. The aim of the current case–control study is to prospectively investigate the impact of HE on nutritional intake and sarcopenia status in patients with ESLD.
Patients with ESLD, with HE (n=10) and without HE (n=10), will be recruited at the outpatient liver unit, University Hospital Birmingham, UK. All patients will undergo clinical assessment at baseline and again at 6–8 weeks (in line with their routine clinical follow-up) to assess the impact of HE on reported nutritional intake, nutritional status and sarcopenia/physical functional status. Standard medical, dietetic and home-based exercise physiotherapy care will continue for all participants as determined by their clinical team. Nutritional intake will be assessed using two methods: 24-hour food recall and 3-day food diaries. Sarcopenia status will be assessed using anthropometry (mid-arm muscle circumference (MAMC)) and ultrasound imaging of the quadriceps muscle group. Markers of physical function (hand grip strength; chair rise time), frailty (Liver Frailty Index (LFI)), physical activity (accelerometry) and exercise capacity (Duke Activity Status Index (DASI)) will be assessed at both clinic visits.
The study is approved by Wales Research Ethics Committee 2 and Health Research Authority (REC reference: 21/WA/0216). Recruitment into the study commenced November 2021. The findings will be disseminated through peer-reviewed publications and international presentations.
RRK7156.
Oesophageal cancer remains a common cause of cancer mortality worldwide. Increasingly, oncology centres are treating an older population in whom comorbidities may preclude multimodality treatment with chemoradiotherapy (CRT). We review outcomes of radical radiotherapy (RT) for squamous cell carcinoma (SCC) of the oesophagus in an older population.
Patients aged over 65 years who received RT for oesophageal SCC between 2013 and 2016 in the West of Scotland were identified. Kaplan-Meier and Cox regression analyses were used to compare overall survival (OS) between patients treated with radical RT and radical CRT.
There were 83 patients over 65 years treated with either RT (n=21) or CRT (n=62). There was no significant difference in median OS between CRT and RT (26.8 months vs 28.5 months, p=0.92). All patients receiving RT completed their treatment, whereas 11% of CRT patients did not.
Survival in this non-trial older patient group managed with CRT is comparable to that reported in previous trials. RT showed better than expected outcomes, which may reflect developments in RT technique. This review supports RT as an alternative for older patients unfit for concurrent treatment.
Severe acute pancreatitis (SAP) is associated with high mortality (15%–30%). Current guidelines recommend these patients are best managed in a multidisciplinary team setting. This study reports experience in the management of SAP within the UK’s first reported hub-and-spoke pancreatitis network.
All patients with SAP referred to the remote care pancreatitis network between 2015 and 2017 were prospectively entered onto a database by a dedicated pancreatitis specialist nurse. Baseline characteristics, aetiology, intensive care unit (ICU) stay, interventions, complications, mortality and follow-up were analysed.
285 patients admitted with SAP to secondary care hospitals during the study period were discussed with the dedicated pancreatitis specialist nurse and referred to the regional service. 83/285 patients (29%; 37 male) were transferred to the specialist centre, mainly (95%, n=79) for drainage of infected pancreatic fluid collections (PFCs). Among the transferred patients, 29 (35%) developed multiorgan failure, with an inpatient mortality of 14% (n=12/83). The median follow-up was 18.2 months (IQR=11.25–35.51). Multivariate analysis showed that transferred patients had a significantly longer overall hospital stay (p<0.001) but a shorter ICU stay (p<0.012).
This hub-and-spoke model facilitates management of the majority of patients with SAP in a secondary care setting; 29% warranted transfer to our tertiary centre, predominantly for endoscopic drainage of PFCs. An evidence-based approach with a low threshold for transfer to a tertiary care centre may result in lower mortality from SAP and fewer days in ICU.
To assess outcomes in patients with iron-deficient inflammatory bowel disease (IBD) treated with ferric maltol in UK real-world practice.
This observational, multicentre, retrospective cohort study included adults with IBD and iron-deficiency anaemia (IDA; haemoglobin ≥95 to <120 g/L (women) or ≥95 to <130 g/L (men) plus serum ferritin <30 µg/L or transferrin saturation <20%) who received ferric maltol. Data were extracted from patient records. The primary analysis was the proportion of patients with normalised haemoglobin (≥120 g/L (women); ≥130 g/L (men)) over 12 weeks. Iron indices and safety were assessed.
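The anaemia eligibility and haemoglobin normalisation thresholds above can be sketched as simple predicates. This is an illustrative sketch of the study definitions as stated (units: haemoglobin in g/L, ferritin in µg/L, transferrin saturation in %), not a clinical tool:

```python
def ida_eligible(hb, sex, ferritin=None, tsat=None):
    """IDA per the study definition: Hb >=95 g/L but below the sex-specific
    lower limit, plus serum ferritin <30 ug/L or transferrin saturation <20%."""
    lower_limit = 120 if sex == "female" else 130
    anaemic = 95 <= hb < lower_limit
    iron_deficient = (ferritin is not None and ferritin < 30) or \
                     (tsat is not None and tsat < 20)
    return anaemic and iron_deficient

def hb_normalised(hb, sex):
    """Primary outcome: haemoglobin at or above the sex-specific lower limit."""
    return hb >= (120 if sex == "female" else 130)

print(ida_eligible(110, "female", ferritin=20))  # True
print(hb_normalised(127, "female"))              # True
```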
Thirty of 59 patients had data for the primary outcome, 19 of whom (63%) achieved haemoglobin normalisation at week 12. Mean±SD haemoglobin was 127±16 g/L at week 12 (increase of 14±17 g/L from baseline). Overall, 27 patients achieved haemoglobin normalisation by the end of the observation period; mean±SD time to normalisation was 49.5±25.6 days. Nine of 17 patients had normalised serum ferritin (30–300 µg/L) at week 12, and 16 patients had normalised ferritin at the end of the observation period; mean±SD time to normalisation was 71.3±27.6 days. Twenty-four adverse events occurred in 19 patients (32%); most frequent adverse events were abdominal pain or discomfort (n=9) and constipation (n=3).
Ferric maltol increases haemoglobin and iron indices and is generally well tolerated in patients with IBD and IDA treated in clinical practice. These real-world data support findings from randomised controlled trials.
To determine the diagnostic yield of endoscopic ultrasound (EUS) in idiopathic acute recurrent pancreatitis (IARP).
A retrospective study was performed in patients with IARP evaluated by EUS between January 2009 and December 2016. Follow-up assessments of acute pancreatitis recurrence were carried out.
Seventy-three patients undergoing 102 EUS procedures were included. EUS identified the cause of IARP in 55 patients (75.3%). The most common findings were chronic pancreatitis in 27 patients (49.1%), followed by lithiasic pathology in 24 patients (43.6%) and intraductal papillary mucinous neoplasm in four patients (7.3%). Treatment directed at the EUS findings showed a protective tendency towards final resolution of recurrence. No complications were reported.
EUS performed in patients with IARP helped to identify a possible cause in three-quarters of cases. The majority of patients had a treatable disease.
This study aimed to assess whether there is secondary care medical inertia towards coeliac disease (CD).
Group (1): Time from primary care presentation to diagnostic endoscopy was quantified in 151 adult patients with a positive endomysial antibody test and compared with 92 adult patients with histologically proven inflammatory bowel disease (IBD). Group (2): Across four hospitals, duodenal biopsy reports for suspected CD were reviewed (n=1423). Group (3): Clinical complexity was compared between known CD (n=102) and IBD (n=99) patients at their respective follow-up clinic appointments. Group (4): 50 gastroenterologists were questioned about their perspective on CD and IBD.
Group (1): Patients with suspected CD waited significantly longer for diagnostic endoscopy following referral (48.5 (28–89) days) than patients with suspected IBD (34.5 (18–70) days; p=0.003). Group (2): 1423 patients underwent diagnostic endoscopy for possible CD, with only 40.0% meeting the guideline to take four biopsies. Diagnosis of CD was more frequent when the guideline was followed (10.1% vs 4.6%, p<0.0001), and 12.4% of newly diagnosed patients with CD had at least one non-diagnostic gastroscopy in the 5 years before diagnosis. Group (4): 32.0% of gastroenterologists failed to identify that CD has a greater prevalence in adults than IBD. Moreover, 36.0% of gastroenterologists felt that doctors were not required for the management of CD.
Prolonged waiting times for endoscopy and inadequacies in biopsy technique were demonstrated, suggesting medical inertia towards CD. However, this must be balanced against rationalising care appropriately. A Coeliac UK National Patient Charter may standardise care across the UK.
The primary aim was to summarise the evidence on the diagnostic accuracy of two multiplex PCR gastrointestinal (GI) panels (BioFire FilmArray and Luminex xTAG) for the detection of gastroenteritis pathogens. The secondary aim was to compare the performance of these GI panels head to head.
A comprehensive search up to 1 December 2019 was conducted in PubMed, Embase, Ovid Medline and Web of Science for studies that used the FilmArray or Luminex xTAG Gastrointestinal Pathogen Panel (GPP) for the diagnosis of acute gastroenteritis. A summary of diagnostic accuracy for the 16 pathogens was calculated by comparing the GI panels to the current gold standards (conventional microbiology techniques such as culture or PCR for bacteria, PCR or enzyme immunoassay (EIA) for viruses, and microscopy or EIA for parasites). Hierarchical summary receiver operating characteristic (HSROC) curve analysis and pretest and post-test probabilities were used to estimate pathogen detection performance.
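Post-test probabilities of the kind used in this analysis are conventionally derived from sensitivity and specificity via likelihood ratios. A minimal sketch with illustrative numbers (not values taken from the meta-analysis):

```python
def post_test_probability(pretest, sensitivity, specificity, test_positive=True):
    """Convert a pretest probability into a post-test probability
    using the likelihood ratio for a positive or negative result."""
    lr = (sensitivity / (1 - specificity) if test_positive
          else (1 - sensitivity) / specificity)
    pre_odds = pretest / (1 - pretest)
    post_odds = pre_odds * lr
    return post_odds / (1 + post_odds)

# Illustrative only: 10% pretest prevalence, sensitivity 0.95, specificity 0.98.
print(round(post_test_probability(0.10, 0.95, 0.98), 2))  # 0.84
```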
A total of 11 studies with 7085 stool samples were eligible for analysis. Multiplex PCRs demonstrated high diagnostic accuracy, with specificity ≥0.98 and area under the ROC curve (AUROC) ≥0.97 for all pathogens except Yersinia enterocolitica (AUROC 0.91). The FilmArray panel demonstrated higher sensitivity than xTAG GPP for most pathogens, with the exception of Rotavirus A (both 0.93).
This is the first meta-analysis providing a head-to-head comparison of the performance of the multiplex PCR-based Luminex xTAG GPP and FilmArray GI panels in detecting each pathogen. Point estimates calculated from eligible studies showed that both GI panels are highly accurate and may provide important diagnostic information for early identification of gastroenteritis. Although FilmArray has higher sensitivity and post-test probability than xTAG GPP for most pathogens, how this will translate to the clinical setting remains unclear.
Digestive endoscopy is considered a high-risk procedure for COVID-19. Recommendations have been made for its practice during the pandemic. This study was conducted to determine adherence to recommendations for endoscopy practice during the COVID-19 pandemic in Latin America (LA).
A survey of endoscopists from LA was conducted, consisting of 43 questions evaluating four domains: general and sociodemographic features, and preprocedure, intraprocedure and postprocedure aspects.
Responses were obtained from 338 endoscopists (response rate 34.5%) across 15 countries in LA. For preprocedure aspects (hand washing, use of face masks for patients, a respiratory triage area, training in the placement/removal of personal protective equipment (PPE) and availability of a specific area for the placement/removal of PPE), adherence was <75%. Regarding postprocedure aspects, 77% (261/338) had reused PPE, mainly N95 or higher respirators, with a standardised decontamination procedure only 32% (108/338) of the time. Postprocedure room decontamination was carried out by 47% on >75% of occasions. For intraprocedure aspects (knowledge of risk and type of endoscopic procedures, use of PPE, patient airway management and infrastructure), adherence was >75% for all parameters, and 78% of endoscopists performed only emergency or time-sensitive procedures.
Adherence to the recommendations for endoscopy practice during the COVID-19 pandemic was adequate for intraprocedure aspects but deficient for preprocedure and postprocedure aspects.
Post-endoscopic retrograde cholangiopancreatography (ERCP) pancreatitis (PEP) is a complication associated with important morbidity, occasional mortality and high costs. Preventive strategies are suboptimal as PEP continues to affect 4% to 9% of patients. Spraying epinephrine on the papilla may decrease oedema and prevent PEP. This study aimed to compare rectal indomethacin plus epinephrine (EI) versus rectal indomethacin plus sterile water (WI) for the prevention of PEP.
This multicentre randomised controlled trial included patients aged >18 years with an indication for ERCP and a naive major papilla. All patients received 100 mg of rectal indomethacin, with either 10 mL of sterile water or a 1:10 000 epinephrine dilution sprayed on the papilla. Patients were asked about PEP symptoms by telephone 24 hours and 7 days after the procedure. The trial was stopped halfway through after a new publication reported an increased incidence of PEP among patients receiving epinephrine.
Of the 3602 patients deemed eligible, 3054 were excluded after screening. The remaining 548 were randomised to the EI group (n=275) or the WI group (n=273). The two groups had similar baseline characteristics. The incidence of PEP was similar in the EI and WI groups (3.6% (10/275) vs 5.12% (14/273), p=0.41). Pancreatic duct guidewire insertion was identified as a risk factor for PEP (OR 4.38, 95% CI 1.44 to 13.29, p=0.009).
Spraying epinephrine on the papilla was no more effective than rectal indomethacin alone for the prevention of PEP.
This study was registered with ClinicalTrials.gov (NCT02959112).
Proton pump inhibitor (PPI) use has risen substantially, primarily driven by ongoing use over months to years. However, there is no consensus on how to define long-term PPI use. Our objectives were to review and compare definitions of long-term PPI use in existing literature and describe the rationale for each definition. Moreover, we aimed to suggest generally applicable definitions for research and clinical use.
The databases PubMed and Cochrane Library were searched for publications concerning long-term use of PPIs and ClinicalTrials.gov was searched for registered studies. Two reviewers independently screened the titles, abstracts, and full texts in two series and subsequently extracted data.
A total of 742 studies were identified, of which 59 met the eligibility criteria; two ongoing studies were also identified. Definitions of long-term PPI use varied from >2 weeks to >7 years, the most common being ≥1 year and ≥6 months. Only 12/61 (20%) of the studies provided a rationale for their definition.
The definitions of long-term PPI treatment varied substantially between studies and were seldom rationalised.
In a clinical context, use of PPIs for more than 8 weeks could be a reasonable definition of long-term use in patients with reflux symptoms, and more than 4 weeks in patients with dyspepsia or peptic ulcer. For research purposes, 6 months could be a possible definition in pharmacoepidemiological studies, whereas studies of adverse effects may require a tailored definition depending on the necessary exposure time. We recommend always providing a rationale for the chosen definition.
To determine the prevalence, risk factors and natural history of hiatal hernia (HH) on CT in the general population.
The Multi-Ethnic Study of Atherosclerosis (MESA) acquired full-lung CT in 3200 subjects aged 53–94 years. Three blinded observers independently determined the presence/absence and type (I–IV) of HH. Associations between HH and participant characteristics were assessed via unadjusted and multivariable-adjusted relative risk regression. HH natural history was assessed by comparison with prior MESA CT examinations.
Excellent interobserver agreement was found for the presence of HH.
HH on non-contrast CT is prevalent in the general population, increasing with age, female gender and BMI. Its association with proton pump inhibitor use supports a role in gastro-oesophageal reflux disease, and HH progression is associated with increased BMI.
Patients infected with SARS-CoV-2 usually report fever and respiratory symptoms. However, multiple gastrointestinal (GI) manifestations, such as diarrhoea and abdominal pain, have been described. The aim of this study was to evaluate the prevalence of GI symptoms and elevated liver enzymes, and the associated mortality, in patients with COVID-19.
A systematic review and meta-analysis of published studies that included cohorts of patients infected with SARS-CoV-2 was performed, covering 1 December 2019 to 15 December 2020. Data were collected through a literature search of PubMed, Embase, Scopus and Cochrane, according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. We analysed pooled data on the prevalence of individual GI symptoms and elevated liver enzymes, and performed subanalyses to investigate the relationships between GI symptoms/elevated liver enzymes, geographical location, mortality and intensive care unit (ICU) admission.
The available data on 78 798 patients positive for SARS-CoV-2 from 158 studies were included in our analysis. The most frequent manifestations were diarrhoea (16.5%, 95% CI 14.2% to 18.4%), nausea (9.7%, 95% CI 9.0% to 13.2%) and elevated liver enzymes (5.6%, 95% CI 4.2% to 9.1%). The overall mortality and GI mortality were 23.5% (95% CI 21.2% to 26.1%) and 3.5% (95% CI 3.1% to 6.2%), respectively. Subgroup analysis showed no statistically significant association between GI symptoms/elevated liver enzymes and ICU admission (OR=1.01, 95% CI 0.55 to 1.83). GI mortality was 0.9% (95% CI 0.5% to 2.2%) in China and 10.8% (95% CI 7.8% to 11.3%) in the USA.
GI symptoms/elevated liver enzymes are common in patients with COVID-19. Our subanalyses showed that the presence of GI symptoms/elevated liver enzymes does not appear to affect mortality or ICU admission rate. Furthermore, the proportion of GI mortality among patients infected with SARS-CoV-2 varied based on geographical location.
The prevalence of non-alcoholic fatty liver disease (NAFLD) and non-alcoholic steatohepatitis (NASH) cirrhosis is often underestimated in healthcare and administrative databases that define disease burden using International Classification of Diseases (ICD) codes. This retrospective audit was conducted to explore the accuracy and limitations of the ICD, Tenth Revision, Australian Modification (ICD-10-AM) to detect NAFLD, metabolic risk factors (obesity and diabetes) and other aetiologies of chronic liver disease.
ICD-10-AM codes in 308 admitted patient encounters at two major Australian tertiary hospitals were compared with data abstracted from patients’ electronic medical records. Accuracy of individual codes and grouped combinations was determined by calculating sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV) and Cohen’s kappa coefficient (κ).
The presence of an ICD-10-AM code accurately predicted the presence of NAFLD/NASH (PPV 91.2%) and obesity (PPV 91.6%) in most instances. However, codes underestimated the prevalence of NAFLD/NASH and obesity by 42.9% and 45.3%, respectively. Overall concordance between clinical documentation and ‘grouped alcohol’ codes (κ 0.75) and hepatitis C codes (κ 0.88) was high. Hepatitis B codes detected false-positive cases in patients with previous exposure (PPV 55.6%). Accuracy of codes to detect diabetes was excellent (sensitivity 95.8%; specificity 97.6%; PPV 94.9%; NPV 98.1%) with almost perfect concordance between codes and documentation in medical records (κ 0.93).
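All of the accuracy measures reported above follow from a 2×2 comparison of ICD codes against chart review. A minimal sketch of the calculations, using hypothetical counts rather than the study's data:

```python
# Hypothetical 2x2 table: ICD code present/absent vs condition documented/absent.
# These counts are illustrative only, not the audit's actual data.
tp, fp = 40, 5    # code present: condition documented / not documented
fn, tn = 10, 45   # code absent: condition documented / not documented
n = tp + fp + fn + tn

sensitivity = tp / (tp + fn)   # proportion of true cases the code detects
specificity = tn / (tn + fp)   # proportion of non-cases correctly uncoded
ppv = tp / (tp + fp)           # probability a coded case is a true case
npv = tn / (tn + fn)           # probability an uncoded case is a non-case

# Cohen's kappa: observed agreement corrected for chance agreement.
po = (tp + tn) / n
pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
kappa = (po - pe) / (1 - pe)
```

With these illustrative counts the code yields sensitivity 0.80, specificity 0.90 and κ 0.70, in the "substantial agreement" range.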
Recognition of the utility and limitations of ICD-10-AM codes to study the burden of NAFLD/NASH cirrhosis is imperative to inform public health strategies and appropriate investment of resources to manage this burgeoning chronic disease.
To evaluate the risk of common infections in individuals with inflammatory bowel disease (IBD) [ulcerative colitis and Crohn’s disease] compared with matched controls in a contemporary UK primary care population.
Matched cohort analysis (2014–2019) using the Royal College of General Practitioners Research and Surveillance Centre primary care database. Risk of common infections, viral infections and gastrointestinal infections (including a subset of culture-confirmed infections), and predictors of common infections, were evaluated using multivariable Cox proportional hazards models.
18 829 people with IBD were matched to 73 316 controls. People with IBD were more likely to present to primary care with a common infection over the study period (46% vs 37% of controls). Risks of common infections, viral infections and gastrointestinal infections (including stool culture-confirmed infections) were increased for people with ulcerative colitis and Crohn’s disease compared with matched controls (HR range 1.12–1.83, all p<0.001). Treatment with oral glucocorticoid therapy, immunotherapies and biologic therapy, but not with aminosalicylates, was associated with increased infection risk in people with IBD. Despite mild lymphopenia and neutropenia being more common in people with IBD (18.4% and 1.9%, respectively) than in controls (6.5% and 1.5%, respectively), these factors were not associated with significantly increased infection risk in people with IBD.
People with IBD are more likely to present with a wide range of common infections. Health professionals and people with IBD should remain vigilant for infections, particularly when using systemic corticosteroids, immunotherapies or biologic agents.
Clinicaltrials.gov (NCT03836612).
Although evidence suggests frequent gastrointestinal (GI) involvement during coronavirus disease 2019 (COVID-19), endoscopic findings are scarcely reported.
We aimed at registering endoscopic abnormalities and potentially associated risk factors among patients with COVID-19.
All consecutive patients with COVID-19 undergoing endoscopy in 16 institutions from high-prevalence regions were enrolled. Mann-Whitney U, χ² or Fisher’s exact test were used to compare patients with major abnormalities to those with negative procedures, and multivariate logistic regression to identify independent predictors.
Between February and May 2020, during the first pandemic outbreak with severely restricted endoscopy activity, 114 endoscopies on 106 patients with COVID-19 were performed in 16 institutions (men=70.8%; median age=68 (58–74); 33% admitted to an intensive care unit; 44.4% reporting GI symptoms). 66.7% of endoscopies were urgent, mainly for overt GI bleeding. 52 (45.6%) patients had major abnormalities, whereas 13 bled from pre-existing conditions. The most prevalent upper GI abnormalities were ulcers (25.3%), erosive/ulcerative gastro-duodenopathy (16.1%) and petechial/haemorrhagic gastropathy (9.2%). Among lower GI endoscopies, 33.3% showed an ischaemic-like colitis.
Receiver operating curve analysis identified D-dimers >1850 ng/mL as predicting major abnormalities. Only D-dimers >1850 ng/mL (OR=12.12 (1.69–86.87)) and presence of GI symptoms (OR=6.17 (1.13–33.67)) were independently associated with major abnormalities at multivariate analysis.
In this highly selected cohort of hospitalised patients with COVID-19 requiring endoscopy, almost half showed acute mucosal injuries and more than one-third of lower GI endoscopies had features of ischaemic colitis. Among the hospitalisation-related and patient-related variables evaluated in this study, D-dimers above 1850 ng/mL was the most useful at predicting major mucosal abnormalities at endoscopy.
ClinicalTrial.gov (ID: NCT04318366).
Fatigue is the most commonly reported symptom of the liver disease primary biliary cholangitis (PBC). It affects 40%–80% of patients, has no effective treatment and is associated with heightened mortality risk. The pathogenesis is unknown, but muscle bioenergetic abnormalities have been proposed to contribute. Directly observed exercise has been shown to attenuate symptoms in small groups; however, due to the rare nature of the disease, home-based interventions need to be evaluated for feasibility, safety and efficacy.
This is a phase 1/pilot, single-arm, open-label clinical trial evaluating a novel home-based exercise programme in patients with PBC with severe fatigue. Forty patients with moderate-severe fatigue (PBC40 fatigue domain score >33; other causes of fatigue excluded) will be selected using a convenience sampling method. A 12-week home-based exercise programme, consisting of individualised resistance, aerobic exercises and telephone health calls (first 6 weeks only), will be delivered. Measures of fatigue (PBC40 fatigue domain; fatigue impact scale), quality of life, sleep (Epworth Sleep Score), physical activity, anxiety and depression, aerobic exercise capacity (incremental shuttle walk test; Duke Activity Status Index) and functional capacity (short physical performance battery) will be assessed at baseline and at 6 and 12 weeks following the intervention.
The protocol is approved by the National Research Ethics Service Committee London (IRAS 253115). Recruitment commenced in April 2019 and ended in March 2020. Participant follow-up is due to finish by December 2020. Findings will be disseminated through peer-reviewed publication, conference presentation and social media.
The global COVID-19 pandemic has impacted the mental health of individuals, particularly those with chronic illnesses. We aimed to quantify stress, anxiety and depression among individuals with inflammatory bowel disease (IBD) in Australia during the pandemic.
An electronic survey was made available to IBD patients Australia-wide from 17 June to 12 July 2020. Respondents with an underlying diagnosis of IBD and over 18 years of age were included. A validated questionnaire (Depression, Anxiety, Stress Score-21, DASS21) was used to assess depression, anxiety and stress. Data on potential predictors of depression, anxiety and stress were collected.
352 respondents across Australia participated in the survey. 60.5% of respondents fulfilled DASS criteria for at least moderate depression, anxiety or stress. 45% reported a pre-existing diagnosis of depression and/or anxiety, and over two-thirds of these respondents reported worsening of their pre-existing depression/anxiety due to the pandemic. Among those without a pre-existing diagnosis of anxiety or depression, high rates of at least moderate to severe depression (34.9%), anxiety (32.0%) and stress (29.7%) were noted. Younger age (OR 0.96, 95% CI 0.94 to 0.98, p<0.001), lack of access to an IBD nurse (OR 1.81, 95% CI 1.03 to 3.19, p=0.04) and lack of education on reducing infection risk (OR 1.99, 95% CI 1.13 to 3.50, p=0.017) were associated with significant stress, anxiety and/or depression.
High prevalence of undiagnosed depression, anxiety and stress was identified among respondents. Improved access to IBD nurse support and greater attention to education are modifiable factors that may reduce depression, anxiety and/or stress among patients with IBD during the pandemic.
Animal studies indicate a potential protective role of antidepressant medication (ADM) in models of colitis but the effect of their use in humans with ulcerative colitis (UC) remains unclear.
To study the relationship between ADM use and corticosteroid dependency in UC.
Using the Clinical Practice Research Datalink we identified patients diagnosed with UC between 2005 and 2016. We grouped patients according to serotonin selective reuptake inhibitor (SSRI) and tricyclic antidepressant (TCA) exposure in the 3 years following diagnosis: ‘continuous users’, ‘intermittent users’ and ‘non-users’. We used logistic regression to estimate the adjusted risk of corticosteroid dependency between ADM exposure groups.
We identified 6373 patients with UC: 5230 (82%) used no ADMs, 627 (10%) were intermittent SSRI users, 282 (4%) were continuous SSRI users, 246 (4%) were intermittent TCA users and 63 (1%) were continuous TCA users.
Corticosteroid dependency was more frequent in continuous SSRI and TCA users than in non-users (19% vs 24% vs 14%, respectively; χ² p=0.002). Intermittent SSRI and TCA users had risks of developing corticosteroid dependency similar to non-users (SSRI: OR 1.19, 95% CI 0.95 to 1.50; TCA: OR 1.14, 95% CI 0.78 to 1.66). Continuous users of both SSRIs and TCAs had significantly higher risks of corticosteroid dependency compared with non-users (SSRI: OR 1.62, 95% CI 1.15 to 2.27; TCA: OR 2.02, 95% CI 1.07 to 3.81).
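The group comparison can be illustrated with a crude odds ratio back-derived from the reported group sizes and percentages. Note that the ORs in the abstract come from an adjusted logistic regression, so this unadjusted sketch (with approximate event counts) will not reproduce them exactly:

```python
import math

# Approximate counts reconstructed from reported group sizes and percentages
# (illustrative only; the published ORs are covariate-adjusted).
ssri_events, ssri_n = 54, 282          # continuous SSRI users, ~19% dependent
nonuser_events, nonuser_n = 732, 5230  # non-users, ~14% dependent

a, b = ssri_events, ssri_n - ssri_events        # exposed: events / non-events
c, d = nonuser_events, nonuser_n - nonuser_events  # unexposed: events / non-events

odds_ratio = (a * d) / (b * c)
# Wald 95% CI on the log-odds scale.
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"crude OR {odds_ratio:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

The crude OR of roughly 1.46 sits below the adjusted OR of 1.62, consistent with confounding being adjusted away in the published model.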
Continuous ADM exposure has no protective effect in routine clinical practice in UC and identifies a population of patients requiring more intensive medical therapy. ADM use is a flag for potentially worse clinical outcomes in UC.
This study will test the performance of the anal swab PCR test when compared with the nasopharyngeal swab PCR test as a diagnostic tool for COVID-19.
An observational descriptive study of hospitalised suspected or probable COVID-19 cases, conducted in Dr. Cipto Mangunkusumo National Hospital, Ciputra Hospital, Mitra Keluarga Depok Hospital and Mitra Keluarga Kelapa Gading Hospital, Indonesia. Epidemiological, clinical, laboratory and radiology data were obtained. Nasopharyngeal and anal swab specimens were collected for SARS-CoV-2 RNA detection.
We analysed 136 subjects in this study. The clinical spectrum of COVID-19 manifestation was typical of hospitalised patients, with 25% classified as mild cases, 14.7% in severe condition and 12.5% classified as having acute respiratory distress syndrome. When compared with the nasopharyngeal swab as the standard specimen for reverse transcription polymerase chain reaction (RT-PCR) detection of SARS-CoV-2 RNA, the sensitivity and specificity of the anal swab were 36.7% and 93.8%, respectively. The positive and negative predictive values were 97.8% and 16.5%, respectively. The performance of the anal swab remained similar when only the subgroup of patients with gastrointestinal symptoms (n=92, 67.6%) was analysed (sensitivity 40%, specificity 91.7%). Of all subjects included in the analysis, 67.6% had gastrointestinal symptoms, as did 73.3% of patients in the anal swab-positive group. The two most common gastrointestinal symptoms were nausea and anorexia.
The anal swab specimen has low sensitivity (36.7%) but high specificity (93.8%) for detecting SARS-CoV-2 RNA by RT-PCR. Only one additional positive result was found by anal swab in the nasopharyngeal swab-negative group. An anal swab may not be needed as an additional test at the beginning of a patient’s diagnostic investigation, and nasopharyngeal swab RT-PCR remains the standard diagnostic test for COVID-19.
Hepatitis C virus (HCV) infection is associated with an increased risk of cardiovascular disease (CVD) and reduced health-related quality of life (HRQoL). Although physical activity (PA)/exercise has been shown to reduce CVD risk and improve HRQoL in patients with liver disease, there is limited data in HCV. We aimed to explore the association between PA/exercise levels, CVD risk and HRQoL in patients with HCV and assess individuals’ attitudes to PA/exercise.
Cross-sectional observational study recruiting consecutive patients with HCV from viral hepatitis clinics. Data were collected on CVD risk factors, anthropometry, HRQoL and the Exercise Benefits and Barriers Scale (EBBS).
86 patients were recruited (71% men, 94% white, age 52±13 years); 49% of the cohort self-reported to be currently active. Although HRQoL was reduced across the cohort, patients that were regularly ‘active’ reported significantly higher HRQoL scores across Short-Form 36v2 domains compared with their inactive counterparts (p<0.05). Metabolic and cardiovascular characteristics were no different between groups stratified by PA/exercise status (p>0.05). EBBS scores were similar in the ‘active’ versus ‘inactive’ groups, however, patients categorised as ‘active’ scored significantly higher on the psychological outlook and social interaction subscales (p<0.05) than those that were ‘inactive’. There were significant associations between EBBS scores and HRQoL (p<0.05).
PA/exercise is associated with increased HRQoL in patients with HCV irrespective of clinical parameters. Addressing specific motivators/barriers to exercise for patients will be key to designing effective PA/exercise interventions in this patient population to ensure maximum uptake and adherence.
Benign liver tumours (BLT) are increasingly diagnosed as incidentalomas. Clinical implications and management vary across and within the different types of BLT. High-quality clinical practice guidelines are needed, because of the many nuances in tumour types, diagnostic modalities, and conservative and invasive management strategies. Yet, available observational evidence is subject to interpretation which may lead to practice variation. Therefore, we aimed to systematically search for available clinical practice guidelines on BLT, to critically appraise them, and to compare management recommendations.
A scoping review was performed within MEDLINE, EMBASE, and Web of Science. All BLT guidelines published in peer-reviewed, English-language journals were eligible for inclusion. Clinical practice guidelines on BLT were analysed, compared, and critically appraised using the Appraisal of Guidelines, Research and Evaluation (AGREE II) checklist regarding hepatic haemangioma, focal nodular hyperplasia (FNH), and hepatocellular adenoma (HCA). Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) recommendations for scoping reviews were adhered to.
The literature search yielded 367 unique papers; 348 were excluded after title/abstract screening and 16 after full-text screening. Three guidelines were included: the American College of Gastroenterology (ACG; 2014), the Brazilian Society of Hepatology (SBH; 2015) and the European Association for the Study of the Liver (EASL; 2016). There was no uniformity in the assessment methods for grading and gravity of recommendations between guidelines. Among the observed differences were: (1) indications for biopsy in all three tumours; (2) advice on contraceptive pills and follow-up in FNH and HCA; (3) use of an individualised approach to HCA; (4) absence of recommendations for treatment of HCA in men; and (5) approaches to HCA subtype identification on magnetic resonance imaging.
Recognising differences in recommendations can assist in harmonisation of practice standards and identify unmet needs in research. This may ultimately contribute to improved global patient care.
The aims of this study were to describe community antibiotic prescribing patterns in individuals hospitalised with COVID-19, and to determine the association between experiencing diarrhoea, stratified by preadmission exposure to antibiotics, and mortality risk in this cohort.
Retrospective study of the index presentations of 1153 adult patients with COVID-19, admitted between 1 March 2020 and 29 June 2020 in a South London NHS Trust. Data on patients’ medical history (presence of diarrhoea, antibiotic use in the previous 14 days, comorbidities); demographics (age, ethnicity, and body mass index); and blood test results were extracted. Time to event modelling was used to determine the risk of mortality for patients with diarrhoea and/or exposure to antibiotics.
19.2% of the cohort reported diarrhoea on presentation; these patients tended to be younger, and were less likely to have recent exposure to antibiotics (unadjusted OR 0.64, 95% CI 0.42 to 0.97). 19.1% of the cohort had a course of antibiotics in the 2 weeks preceding admission; this was associated with dementia (unadjusted OR 2.92, 95% CI 1.14 to 7.49). After adjusting for confounders, neither diarrhoea nor recent antibiotic exposure was associated with increased mortality risk. However, the absence of diarrhoea in the presence of recent antibiotic exposure was associated with a 30% increased risk of mortality.
Community antibiotic use in patients with COVID-19, prior to hospitalisation, is relatively common, and absence of diarrhoea in antibiotic-exposed patients may be associated with increased risk of mortality. However, it is unclear whether this represents a causal physiological relationship or residual confounding.
Colorectal cancer (CRC) is the third most common cancer for women and men and the second leading cause of cancer death in the USA. There is emerging evidence that the gut microbiome plays a role in CRC development, and antibiotics are one of the most common exposures that can alter the gut microbiome. We performed a systematic review and meta-analysis to characterise the association between antibiotic use and colorectal neoplasia.
We searched PubMed, EMBASE, and Web of Science for articles that examined the association between antibiotic exposure and colorectal neoplasia (cancer or adenoma) through 15 December 2019. A total of 6031 citations were identified and 6 papers were included in the final analysis. We assessed the association between the level of antibiotic use (defined as number of courses or duration of therapy) and colorectal neoplasia using a random effects model.
Six studies provided 16 estimates of the association between level of antibiotic use and colorectal neoplasia. Individuals with the highest levels of antibiotic exposure had a 10% higher risk of colorectal neoplasia than those with the lowest exposure (effect size: 1.10, 95% CI 1.01 to 1.18). We found evidence of high heterogeneity (I2=79%, p=0.0001) but not of publication bias.
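The pooled effect size and I² statistic above come from a random effects model. A compact DerSimonian-Laird sketch, run on hypothetical log effect sizes rather than the six included studies' estimates, shows the mechanics:

```python
import math

# Hypothetical per-study log effect sizes and standard errors
# (illustrative only; not the data of the six included studies).
y  = [0.10, 0.05, 0.20, -0.02, 0.15, 0.08]
se = [0.05, 0.06, 0.04, 0.07, 0.05, 0.06]

w = [1 / s**2 for s in se]                            # fixed-effect weights
y_fe = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)  # fixed-effect pooled mean
q = sum(wi * (yi - y_fe)**2 for wi, yi in zip(w, y))  # Cochran's Q
df = len(y) - 1

# DerSimonian-Laird between-study variance estimate (truncated at zero).
tau2 = max(0.0, (q - df) / (sum(w) - sum(wi**2 for wi in w) / sum(w)))

w_re = [1 / (s**2 + tau2) for s in se]                # random-effects weights
pooled = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
se_pooled = math.sqrt(1 / sum(w_re))

# I^2: proportion of total variability due to between-study heterogeneity.
i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
print(f"pooled effect {math.exp(pooled):.2f}, I^2 = {i2:.0f}%")
```

Because τ² inflates every study's variance equally, the random-effects weights are more uniform than the fixed-effect weights, which is why heterogeneous meta-analyses such as this one (I²=79%) give smaller studies relatively more influence.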
Higher levels of antibiotic exposure are associated with an increased risk of colorectal neoplasia. Given the widespread use of antibiotics in childhood and early adulthood, additional research to further characterise this relationship is needed.
The impact of COVID-19 on pregnant inflammatory bowel disease (IBD) patients is currently unknown. Reconfiguration of services during the pandemic may negatively affect medical and obstetric care. We aimed to examine the impacts on IBD antenatal care and pregnancy outcomes.
Retrospective data were recorded in consecutive patients attending for IBD antenatal care including outpatient appointments, infusion unit visits and advice line encounters.
We included 244 pregnant women with IBD, of whom 75 (30.7%) were on biologics; treatment was stopped in 29.3% of these at a median of 28 weeks’ gestation. In addition, 9% of patients were on corticosteroids and 21.5% continued on thiopurines. The care provided during 460 patient encounters was not affected by the pandemic in 94.1%, but 68.2% of encounters were performed by telephone (compared with 3% in prepandemic practice; p<0.0001). One hundred and ten women delivered 111 live babies (mean 38.2 weeks’ gestation, mean birth weight 3324 g), with 12 (11.0%) giving birth before week 37. Birth occurred by vaginal delivery in 72 (56.4%) and by caesarean section in 48 (43.6%) cases; 33 caesarean sections were elective (12 for IBD indications) and 15 were emergencies. Breastfeeding rates were low (38.6%). Among the 244 pregnant women with IBD, 1 suspected COVID-19 infection was recorded.
IBD antenatal care adjustments during the COVID-19 pandemic have not negatively affected patient care. Despite high levels of immunosuppression, only a single COVID-19 infection occurred. Adverse pregnancy outcomes were infrequent.
Patients with short bowel syndrome (SBS) and colon in continuity have better adaptation potential compared with patients with jejunostomy. Adaptation may involve enhanced postprandial secretion of the enteroendocrine hormones glucagon-like peptide (GLP)-1 and GLP-2 which are normally degraded by dipeptidyl peptidase (DPP)-4. Nevertheless, some patients with SBS with colon in continuity suffer from high-volume faecal excretions and have been shown to benefit from treatment with GLP-2. Therefore, we aimed to evaluate efficacy of sitagliptin, a DPP-4 inhibitor, on reducing faecal excretions in this patient group.
In an open-label, case series, proof-of-concept pilot study, 100 mg oral sitagliptin was given two times per day for 8 weeks to patients with SBS with ≥50% colon in continuity with or without the need for parenteral support (PS). To assess intestinal function, metabolic balance studies were done at baseline and following 8 weeks of treatment.
Of the 10 patients planned for enrolment, 8 were included and 7 completed the study. Although postprandial endogenous GLP-2 concentrations increased by a median (min, max) of 49 h×pmol/L (39, 105; p=0.018), sitagliptin did not significantly reduce median faecal wet weight (−174 g/day (−1510, 675); p=0.176) or increase intestinal wet weight absorption. However, heterogeneity in the treatment effect was observed: intestinal wet weight absorption increased in all four patients with intestinal failure. One patient achieved a reduction in PS of 500 mL per administration day.
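The hormone response in units of h×pmol/L is an area under the postprandial concentration-time curve. A trapezoidal sketch with hypothetical sampling times and concentrations (not the study's measurements) illustrates how such a value is obtained:

```python
# Trapezoidal AUC for a postprandial hormone profile.
# Times (hours) and concentrations (pmol/L) are hypothetical examples,
# not measurements from the study.
times = [0.0, 0.5, 1.0, 2.0, 3.0]
conc  = [10.0, 30.0, 40.0, 25.0, 15.0]

# Sum of trapezoid areas over consecutive sampling intervals.
auc = sum(
    (t1 - t0) * (c0 + c1) / 2
    for t0, t1, c0, c1 in zip(times, times[1:], conc, conc[1:])
)
# Incremental AUC subtracts the rectangle under the fasting baseline,
# isolating the postprandial rise.
iauc = auc - conc[0] * (times[-1] - times[0])
print(f"AUC = {auc:.1f} h×pmol/L, incremental AUC = {iauc:.1f} h×pmol/L")
```

For these example values the total AUC is 80 h×pmol/L, of which 50 h×pmol/L is the rise above baseline.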
Following this negative, small pilot study, larger, placebo-controlled, studies are needed to establish the therapeutic potential of DPP-4 inhibition in patients with SBS.
Helicobacter pylori infection is a common cause of chronic gastritis worldwide and an established risk factor for developing gastric malignancy. The endoscopic appearances predicting H. pylori status are an ongoing area of research, as are their diagnostic accuracies. This study aimed to establish the diagnostic accuracy of several mucosal features predictive of H. pylori negative status and formulate a simple prediction model for use at the time of endoscopy.
Patients undergoing high-definition upper gastrointestinal (GI) endoscopy without magnification were recruited prospectively. During the endoscopy, the presence or absence of specific endoscopic findings was noted. Sydney protocol biopsies were used as the diagnostic reference standard, supplemented by a urease test where taken. The results informed a logistic regression model used to produce a simple diagnostic approach. This model was subsequently validated using a further cohort of 30 patients.
153 patients were recruited and completed the study protocol. The prevalence of active H. pylori infection was 18.3% (28/153). The overall diagnostic accuracy of the simple prediction model was 80.0%, and 100% of patients with active H. pylori infection were correctly classified. The presence of regular arrangement of collecting venules (RAC) showed a positive predictive value for H. pylori naïve status of 90.7%, rising to 93.6% for patients under the age of 60.
A simple endoscopic model may be accurate for predicting H. pylori status of a patient, and the need for biopsy-based tests. The presence of RAC in the stomach is an accurate predictor of H. pylori negative status, particularly in patients under the age of 60.
The study was registered with ClinicalTrials.gov, No. NCT02385045.
The utility of routine head CT (HCT) in hepatic encephalopathy (HE) evaluation is unclear. We investigated HCT yield in detecting acute intracranial abnormalities in cirrhotic patients presenting with HE.
Retrospective review of cirrhotic patient encounters with HE between 2016 and 2018 at Beaumont Health, in Michigan was performed. A low-risk (LR) indication for HCT was defined as altered mental status (AMS), which included dizziness and generalised weakness. A high-risk (HR) indication was defined as trauma/fall, syncope, focal neurological deficits (FNDs) or headache. Descriptive statistics and univariate/multivariate analyses by logistic regression were performed using SPSS to identify HCT abnormality correlates.
Five hundred and twenty unique encounters were reviewed. Mean age was 63.4 (12.1) years, 162 (37.5%) had alcoholic cirrhosis and the median Model for End-Stage Liver Disease (MELD) score was 17 (13–23). An LR indication was reported in 408 (78.5%) patients and FNDs in 24 (4.6%) patients. Only 13 (2.5%) patients were found to have an acute intracranial pathology (seven haemorrhagic strokes, two ischaemic strokes, four subdural haematomas). Aspirin use prior to presentation (aOR 4.6, 95% CI 1.1 to 19.2) and an HR indication (aOR 7.3, 95% CI 2.3 to 23.8) were independent correlates of acute intracranial pathology on HCT. Age, sex, MELD score, haemoglobin, platelet count, race and cirrhosis aetiology did not correlate with HCT abnormalities. The number needed to screen to identify one acute pathology was 14 for HR indications versus 82 for LR indications.
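The number needed to screen is simply patients scanned per acute finding. The abstract reports 520 encounters, 408 of them low-risk, and 13 findings overall, but not how the findings split between groups; assuming 5 findings in the low-risk group and 8 in the high-risk group reproduces the reported figures:

```python
# Reported: 520 encounters, 408 with low-risk (LR) indications,
# 13 acute intracranial findings in total. The LR/HR split of the
# findings is an assumption chosen to reproduce the reported NNS values.
total, lr_n, findings = 520, 408, 13
hr_n = total - lr_n               # 112 high-risk (HR) encounters
lr_findings, hr_findings = 5, 8   # assumed split (5 + 8 = 13)

# NNS = encounters scanned per positive finding.
nns_lr = lr_n / lr_findings
nns_hr = hr_n / hr_findings
print(f"NNS low-risk ≈ {round(nns_lr)}, NNS high-risk = {round(nns_hr)}")
```

The sixfold difference in NNS is the quantitative basis for the conclusion that routine HCT in low-risk presentations has low diagnostic yield.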
Routine HCTs in cirrhotic patients presenting with HE with AMS in the absence of history of trauma, headache, syncope, FNDs or aspirin use is of low diagnostic yield.
Fibrotic strictures in the gastrointestinal tract are frequent in Crohn’s disease. Endoscopic dilation is a standard treatment. However, recurrence is common after dilation and there are complications such as bleeding or perforation. Endoscopic treatment using self-expandable metal stents has shown diverging results. The aim of this study was to evaluate the outcome of endoscopic treatment with a self-expandable stent in ileocecal Crohn’s disease.
Patients with Crohn’s disease and a symptomatic ileocecal stricture were eligible for prospective, consecutive inclusion in a single-centre setting. Patients were randomised to treatment with either 18 mm balloon dilatation (GroupDIL) or stenting (GroupSTENT) using a 20 mm diameter, partially covered Hanarostent NCN. Patients were followed for a minimum of 24 months postendoscopy. Outcomes were technical success, adverse events and clinical success (defined as no need for repeated interventions).
Thirteen patients (GroupDIL n=6; GroupSTENT n=7) were included, with 12 patients (GroupDIL n=5; GroupSTENT n=7) eligible for complete follow-up. Technical success was achieved in all cases. Adverse events were borderline significantly more common in GroupSTENT (4/7, 57%; pain: n=3; pain and rectal bleeding: n=1) than in GroupDIL (0/5, 0%), p=0.08, which resulted in preterm termination of the study. The clinical success rate was 6/7 (86%) in GroupSTENT vs 1/5 (20%) in GroupDIL, p=0.07.
Patients with strictures related to Crohn’s disease may benefit from treatment with self-expandable metal stents rather than dilatation. However, there seems to be an increased risk for patient pain after stenting, which has to be considered and handled.
The study was registered at Clinical Trials (NCT04718493).
We evaluated the effect on clinical outcomes of implementing a standardised inpatient order set for patients admitted with hepatic encephalopathy (HE).
A retrospective review of patients with cirrhosis admitted with HE. Hospital admissions for HE for which the electronic health record (EHR) order set was used were compared with admissions where the order set was not used. Primary outcome was length of hospital stay (LOS). Secondary outcomes were 30-day readmissions, in-hospital complications, in-hospital and 90-day mortality.
There were 341 patients with 980 admissions over the study period: 263 patients with 736 admissions where the order set was implemented, and 78 patients with 244 admissions where the order set was not implemented. Median LOS was 4 days (IQR 3–8) in the order set group compared with 3 days (IQR 2–7) (p<0.001); incidence rate ratio 1.37 (95% CI 1.20 to 1.57), p<0.001. 30-day readmissions rate was 56% in the order set group compared with 40%, p=0.01; OR for readmission was 1.88 (95% CI 1.04 to 3.43), p=0.04. Hypokalaemia occurred in 46% of admissions with order set use compared with 36%, when the order set was not used; p=0.003, OR 1.72 (95% CI 1.22 to 2.43), p=0.002. No significant differences were seen for in-hospital mortality and 90-day mortality.
Implementation of an inpatient EHR order set for use in patients with HE was associated with unexpected clinical outcomes including increased LOS and readmissions. The convenience and advantages of standardisation of patient care should be balanced with a degree of individualisation, particularly in the care of medically complex patients. Furthermore, standardised processes should be evaluated frequently after implementation to assess for unintended consequences.
Peritoneal or mesenteric tumours may correspond to several tumour types or tumour-like conditions, some of which are forms of histiocytosis. This rare condition often poses diagnostic difficulties that can lead to substantial delays in starting targeted therapies. Our aim was to describe the main features of histiocytoses with mesenteric localisation in order to improve the diagnostic process.
We performed a retrospective study on 22 patients, whose peritoneal/mesenteric biopsies were infiltrated by histiocytes.
Abdominal pain was the revealing symptom in 10 cases, and 19 patients underwent surgical biopsies. The diagnosis of histiocytosis was proposed by the initial pathologists in 41% of patients. The other initial diagnoses were inflammation (n=7), sclerosing mesenteritis (n=4) and liposarcoma (n=1). The CD163+/CD68+/CD1a− histiocytes infiltrated the subserosa and/or deeper adipose tissue in 16 and 14 cases, respectively. A BRAF V600E mutation was detected within the biopsies in 11 cases, and two others were MAP2K1 mutated. The final diagnosis was histiocytosis in 18 patients, 15 of whom had Erdheim-Chester disease. The median diagnostic delay for histiocytosis was 9 months. Patients treated with BRAF or MEK inhibitors showed a partial response or stable disease. One patient died soon after surgery, and five died of disease progression.
Diagnosis of masses arising in the mesentery should be carefully explored as one of the possibilities in histiocytosis. This diagnosis is frequently missed on mesenteric biopsies. Molecular biology for detecting the mutations in BRAF or in genes of the MAP kinase pathway is a critical diagnostic tool.
Percutaneous endoscopic gastrostomy is a commonly used endoscopic technique in which a tube is placed through the abdominal wall, mainly to administer fluids, drugs and/or enteral nutrition. Several placement techniques are described in the literature, with the ‘pull’ technique (Ponsky-Gauderer) the most popular. Independent of the method used, placement includes a ‘blind’ perforation of the stomach through a small acute surgical abdominal wound. It is a generally safe technique with few major complications. Nevertheless, these complications can sometimes be life-threatening or generate serious morbidity.
A narrative review of the literature on major complications of percutaneous endoscopic gastrostomy.
This review was written from a clinical viewpoint, focusing on the prevention and management of major complications, and combines documented scientific evidence with real cases from more than 20 years of clinical practice.
Major complications are rare, but prevention, early recognition and proper management are important.
The differential diagnosis and management of seronegative enteropathies is challenging due to the rarity of these conditions, the overlap of clinical and histopathological features and the current lack of an international consensus on their nomenclature.
This is a narrative review providing a pragmatic guide to the investigation and clinical management of seronegative enteropathies in adults, based on the available literature and our clinical experience.
Seronegative coeliac disease is the most frequent cause among the heterogeneous group of seronegative enteropathies; its diagnosis is confirmed by the clinical and histological response to a gluten-free diet after the exclusion of other causes of villous atrophy. Correct identification and targeted management of seronegative enteropathies are mandatory because clinical outcomes and prognosis vary.
Following the disruption of normal paediatric inflammatory bowel disease (IBD) services during the peak of the COVID-19 pandemic, we prospectively audited the first-time use of home faecal calprotectin testing. We aimed to provide an alternative to laboratory tests and to assess the value of home testing as part of our regular services going forward.
Home test kits, together with user instructions, were made available to our paediatric patients with IBD who required a faecal calprotectin test between 17 April and 12 August 2020. Once a user completed the test, the results were automatically uploaded to the result portal and clinical staff were alerted. A feedback questionnaire was sent to users who had completed the home test.
Of the 54 patients, 41 (76%), aged between 4.7 and 18.1 years, used the home test. A total of 45 home tests were done, one of which produced an invalid result. The decision to modify management was made in 12 (29%) of the patients, while 14 (34%) had no changes made and 15 (37%) required further assessment. Twenty (48.8%) responded to the questionnaire and 85% stated that they preferred the home test to the laboratory testing method.
Home calprotectin tests were useful in guiding clinical management during a time when laboratory testing was less available. They may offer benefits as part of routine paediatric IBD monitoring to help target appointments and reduce unnecessary hospital attendances in the future.
Despite clear evidence that weight loss via nutritional and physical activity changes improves histological outcomes in non-alcoholic fatty liver disease (NAFLD), many patients struggle to implement and maintain these health behaviour changes. The aim of this study was to characterise disease knowledge, attitudes and behaviours among persons with NAFLD and to identify the factors driving these health behaviours and perceptions.
We conducted semistructured interviews among patients with NAFLD. We used purposeful sampling to enrol equivalent percentages based on age and sex, and enrolled approximately one-third of patients with cirrhosis to capture those perspectives. Interviews were conducted until thematic saturation was achieved. Transcripts were coded using NVivo software to identify themes and subthemes.
A total of 29 patient interviews were completed. Ambiguity about the diagnosis and aetiology of their liver disease was a key theme, though the vast majority of patients were aware that weight loss via nutrition and exercise was the primary therapy. Most patients were asymptomatic, diagnosed incidentally, and reported a low level of concern regarding their diagnosis. The primary barriers and facilitators to health behaviour change were the presence of social support, competing medical comorbidities and low motivation to change behaviours.
Although patients are aware that lifestyle interventions are the primary therapy for NAFLD, there is a gap in knowledge about the condition. The presence of social support and competing medical comorbidities were the most consistent facilitators and barriers to lifestyle change. Tailoring treatment recommendations to provide relevant disease education, specific nutrition and exercise regimens, and personalised approaches based on specific individual barriers and facilitators will likely aid in uptake and maintenance of first-line therapy for NAFLD.
Due to high rates of obesity and alcohol consumption, the prevalence of fatty liver disease is increasing. There is no widely adopted approach to proactively screen for liver disease in the community. We aimed to assess the burden of potentially undiagnosed liver disease in individuals attending for colonoscopy to develop a pathway to identify and manage individuals with undiagnosed liver disease.
The OSCAR Study was a cross-sectional study recruiting patients attending for colonoscopy. Patients’ metabolic and liver risk factors were measured. The prevalence of undiagnosed significant fatty liver disease was measured using the Fatty Liver Index (FLI) and Fibrosis-4 score (FIB-4).
1429 patients (mean age 59±14 years; 48.8% men) were recruited. 73.3% were overweight/obese, 12.7% had diabetes and 17.9% had metabolic syndrome. 19% were consuming more than the recommended alcohol limit (14 units/week) and 41% had an AUDIT-C score ≥5. After excluding those with known liver disease, 43.2% of the cohort had a high FLI (high likelihood of fatty liver). 5.3% of these had a high FIB-4 score (>2.67, high probability of advanced fibrosis) and 90% of these were previously undiagnosed. 818 patients had a predicted 10-year cardiovascular event risk of ≥10%; however, only 377 (46.1%) were on statin therapy.
High levels of obesity, metabolic dysfunction and undiagnosed fatty liver disease were found in individuals attending for colonoscopy. Clinical encounters in the endoscopy unit may represent an opportunity to risk assess for liver and metabolic disease and provide an environment to develop targeted interventions.
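Both risk scores used in the OSCAR Study have closed-form published definitions. A minimal sketch using the standard published formulas (the study's exact laboratory units and thresholds should be checked against the original score publications; the patient values below are hypothetical):

```python
import math

def fib4(age_years, ast_u_l, alt_u_l, platelets_10e9_l):
    # Fibrosis-4 index; >2.67 is taken above as high probability of advanced fibrosis
    return (age_years * ast_u_l) / (platelets_10e9_l * math.sqrt(alt_u_l))

def fatty_liver_index(triglycerides_mg_dl, bmi_kg_m2, ggt_u_l, waist_cm):
    # Fatty Liver Index (Bedogni et al); >=60 indicates a high likelihood of fatty liver
    l = (0.953 * math.log(triglycerides_mg_dl) + 0.139 * bmi_kg_m2
         + 0.718 * math.log(ggt_u_l) + 0.053 * waist_cm - 15.745)
    return 100 * math.exp(l) / (1 + math.exp(l))

# Hypothetical patient values, for illustration only
print(round(fib4(59, 40, 35, 150), 2))
print(round(fatty_liver_index(180, 31.0, 60, 105), 1))
```

A two-step triage of this kind first flags likely fatty liver with the FLI, then applies the FIB-4 only to that subgroup to identify probable advanced fibrosis.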
Colorectal cancer (CRC) is the fourth most common cancer in the UK. Symptomatic patients are referred via an urgent pathway and, although most are investigated with colonoscopy, <4% are diagnosed with cancer. There is therefore a need for a suitable triage tool to prioritise investigations. This study retrospectively examined the performance of various triage tools in patients awaiting investigation on the urgent lower gastrointestinal cancer pathway.
All patients over 40 years of age on the urgent pathway awaiting investigation for suspected CRC on 1 May were included. After 6 months, outcomes were evaluated and the performance of the faecal immunochemical test (FIT), faecal haemoglobin concentration, age and sex test (FAST) and the artificial intelligence algorithm ColonFlag were examined.
532 patients completed investigations and received a diagnosis; 15 had CRC. 388 had a valid FIT result, of whom 11 had CRC; a FAST score ≥4.5 had a sensitivity of 72.7% and a specificity of 80.6%, and would have failed to detect three tumours. Faecal haemoglobin (f-Hb) at a cut-off of 10 µg/g and ColonFlag had equal sensitivity (81.82%); ColonFlag had greater specificity (73.47% vs 64.99%). Both tests would have failed to detect two tumours, but not in the same patients; when used in combination, sensitivity and specificity were 100% and 49.4%, respectively. When ColonFlag was applied to the cohort of 532, an additional four tumours would have been detected in patients without a valid FIT.
This study showed ColonFlag to have equal sensitivity to, and greater specificity than, f-Hb at a cut-off of 10 µg/g as a triage tool for CRC.
A high quality end-expiratory breath sample is required for a reliable gastrointestinal breath test result. Oxygen (O2) concentration in the breath sample can be used as a quality marker. This study investigated the characteristics of O2 concentration in the breath sample and the impact of using a correction factor in real-time breath measurement.
This study includes two separate groups of patient data. Part 1 of the study analysed patients’ ability to deliver end-expiratory breath samples over a 2-year period (n=564). Part 2 of the study analysed a separate group of patients (n=47) with additional data to investigate the O2 characteristics and the role of a correction factor in breath testing.
The results indicated that 95.4% of the 564 patients were able to achieve an O2 concentration below 14% in their end-expiratory breath. Part 2 of the study revealed that the distribution of O2 concentration was between 9.5% and 16.2%. Applying a correction factor to predict the end-expiratory H2 and CH4 values led to average measurement errors of –36.4% and –12.8%, respectively.
The majority of patients are able to deliver a high-quality end-expiratory breath sample, regardless of age or gender. The correction factor algorithm is unreliable when predicting the end-expiratory result at 15% O2 and would have produced a false-negative result for 50% of the positive cases in this study. The findings also indicate that continuous O2 measurement is essential to ensure breath sample quality by preventing secondary breathing during real-time breath collection.
Transjugular intrahepatic portosystemic stent shunt (TIPSS) is clinically effective in variceal bleeding and refractory ascites; however, the cost-effectiveness of TIPSS has yet to be evaluated in the UK. This study aimed to establish the cost-effectiveness of (i) pre-emptive TIPSS versus endoscopic band ligation (EBL) in populations with variceal bleeding and (ii) TIPSS versus large volume paracentesis (LVP) in refractory ascites.
A cost-utility analysis was conducted with the perspective including healthcare costs and quality-adjusted life years (QALYs). A Markov model was constructed with a 2-year time horizon, health states for mortality and survival and probabilities for the development of variceal bleeding, ascites and hepatic encephalopathy. A survival analysis was conducted to extrapolate 12-month to 24-month mortality for the refractory ascites indication. Uncertainty was analysed in deterministic and probabilistic sensitivity analyses.
TIPSS was cost-effective (dominant) and cost saving for both indications. For variceal bleeding, pre-emptive TIPSS resulted in 0.209 additional QALYs, and saved £600 per patient compared with EBL. TIPSS had a very high probability of being cost-effective (95%) but was not cost saving in scenario analyses driven by rates of variceal rebleeding. For refractory ascites, TIPSS resulted in 0.526 additional QALYs and saved £17 983 per patient and had a 100% probability of being cost-effective and cost saving when compared with LVP.
TIPSS is a cost-effective intervention for variceal bleeding and refractory ascites, and is highly cost saving for refractory ascites. Robust randomised trial data are required to confirm whether pre-emptive TIPSS is cost saving for variceal bleeding.
Exercise is emerging as a therapy in oncology for its physical and psychosocial benefits and potential effects on chemotherapy tolerability and efficacy. However, evidence from randomised controlled trials (RCTs) supporting exercise in patients with borderline resectable or locally advanced pancreatic cancer (PanCa) undergoing neoadjuvant therapy (NAT) is lacking.
The EXPAN trial is a dual-centre, two-armed, phase I RCT. Forty patients with borderline resectable or locally advanced PanCa undergoing NAT will be randomised equally to an exercise intervention group (individualised exercise+standard NAT) or a usual care control group (standard NAT). The exercise intervention will be supervised and consist of moderate to vigorous intensity resistance and aerobic-based training undertaken twice a week for 45–60 min per session for a maximum period of 6 months. The primary outcome is feasibility. Secondary outcomes are patient-related and treatment-related endpoints, objectively measured physical function, body composition, psychological health and quality of life. Assessments will be conducted at baseline, prior to potential alteration of treatment (~4 months postbaseline), at completion of the intervention (maximum 6 months postbaseline) and at 3 and 6 months postintervention (maximum 9 and 12 months postbaseline).
The EXPAN trial has been approved by Edith Cowan University (reference no.: 2020-02011-LUO), Sir Charles Gairdner Hospital (reference no.: RGS 03956) and St John of God Subiaco Hospital (reference no.: 1726). The study results will be presented at national/international conferences and submitted for publications in peer-reviewed journals.
ACTRN12620001081909.
Pancreatic exocrine insufficiency is a finding in many conditions, predominantly affecting those with chronic pancreatitis, pancreatic cancer and acute necrotising pancreatitis. Patients with pancreatic exocrine insufficiency can experience gastrointestinal symptoms, maldigestion, malnutrition and adverse effects on quality of life and even survival.
There is a need for readily accessible, pragmatic advice for healthcare professionals on the management of pancreatic exocrine insufficiency.
A review of the literature was conducted by a multidisciplinary panel of experts in pancreatology, recommendations for clinical practice were produced and the strength of the evidence was graded. Consensus voting by 48 pancreatic specialists from across the UK took place at the 2019 annual scientific meeting of the Pancreatic Society of Great Britain and Ireland.
Recommendations for clinical practice in diagnosis, initial management, patient education and long-term follow-up were developed. All recommendations achieved over 85% consensus and are included within these comprehensive guidelines.
Global survival studies have shown favourable development in colon and rectal cancers but few studies have considered extended periods or covered populations for which medical care is essentially free of charge.
We analysed colon and rectal cancer survival in Finland and Sweden over a 50-year period (1967–2016) using data from the Nordcan database. In addition to the standard 1-year and 5-year survival rates, we calculated the difference between these as a novel measure of how well survival was maintained between years 1 and 5.
Relative 1-year and 5-year survival rates have developed favourably, without major shifts, for men and women in both countries. For Finnish men, 1-year survival in colon cancer increased from 50% to 82%, and in rectal cancer from 62% to 85%. Swedish 1-year survival was a few percentage units better, but 5-year survival was equal. Survival of female patients for both cancers was somewhat better than that of men throughout the 50 years. Overall, survival gains were higher in the early than in the late follow-up periods, and were smallest in the last 10 years. The difference between 1-year and 5-year survival in colon cancer was essentially unchanged over the 50-year period, while in rectal cancer there was a large improvement.
The gradual positive development in survival suggests a contribution from many small improvements rather than single breakthroughs. The improvement in 5-year survival in colon cancer was almost entirely driven by improvement in 1-year survival, while in rectal cancer the positive development extended to survival past year 1, probably due to successful curative treatments. The current challenges are to reinvigorate the apparently stalled positive development and to extend the gains to older patients. For colon cancer, survival gains need to be extended past year 1 after diagnosis.
Hepatic encephalopathy (HE) is defined as brain dysfunction that occurs because of acute liver failure or liver cirrhosis and is associated with significant morbidity and mortality. Lactulose remains the standard of care to date; however, polyethylene glycol (PEG) has gained the attention of multiple investigators.
We screened five databases namely PubMed, Scopus, Web of Science, Cochrane Library and Embase from inception to 10 February 2021. Dichotomous and continuous data were analysed using the Mantel-Haenszel and inverse variance methods, respectively, which yielded a meta-analysis comparing PEG versus lactulose in the treatment of HE.
Four trials with 229 patients were included. Compared with lactulose, the pooled effect size demonstrated a significantly lower average HE Scoring Algorithm (HESA) score at 24 hours (mean difference (MD)=–0.68, 95% CI (–1.05 to –0.31), p<0.001), a higher proportion of patients with a reduction in HESA score by ≥1 grade at 24 hours (risk ratio (RR)=1.40, 95% CI (1.17 to 1.67), p<0.001), a higher proportion of patients with a HESA score of grade 0 at 24 hours (RR=4.33, 95% CI (2.27 to 8.28), p<0.001) and a shorter time to resolution of HE (MD=–1.45, 95% CI (–1.72 to –1.18), p<0.001), all in favour of patients treated with PEG.
PEG leads to a greater reduction in HESA score and thus to faster resolution of HE compared with lactulose.
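The inverse-variance method named in the methods weights each trial's mean difference by the reciprocal of its squared standard error. A fixed-effect sketch with invented per-trial values (not the review's data):

```python
import math

def pool_mean_difference(trials):
    # trials: list of (mean_difference, standard_error) tuples, one per study
    weights = [1 / se ** 2 for _, se in trials]
    md = sum(w * m for (m, _), w in zip(trials, weights)) / sum(weights)
    se = math.sqrt(1 / sum(weights))
    return md, (md - 1.96 * se, md + 1.96 * se)

# Hypothetical per-trial mean differences in 24-hour HESA score
md, (lo, hi) = pool_mean_difference([(-0.7, 0.25), (-0.5, 0.30), (-0.9, 0.40)])
print(round(md, 2), (round(lo, 2), round(hi, 2)))
```

Because more precise trials (smaller standard errors) receive larger weights, the pooled estimate sits closest to the tightest study, and the pooled confidence interval is narrower than any single trial's.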
Measuring patient experience of gastrointestinal (GI) procedures is a key component of evaluation of quality of care. Current measures of patient experience within GI endoscopy are largely clinician derived and measured; however, these do not fully represent the experiences of patients themselves. It is important to measure the entirety of experience and not just experience directly during the procedure. We aimed to develop a patient-reported experience measure (PREM) for GI procedures.
Phase 1: semi-structured interviews were conducted in patients who had recently undergone GI endoscopy or CT colonography (CTC) (included as a comparator). Thematic analysis identified the aspects of experience important to patients. Phase 2: a question bank was developed from phase 1 findings, and iteratively refined through rounds of cognitive interviews with patients who had undergone GI procedures, resulting in a pilot PREM. Phase 3: patients who had attended for GI endoscopy or CTC were invited to complete the PREM. Psychometric properties were investigated. Phase 4 involved item reduction and refinement.
Phase 1: interviews with 35 patients identified six overarching themes: anxiety, expectations, information & communication, embarrassment & dignity, choice & control, and comfort. Phase 2: cognitive interviews refined questionnaire items and response options. Phase 3: the PREM was distributed to 1650 patients, of whom 799 (48%) completed it. Psychometric properties were found to be robust. Phase 4: the final questionnaire was refined to include 54 questions assessing patient experience across five temporal procedural stages.
This manuscript gives an overview of the development and validation of the Newcastle ENDOPREM™, which assesses all aspects of the GI procedure experience from the patient perspective. It may be used to measure patient experience in clinical care and, in research, to compare patients’ experiences of different endoscopic interventions.
Endosonography (EUS) is a useful but complex diagnostic modality which requires advanced endoscopy training and guidance by a supervisor. Since learning curves vary among individuals, assessment of the actual competence among EUS trainees is important.
We designed a novel assessment tool entitled Global Assessment of Performance and Skills in EUS (GAPS-EUS) for assessing skills among EUS trainees. Five quality indicators were marked on a five-grade scale by the supervisor (Observer Score) and by the trainee (Trainee Score). Trainees were included in two high-volume centres (Gothenburg, Sweden, and Bologna, Italy). Outcomes were feasibility, patient safety, reliability, and validity of GAPS-EUS in trainee-performed EUS procedures.
Twenty-two EUS trainees were assessed in a total of 157 EUS procedures, with a completion rate of 157/157 (100%) and a patient adverse event rate of 2/157 (1.3%; gastroenteritis n=1, fever n=1). GAPS-EUS showed high measurement reliability (Cronbach’s alpha coefficient=0.87) and high inter-rater reliability between supervisor and trainee (r=0.83, r²=0.69, p<0.001). The construct validity of GAPS-EUS was verified by comparing low-level and high-level performance procedures, and the content validity by recording that the EUS-FNA manoeuvre scored lower than other aspects of EUS: 3.07 (95% CI 2.91 to 3.23) vs 3.51 (95% CI 3.37 to 3.65) (p<0.001). External validity was confirmed by similar findings in both centres.
GAPS-EUS is an easy-to-use and reliable tool with a recorded high validity for the assessment of competence among trainees in EUS. It can be recommended to centres involved in the education of future endosonographers.
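The reliability coefficient reported above (Cronbach's alpha) is computed from the per-item score variances and the variance of the total score: alpha = k/(k−1) × (1 − Σ item variances / total-score variance). A small worked sketch with made-up ratings:

```python
def cronbach_alpha(items):
    # items: one list of scores per questionnaire item, respondents in the same order
    k, n = len(items), len(items[0])

    def sample_var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(sample_var(it) for it in items) / sample_var(totals))

# Three hypothetical five-point items rated by five respondents
ratings = [[1, 2, 3, 4, 5], [2, 2, 3, 4, 5], [1, 3, 3, 4, 4]]
print(round(cronbach_alpha(ratings), 2))
```

When items move together across respondents, the total-score variance dwarfs the sum of item variances and alpha approaches 1; the 0.87 reported for GAPS-EUS indicates that its five quality indicators behave as a coherent scale.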
Anal adenocarcinoma is a rare malignancy with a poor prognosis.
We present a case of rare anal adenocarcinoma in a patient with a normal screening colonoscopy. Using the Surveillance, Epidemiology and End Results database for 2000–2016, we performed a survival analysis among individuals >20 years old, comparing anal and rectal cancers.
Survival analysis showed that anal adenocarcinoma is associated with worse outcomes compared with rectal adenocarcinoma and anal squamous cell carcinoma.
This case and survival data illustrate the importance of prompt investigation of symptoms irrespective of colorectal cancer screening status with careful attention to examination of the anal area.
Serum albumin is used as a marker of acute inflammation. Several studies have addressed the association between serum albumin and clinical outcome in patients with ulcerative colitis (UC). While mucosal healing (MH) has been indicated as the therapeutic goal for UC, the association between serum albumin and MH remains unclear. We evaluated this issue in patients with UC overall and explored whether duration of UC affected this association.
This cross-sectional study recruited consecutive patients with UC. Study subjects consisted of 273 Japanese patients with UC. Serum albumin was divided into tertiles based on its distribution in all study subjects. One endoscopy specialist was responsible for measuring partial MH and MH, which were defined as a Mayo endoscopic subscore of 0–1 and 0, respectively. The association between serum albumin and clinical outcomes was assessed by multivariate logistic regression.
Rates of clinical remission, partial MH and MH were 57.9%, 63% and 26%, respectively. Only high serum albumin (>4.4 g/dL) was significantly positively associated with MH (OR 2.29 (95% CI 1.03 to 5.29), p for trend=0.043). In patients with short UC duration (<7 years) only, high serum albumin was significantly positively associated with MH and clinical remission. In contrast, in patients with long UC duration (≥7 years), no association between serum albumin and clinical outcomes was found.
In Japanese patients with UC, serum albumin was significantly positively associated with MH. In patients with short UC duration, serum albumin might be a useful complementary marker for MH.
COVID-19 has put a strain on regular healthcare worldwide. For inflammatory bowel disease (IBD), gastrointestinal surgeries were postponed and changes in treatment and diagnostic procedures were made. As abrupt changes in treatment regimens may result in increased morbidity and consequently affect the well-being of patients with IBD, the aim of this study was to determine the effect of the COVID-19 pandemic on health-related quality of life (HRQoL) in patients with IBD.
All patients with IBD who completed both the Inflammatory Bowel Disease Questionnaire (IBDQ) and the 36-Item Short Form Health Survey (SF-36) between 31 August and 13 September 2020 were included in our cohort study. The primary end point was the HRQoL in patients with IBD, measured by the IBDQ and SF-36. The secondary end point was to determine which factors influence HRQoL in patients with IBD.
582 patients with IBD filled in the IBDQ and SF-36 questionnaire. The HRQoL in our study population was low according to the questionnaires on both physical and mental subscales. In addition, multivariate analysis showed that increased age, female sex and patients who underwent surgery had a significantly lower HRQoL, most frequently on the physical domains in both questionnaires.
Patients with IBD had an overall low HRQoL during the COVID-19 pandemic. Furthermore, older patients, women and patients who underwent surgical procedures had the lowest physical HRQoL.
The effectiveness of early cholecystectomy for the treatment of gallstone disease is uncertain compared with conservative management/delayed cholecystectomy.
To synthesise the treatment outcomes of early cholecystectomy versus conservative management/delayed cholecystectomy in terms of safety and effectiveness.
We systematically searched for randomised controlled trials investigating the effectiveness of early cholecystectomy compared with conservative management/delayed cholecystectomy. We pooled risk ratios with 95% CIs and also estimated the adjusted number needed to treat to harm.
Of the 40 studies included in the systematic review, 39 studies with 4483 patients were included in the meta-analysis. The risk ratios of gallstone complications for pain (0.38, 0.20 to 0.74), cholangitis (0.52, 0.28 to 0.97) and total biliary complications (0.33, 0.20 to 0.55) were significantly lower with early cholecystectomy. Adjusted numbers needed to treat to harm for early cholecystectomy compared with conservative management/delayed cholecystectomy were: pain 12.5 (8.3 to 33.3), biliary pancreatitis >1000 (50 to 100), common bile duct stones 100 (33.3 to 100), cholangitis 100 (25 to 100), total biliary complications 5.9 (4.3 to 9.1) and mortality >1000 (100 to 100 000).
Early cholecystectomy may result in fewer biliary complications and a reduction in reported abdominal pain than conservative management.
2020 CRD42020192612.
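A number needed to treat can be derived from a pooled risk ratio once a control-group event rate is assumed: the absolute risk reduction is CER × (1 − RR), and the NNT is its reciprocal. A sketch using the pain risk ratio above with a hypothetical control event rate (not a value from the review):

```python
def nnt_from_risk_ratio(control_event_rate, risk_ratio):
    # Absolute risk reduction implied by the risk ratio, then its reciprocal
    arr = control_event_rate * (1 - risk_ratio)
    return 1 / arr

# Pain risk ratio 0.38, with an assumed 20% control-group event rate
print(round(nnt_from_risk_ratio(0.20, 0.38), 1))
```

This dependence on the assumed control event rate is why pooled NNTs are usually reported as 'adjusted' and accompanied by wide interval estimates, as in the abstract above.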
Adenoma detection rate (ADR) and sessile serrated lesion detection rate (SSLDR) vary among physicians. We sought to determine physician characteristics associated with ADR and SSLDR in a population-based colon screening programme.
Retrospective study of 50–74 year olds with positive faecal immunochemical test and colonoscopy from 15/11/2013 to 31/12/2018. Physician characteristics included: gender, specialty, year and country of medical school graduation, colonoscopy volume and Direct Observation of Procedural Skills (DOPS) performance. Multivariable regression was performed on the following dependent variables: ADR, advanced ADR, proximal and distal ADR, SSLDR, proximal and distal SSLDR.
104 326 colonoscopies were performed by 261 physicians. A higher ADR was associated with gastroenterology (OR for general surgery 0.87, 95% CI 0.80 to 0.95; OR for general/family/internal medicine 0.70, 95% CI 0.55 to 0.88), fewer years since graduation (OR for graduation
Higher ADR, SSLDR and proximal SSLDR were associated with gastroenterology specialty and improved performance on DOPS.
Clinical guidelines recommend weight loss to manage non-alcoholic fatty liver disease (NAFLD). However, the majority of patients find weight loss a significant challenge. We identified factors associated with engagement and adherence to a low-energy diet (LED) as a treatment option for NAFLD.
23 patients with NAFLD enrolled in a LED (~800 kcal/day) were individually interviewed. Transcripts were thematically analysed.
14/23 patients achieved ≥10% weight loss, 18/23 achieved ≥7% weight loss and 19/23 achieved ≥5% weight loss. Six themes were generated from the data. A desire to achieve rapid weight loss to improve liver health and prevent disease progression was the most salient facilitator to engagement. Early and significant weight loss, accountability to clinicians and regular appointments with personalised feedback were facilitators to engagement and adherence. The desire to receive positive reinforcement from a consultant was a frequently reported facilitator to adherence. Practical and emotional support from friends and family members was critically important outside of the clinical setting. Irregular working patterns preventing attendance at appointments was a barrier to adherence and completion of the intervention.
Engagement and adherence to a LED in patients with NAFLD were encouraged by early and rapid weight loss, personalised feedback and positive reinforcement in the clinical setting combined with ongoing support from friends and family members. Findings support those identified in patients who completed a LED to achieve type 2 diabetes remission and highlight the importance of behaviour change support during the early stages of a LED to promote adherence.
Tumour necrosis factor signalling via the receptor-interacting protein kinase 1 (RIPK1) pathway regulates colonic inflammation suggesting that RIPK1 inhibition may be a potential therapeutic target in ulcerative colitis (UC). This phase IIa, randomised, double-blind experimental medicine study investigated the safety, pharmacokinetics (PK), pharmacodynamics (PD) and preliminary efficacy of the RIPK1 inhibitor GSK2982772 in patients with active UC.
In part A, prior to a protocol amendment, one patient was randomised to receive GSK2982772 60 mg twice daily for 42 days. After the amendment, patients were randomised 2:1 to receive GSK2982772 60 mg or placebo three times daily for 42 days. In part B, all patients switched to open-label GSK2982772 60 mg three times daily for 42 days. Safety, PK, PD biomarkers, histological disease activity, clinical efficacy and quality of life were assessed at days 43 and 85.
Thirty-six patients were randomised (n=12, placebo/open-label GSK2982772; n=24, GSK2982772/open-label GSK2982772). Most adverse events were mild, with headache reported most frequently across groups (placebo/open-label GSK2982772, n=2 (17%); GSK2982772/open-label GSK2982772, n=8 (33%)). GSK2982772 was well distributed into colonic tissue, with generally higher concentrations in colonic biopsy samples than in plasma. No apparent differences between treatment groups were observed for PD, histological disease activity, clinical disease activity or quality-of-life measures. At screening, all patients had Mayo endoscopic scores of 2 or 3. At day 43, no patients in the placebo/open-label GSK2982772 group achieved a Mayo endoscopic score of 0 or 1 vs 3/24 (13%) in the GSK2982772/open-label GSK2982772 group. At day 85, 1/9 (11%) achieved scores of 0 or 1 for placebo/open-label GSK2982772 vs 3/22 (14%) for GSK2982772/open-label GSK2982772.
GSK2982772 was generally well tolerated, with no treatment-related safety concerns identified. However, no significant differences in efficacy were observed between treatment groups, suggesting that GSK2982772 as monotherapy is not a promising treatment for patients with active UC.
Non-alcoholic fatty liver disease is a prohaemostatic state with abnormal primary, secondary and tertiary haemostasis. Plasminogen activator inhibitor (PAI)-1 is the best-established marker of prohaemostasis in non-alcoholic fatty liver disease. While epidemiological studies demonstrate that patients with decompensated non-alcoholic steatohepatitis (NASH) cirrhosis have increased rates of venous thromboembolism, including portal vein thrombosis, mechanistic studies have focused exclusively on patients without cirrhosis or with compensated cirrhosis. We aimed to characterise PAI-1 levels in decompensated NASH cirrhosis.
PAI-1 level was measured in consecutive adult liver transplant recipients immediately prior to liver transplantation. Multivariable models were constructed using linear regression to assess factors related to PAI-1 level.
Forty-six subjects with a mean age of 57 (IQR 53–62) years and a Model for End-Stage Liver Disease (MELD) score of 34 (IQR 30–40) were enrolled. Baseline characteristics were similar between NASH (n=10) and non-NASH (n=36) subjects except for rates of diabetes and hyperlipidaemia. Mean PAI-1 level was greater in NASH (53.9, 95% CI 33.3 to 74.5 ng/mL) than in non-NASH (36.1, 95% CI 28.7 to 43.5) subjects, p=0.040. NASH remained independently predictive of pretransplant PAI-1 level on adjusted multivariable modelling (β 40.13, 95% CI 14.41 to 65.86, p=0.003).
PAI-1 level is significantly elevated in decompensated NASH cirrhosis independent of other prohaemostatic factors. This may explain the greater rates of venous thromboembolism in decompensated NASH cirrhosis. Future studies focusing on the prevention of venous thromboembolism in this population are paramount to improving patient-oriented outcomes, given the high morbidity and mortality of venous thromboembolism and its significant impact on transplant candidacy.
The most common fatal complication of liver cirrhosis is haemorrhaging caused by variceal rupture. The prevention of the first variceal bleed is, therefore, an important clinical goal. Little is known about patients’ experience of treatments geared towards this, or of their perceptions of treatments prior to being exposed to them.
To explore the factors impacting patient preference for, and actual experience of carvedilol and variceal band ligation.
Semistructured interviews were conducted with 30 patients from across the UK at baseline, prior to random allocation to either carvedilol or variceal band ligation. Twenty patients were interviewed a second time at 6-month follow-up. Five patients who declined the trial were also interviewed. Data were analysed using thematic analysis.
There was no clear preference for either treatment pathway at baseline. Key factors reported by patients to influence their treatment preference included: negative experiences with key treatment processes; how long-term or short-term treatment was perceived to be; treatment misconceptions; concerns around polypharmacy and worries around treatment adherence. Patient treatment experience was influenced by their perceptions of treatment effectiveness; clinical surveillance; clinician interaction and communication, or lack thereof. Carvedilol-specific experience was also influenced by the manifestation of side effects and patient dosage routine. Variceal band ligation-specific experience was positively influenced by the use of sedation, and negatively influenced by the procedure recovery period.
These data do not support a view that the patient experience of beta-blockade for prevention of variceal bleeds is likely to be superior to variceal band ligation.
Clinical data comparing diagnostic strategies in the management of Helicobacter pylori-associated diseases are limited. Invasive and non-invasive diagnostic tests for detecting H. pylori infection are used in the clinical care of patients with dyspeptic symptoms. Modelling studies might help to identify the most cost-effective strategies. The objective of the study is to assess the cost-effectiveness of a ‘test-and-treat’ strategy with the urea breath test (UBT), compared with other strategies, in managing patients with H. pylori-associated dyspepsia and preventing peptic ulcer in the UK.
Cost-effectiveness models compared four strategies: ‘test-and-treat’ with either UBT or faecal antigen test (FAT), ‘endoscopy-based strategy’ and ‘symptomatic treatment’. A probabilistic cost-effectiveness analysis was performed using a simulation model in order to identify probabilities and costs associated with relief of dyspepsia symptoms (over a 4-week time horizon) and with prevention of peptic ulcers (over a 10-year time horizon). Clinical and cost inputs to the model were derived from routine medical practice in the UK.
For relief of dyspepsia symptoms, ‘test-and-treat’ strategies with either UBT (€526/success) or FAT (€518/success) were the most cost-effective strategies compared with the ‘endoscopy-based strategy’ (€1317/success) and ‘symptomatic treatment’ (€1029/success). For the prevention of peptic ulcers, ‘test-and-treat’ strategies with either UBT (€208/ulcer avoided/year) or FAT (€191/ulcer avoided/year) were the most cost-effective strategies compared with the ‘endoscopy-based strategy’ (€717/ulcer avoided/year) and ‘symptomatic treatment’ (€651/ulcer avoided/year) (1 EUR=0.871487 GBP at the time of the study).
‘Test-and-treat’ strategies with either UBT or FAT are the most cost-effective medical approaches for the management of H. pylori-associated dyspepsia and the prevention of peptic ulcer in the UK. A ‘test-and-treat’ strategy with UBT has comparable cost-effectiveness outcomes to the current standard of care using FAT in the UK.
Multiple factors predispose patients with cirrhosis to sepsis and/or bacteraemia, which carry a high mortality rate. There are marked differences between geographical regions in the prevalence of infection with multidrug-resistant (MDR) organisms. This study examined risk factors for and outcomes of sepsis/bacteraemia in public hospital admissions with cirrhosis in the state of Queensland, Australia, over the last decade, along with the bacterial pathogens responsible and their antibiotic susceptibility profiles.
A population-based retrospective cohort study of public hospital admissions was conducted from 1 January 2008 to 31 December 2017. Hospital admissions for patients with a diagnosis of cirrhosis were categorised by the presence or absence of sepsis/bacteraemia. Clinical and sociodemographic information, including cirrhosis aetiology, complications and comorbidities, and in-hospital mortality were examined using bivariate and multivariate analyses. In patients with bacteraemia, the type and prevalence of bacteria and antibiotic resistance were assessed.
Sepsis/bacteraemia was present in 3951 of 103 165 hospital admissions with a diagnosis of cirrhosis. Factors associated with sepsis/bacteraemia included disease aetiology, particularly primary sclerosing cholangitis (adj-OR 15.09, 95% CI 12.24 to 18.60), alcohol (adj-OR 2.90, 95% CI 2.71 to 3.09), Charlson Comorbidity Index ≥3 (adj-OR 3.54, 95% CI 3.19 to 3.93) and diabetes (adj-OR 1.87, 95% CI 1.74 to 2.01). Overall case-fatality rate among admissions with sepsis/bacteraemia was 27.7% (95% CI 26.3% to 29.1%) vs 3.7% (95% CI 3.6% to 3.8%) without sepsis/bacteraemia. In-hospital death was significantly associated with sepsis/bacteraemia (adj-OR 6.50, 95% CI 5.95 to 7.11). The most common organisms identified were Escherichia coli and Staphylococcus aureus, present in 22.9% and 18.1%, respectively, of the 2265 admissions with a positive blood culture. The prevalence of MDR bacteria was low (5.6%).
Morbidity and mortality related to sepsis/bacteraemia in patients with cirrhosis remains a critical clinical problem.
Colonoscopy withdrawal time (CWT) is a key performance indicator affecting polyp detection rate (PDR) and adenoma detection rate (ADR). However, studies have shown wide variation in CWT and ADR between different endoscopists. The National Endoscopy Database (NED) was implemented to enable quality assurance in all endoscopy units across the UK and also to reduce variation in practice. We aimed to assess whether CWT changed since the introduction of NED and whether CWT affected PDR.
We used NED to retrospectively collect data on the CWT and PDR of 25 endoscopists who performed colonoscopies (n=4459) across the four quarters of 2019. We then compared these data with their performance in 2016, before NED was in use (n=4324 colonoscopies).
Mean CWT increased from 7.66 min in 2016 to 9.25 min in 2019 (p=0.0001). Mean PDR in the two periods was 29.9% and 28.3% (p=0.64). 72% of endoscopists (18/25) had a CWT >6 min in 2016 versus 100% (25/25) in 2019, and the longer CWT in 2019 positively correlated with PDR (r=0.50, p=0.01). Gastroenterology consultants and trainee endoscopists had longer CWTs than colorectal surgeons both before and after the introduction of NED.
NED usage increased withdrawal times in colonoscopy. Longer withdrawal times were associated with higher PDR. A national colonoscopy audit using data from NED is required to evaluate whether wide variations in practice across endoscopy units in the UK still exist and to ensure minimum colonoscopy quality standards are achieved.
The decision regarding whether to perform a liver biopsy in patients with cirrhosis and clinically suspected autoimmune hepatitis (AIH) remains a challenge. This study aimed to assess the utility and complications of percutaneous liver biopsy in cirrhosis for differentiating AIH from other liver conditions.
A clinicopathological database of patients undergoing percutaneous liver biopsies for suspected AIH (unexplained hepatitis with elevated γ-globulin and autoantibody seropositivity) was reviewed to identify patients presenting with cirrhosis. Biopsy slides were reviewed by an experienced hepatopathologist who was blinded to clinical data.
Of 207 patients who underwent liver biopsy for suspected AIH, 59 (mean age: 59.0±12.0 years, 83.1% female) had a clinical diagnosis of cirrhosis. The mean Child-Turcotte-Pugh score was 6.6±1.6, and 44% of patients had a Child-Turcotte-Pugh score ≥7. According to the revised International AIH Group (IAIHG) criteria, histological assessment combined with clinical information facilitated a diagnosis of AIH or overlap syndrome of AIH and primary biliary cholangitis (PBC) in 81.4% of cases. Liver biopsy identified other aetiologies, including PBC (n=2), non-alcoholic steatohepatitis (n=6) and cryptogenic cirrhosis (n=3). A reliable diagnosis of AIH could be made using the histological category of the simplified criteria in 69.2% and 81.8% of cases with pre-biopsy IAIHG scores of <10 and 10–15, respectively. Three patients with cirrhosis (5.1%) experienced bleeding following biopsy, whereas none of the 148 patients without cirrhosis had a bleeding complication (p=0.022).
Liver biopsy provides important diagnostic information for the management of patients with cirrhosis and suspected AIH, but the procedure is associated with significant risk.
Perianal Crohn’s disease (pCD) is a debilitating complication affecting up to 30% of Crohn’s disease (CD) population, leading to increased morbidity, mortality and decreased quality of life. Despite the growing armamentarium of medications for luminal CD, their efficacy in pCD remains poorly studied.
To determine the efficacy of ustekinumab, a biologic approved for luminal CD, in pCD through a retrospective cohort study and systematic review.
A retrospective cohort study on patients with CD with active perianal fistulae treated with ustekinumab from September 2013 to August 2019 was performed to determine perianal fistula response and remission at 6 and 12 months after ustekinumab induction. A systematic review was performed to further establish rates of fistula response and remission with ustekinumab.
At 6 months, 48.1% (13/27) of patients achieved fistula response with none achieving fistula remission on provider examination, and 59.3% (16/27) achieved patient-reported symptomatic improvement with 3.7% (1/27) achieving symptomatic remission. At 1 year, on provider examination, 55.6% (5/9) had fistula response with none achieving fistula remission, and 100% (9/9) had symptomatic improvement with 22.2% (2/9) achieving symptomatic remission. There were no major safety signals during 1-year follow-up. The systematic review of 25 studies found that 44% (92/209) of patients with active perianal fistulas had a clinical response within 6 months of follow-up, and 53.9% (85/152) of patients with 12 months of follow-up achieved a clinical response.
Ustekinumab is a safe and effective therapy for the treatment of pCD. Prospective, randomised trials are needed to further elucidate the long-term efficacy of ustekinumab for pCD.
The pathogenesis of acute cholangitis (AC) involves biliary obstruction followed by bacterial growth in the bile duct. The leading cause of AC is obstructing gallstones. There have been conflicting theories about the optimal timing of cholecystectomy following AC. The aim of this study was to assess the impact of early cholecystectomy on the 30-day readmission rate, 30-day mortality, 90-day readmission rate and length of hospital stay.
This retrospective study was performed between January 2015 and January 2021 in a high-volume tertiary referral teaching hospital. Included patients were 18 years or older with a definitive diagnosis of acute gallstone cholangitis who underwent endoscopic retrograde cholangiopancreatography (ERCP) with complete clearance of the bile duct as an index procedure. We divided the patients into two groups: patients who underwent ERCP alone and those who underwent ERCP with laparoscopic cholecystectomy (LC) on the same admission (ERCP+LC). Data were extracted from electronic medical records. The primary endpoint of the study was the 30-day readmission rate.
A total of 114 patients with AC met the inclusion criteria of the study. The ERCP+LC group had significantly lower rates of 30-day readmission (2.2% vs 42.6%, p<0.001), 90-day readmission (2.2% vs 30.9%, p<0.001) and 30-day mortality (2.2% vs 16.2%, p=0.017) when compared with the ERCP group. In a multivariate logistic regression analysis, patients in the ERCP+LC group had 90% lower odds of 30-day readmission compared with patients who did not undergo LC during admission (OR=0.1, 95% CI (0.032 to 0.313), p<0.001).
Performing LC during the same admission was associated with a decrease in the 30-day and 90-day readmission rates as well as in 30-day mortality.
It remains controversial whether increased hepatic fat independently contributes to cardiovascular risk. We aimed to assess the association between hepatic fat quantified by MRI and various subclinical vascular disease parameters.
We included two cross-sectional investigations embedded in two independent population-based studies (Study of Health in Pomerania (SHIP): n=1341; Cooperative Health Research in the Region of Augsburg (KORA): n=386). The participants underwent a whole-body MRI examination. Hepatic fat content was quantified by proton-density fat fraction (PDFF). Aortic diameters in both studies and carotid plaque-related parameters in KORA were measured with MRI. In SHIP, carotid intima-media thickness (cIMT) and plaque were assessed by ultrasound. We used (ordered) logistic or linear regression to assess associations between hepatic fat and subclinical vascular disease.
The prevalence of fatty liver disease (FLD) (PDFF >5.6%) was 35% in SHIP and 43% in KORA. In SHIP, hepatic fat was positively associated with ascending (β, 95% CI 0.06 (0.04 to 0.08)), descending (0.05 (0.04 to 0.07)) and infrarenal (0.02 (0.01 to 0.03)) aortic diameters, as well as with higher odds of plaque presence (OR, 95% CI 1.22 (1.05 to 1.42)) and greater cIMT (β, 95% CI 0.01 (0.004 to 0.02)) in the age-adjusted and sex-adjusted model. However, further adjustment for additional cardiometabolic risk factors, particularly body mass index, attenuated these associations. In KORA, no significant associations were found.
The relation between hepatic fat and subclinical vascular disease was not independent of overall adiposity. Given the close relation of FLD with cardiometabolic risk factors, people with FLD should still be prioritised for cardiovascular disease screening.
Previous studies showing an association between chronic use of proton pump inhibitor (PPI) and gastric cancer are limited by confounding by indication. This relationship has not been studied in patients receiving PPI for prophylaxis, such as those undergoing percutaneous coronary intervention (PCI).
This was a retrospective cohort study including 14 hospitals under the Hospital Authority of Hong Kong between 1 January 2004 and 31 December 2017. Participants were patients who underwent first-ever PCI, were not on PPI prescription within 30 days before admission for PCI, had no known malignancy and survived for 365 days after PCI. Propensity score matching was used to balance baseline characteristics and other prescription patterns. The primary outcome was diagnosis of gastric cancer made >365 days after PCI as a time-to-first-event analysis. The secondary outcome was death from gastric cancer.
Among the 13 476 patients (6738 pairs) matched by propensity score, gastric cancer developed in 17 (0.25%) PPI users and 7 (0.10%) PPI non-users after a median follow-up of 7.1 years. PPI users had a higher risk of gastric cancer (HR 3.55; 95% CI 1.46 to 8.66, p=0.005) and death from gastric cancer (HR 4.18; 95% CI 1.09 to 16.08, p=0.037), compared with non-users. The association was duration-dependent and patients who took PPI for ≥365 days were at increased risk.
Chronic use of PPI was significantly associated with increased risk of gastric cancer and death from gastric cancer in patients for whom it was prescribed as prophylaxis. Physicians should judiciously assess the relevant risks and benefits of chronic PPI use before prescription.
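Propensity score matching of the kind used in this study pairs each treated patient with the closest-scoring untreated patient. A minimal greedy 1:1 nearest-neighbour sketch, with entirely hypothetical scores (in the study, scores would come from a model of PPI prescription on baseline characteristics); the real matching procedure and caliper choice may differ:

```python
# Greedy 1:1 nearest-neighbour matching on precomputed propensity scores.
# All scores here are hypothetical illustrations, not study data.

def match_pairs(treated, controls, caliper=0.05):
    """Match each treated score to the closest unused control score
    within the caliper; returns a list of (treated, control) pairs."""
    available = sorted(controls)
    pairs = []
    for t in sorted(treated):
        best = None
        for c in available:
            if best is None or abs(c - t) < abs(best - t):
                best = c
        if best is not None and abs(best - t) <= caliper:
            pairs.append((t, best))
            available.remove(best)  # each control is used at most once
    return pairs

ppi_users = [0.31, 0.42, 0.55]           # hypothetical propensity scores
non_users = [0.30, 0.44, 0.58, 0.90]
print(match_pairs(ppi_users, non_users))
```

Unmatched subjects (here the 0.90 control) are dropped, which is how matched designs trade sample size for covariate balance.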
The COVID-19 pandemic has affected healthcare globally, including transplantation programmes.
We retrospectively studied the impact of COVID-19 on the live liver donor (LLD) programme at a liver transplant centre in Gambat, Pakistan. Standard operating procedures (SOPs), including COVID-19 nasopharyngeal swab PCR, CT scans, use of personal protective equipment and 6-feet distancing, were developed for LLDs and the transplant team to mitigate COVID-19 exposure. We compared complications, healthcare utilisation (hospital stay, readmission) and mortality between two LLD cohorts (before and during the COVID-19 pandemic) from March 2019 to December 2020.
During the study period, 300 LLD surgeries were performed. The number of LLDs increased from 132 (44%) in the pre-COVID era to 168 (56%) during the COVID-19 era. The average number of transplants per month was 10.1 pre-COVID and 14 during the COVID-19 era. No donor developed COVID-19 infection during hospitalisation. Rates of all LLD complications (32 (21.47%) and 49 (29.16%), p=0.43), uneventful discharges (120/168 (71.4%) and 88/132 (66.6%), p<0.05), mean hospital stay (6±2 days and 5±2 days, p=0.17) and readmission (5 (4%) and 3 (1.8%), p=0.43) were similar between the pre-COVID and COVID-19 eras.
With the implementation of mindful SOPs, the rate of LLDs increased without any case of COVID-19 infection. Our SOPs were helpful in continuing the LLD programme in a developing country during the COVID-19 pandemic.
Low physical activity and low liquid intake are factors that have been associated with constipation. The health emergency brought on by the COVID-19 pandemic resulted in behavioural changes, such as sheltering in place (reduced mobility) and dietary changes, creating a scenario we believe to be an adequate model for examining the appearance of constipation symptoms and their associated factors.
A cross-sectional and descriptive study was conducted on an open population, applying an electronic survey (4 weeks after lockdown due to COVID-19 in Mexico) to evaluate demographic characteristics, physical activity, water and fibre intake, appearance of constipation symptoms (including stool consistency), and quality of life.
Out of 678 subjects evaluated, 170 (25%, 95% CI: 21.7 to 28.4) developed symptoms of ‘new-onset’ constipation, with a significant decrease in the number of daily bowel movements (p<0.05) and stool consistency (p<0.05) during lockdown. Furthermore, in the ‘new-onset’ constipation population there was a higher proportion of subjects (79 (47%) of 170) who stopped exercising during the pandemic compared with the subjects who did not develop constipation symptoms (187 (37%) of 508, p=0.03, OR: 1.49, 95% CI: 1.0 to 2.1). The multivariate analysis (logistic regression) showed that female sex (p=0.001), water intake (p=0.039), and physical activity (p=0.012) were associated with ‘new-onset’ constipation.
In our study on an open population in Mexico, we found that one-fourth of the population developed ‘new-onset’ constipation symptoms during the lockdown imposed due to the COVID-19 pandemic. A reduction of physical activity and less water consumption were associated factors.
Adequate bowel preparation is a prerequisite for effective colonoscopy. Split bowel preparation results in optimal cleansing. This study assessed the bowel preparation regimes advised by endoscopy units across the UK, and correlated the differences with outcomes.
Trusts in the UK were surveyed, with data requested for January 2018 to January 2019, including: the type and timing of preparation, pre-endoscopy diet, adequacy rates and polyp detection. Trusts were grouped according to the timing of bowel preparation. The χ² test was used to assess for differences in bowel preparation adequacy.
Moviprep was the first-line bowel preparation in 79% of trusts. Only 7% of trusts advised splitting bowel preparation for all procedures; however, 91% used split bowel preparation for afternoon procedures. Trusts that split preparation for all procedures had an inadequacy rate of 6.7%, compared with 8.5% (p<0.001) for those that split preparation for PM procedures alone and 9.5% (p<0.001) for those that provided day-before preparation for all procedures. Morning procedures with day-before preparation had a higher rate of inadequate cleansing than afternoon procedures with split preparation (7.7% vs 6.5%, p<0.001). The polyp detection rate for procedures with adequate preparation was 37.1%, compared with 26.4% for those with inadequate preparation.
Most trusts in the UK do not provide instructions optimising the timing of bowel preparation prior to colonoscopy. This correlated with an increased rate of inadequate cleansing. Splitting bowel preparation is likely to reduce the impacts of poor cleansing: missed lesions, repeat colonoscopies and significant costs.
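The adequacy comparisons above rest on the χ² test named in the methods. A minimal sketch of the Pearson χ² statistic for a 2×2 table, using hypothetical counts rather than the survey's actual figures:

```python
# Pearson's chi-squared statistic for a 2x2 table of bowel preparation
# adequacy in two groups. The counts below are hypothetical.

def chi_squared_2x2(table):
    """table = [[a, b], [c, d]]; returns the Pearson chi-squared statistic."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        expected = row * col / n            # expected count under independence
        stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical counts: [adequate, inadequate] for split-prep vs day-before prep.
table = [[933, 67], [905, 95]]
print(round(chi_squared_2x2(table), 2))
```

The statistic is compared against the χ² distribution with 1 degree of freedom; values above 3.84 correspond to p<0.05.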
In late 2019, a new coronavirus called SARS-CoV-2 emerged and changed the course of civilisation. Our study aims to analyse the association between acute liver failure (ALF) and mortality in patients infected with COVID-19. A retrospective analysis of 864 COVID-19-infected patients admitted to Nassau University Medical Center in New York was performed.
ALF was identified by acute liver injury (elevated liver enzymes), hepatic encephalopathy and an international normalised ratio of 1.5 or greater. These parameters were assessed via daily blood work and clinical assessment. A multivariate logistic regression model predicting mortality and controlling for confounders (age, coronary artery disease, intubation, hypertension, diabetes mellitus and acute kidney injury) was used to determine the association of ALF with mortality.
A total of 624 of the initial 864 patients met the inclusion criteria of acute hepatitis and COVID-19 infection. Of those 624, 43 (6.9%) developed ALF during their hospitalisation, and their mortality rate was 74.4%. The majority of patients with ALF were male (60.6%). The logistic model predicting death and controlling for confounders showed that COVID-19 patients with ALF had nearly four-fold higher odds of death than those without ALF (p=0.0063).
Findings from this study suggest that there is a significant association between mortality and the presence of ALF in patients infected with COVID-19. Further investigation into patients with COVID-19 and ALF can lead to enhanced treatment regimens and risk stratification tools, which can ultimately improve mortality rates during these arduous times.
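An unadjusted odds ratio of death with versus without ALF can be computed directly from a 2×2 table. The counts below are hypothetical (only 32/43 deaths follows from the reported 74.4% mortality; the survivor split is invented), and the study's four-fold estimate was adjusted for confounders, so this sketch illustrates the calculation only:

```python
import math

def odds_ratio(a, b, c, d):
    """2x2 table: a = ALF & died, b = ALF & survived,
    c = no ALF & died, d = no ALF & survived.
    Returns the OR with a Wald 95% CI on the log-odds scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Hypothetical counts: 32 of 43 ALF patients died (74.4%);
# the non-ALF counts are invented for illustration.
print(odds_ratio(32, 11, 180, 401))
```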
COVID-19 continues to pose a significant healthcare challenge throughout the world. Comorbidities including diabetes and hypertension are associated with a significantly higher mortality risk. However, the effect of cirrhosis on COVID-19 outcomes has yet to be systematically assessed.
To assess the reported clinical outcomes of patients with cirrhosis who develop COVID-19 infection.
PubMed and EMBASE databases were searched for studies published up to 3 February 2021. All English-language primary research articles that reported clinical outcomes in patients with cirrhosis and COVID-19 were included. The study was conducted and reported in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines. The risk of bias was assessed using the Quality In Prognosis Studies (QUIPS) risk-of-bias assessment instrument for prognostic factor studies. Meta-analysis was performed using Cochrane RevMan V.5.4 software with a random-effects model.
63 studies were identified reporting clinical outcomes in patients with cirrhosis and concomitant COVID-19. Meta-analysis of cohort studies that reported a non-cirrhotic comparator yielded a pooled mortality OR of 2.48 (95% CI 2.02 to 3.04). Analysis of a subgroup of studies reporting ORs for mortality in hospitalised patients, adjusted for significant confounders, found a pooled adjusted OR of 1.81 (95% CI 1.36 to 2.42).
Cirrhosis is associated with an increased risk of all-cause mortality in COVID-19 infection compared with non-cirrhotic patients. Patients with cirrhosis should be considered for targeted public health interventions to prevent COVID-19 infection, such as shielding and prioritisation of vaccination.
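Random-effects pooling of study ORs, as the review performed in RevMan, can be sketched with the DerSimonian-Laird estimator: study standard errors are recovered from the CI widths, between-study variance is estimated from Cochran's Q, and studies are reweighted accordingly. The ORs and CIs below are hypothetical, not those of the included studies:

```python
import math

# DerSimonian-Laird random-effects pooling of log odds ratios.
# Study ORs and 95% CIs below are hypothetical illustrations.

def pool_random_effects(ors, cis):
    """ors: list of study ORs; cis: list of (low, high) 95% CIs.
    Returns the pooled OR."""
    ys = [math.log(o) for o in ors]
    # Standard errors recovered from the 95% CI width on the log scale.
    ses = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for lo, hi in cis]
    ws = [1 / se ** 2 for se in ses]            # fixed-effect weights
    ybar = sum(w * y for w, y in zip(ws, ys)) / sum(ws)
    q = sum(w * (y - ybar) ** 2 for w, y in zip(ws, ys))   # Cochran's Q
    c = sum(ws) - sum(w ** 2 for w in ws) / sum(ws)
    tau2 = max(0.0, (q - (len(ys) - 1)) / c) if c > 0 else 0.0
    wstar = [1 / (se ** 2 + tau2) for se in ses]  # random-effects weights
    pooled = sum(w * y for w, y in zip(wstar, ys)) / sum(wstar)
    return math.exp(pooled)

print(pool_random_effects([2.0, 3.0, 2.5],
                          [(1.5, 2.7), (2.0, 4.5), (1.6, 3.9)]))
```

When between-study heterogeneity (tau²) is zero, the estimator reduces to the fixed-effect inverse-variance pooled OR.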
The rapid growth of the probiotic industry suggests patients will continue to seek advice from gastroenterologists about probiotics. To best address patient questions and concerns, we must first understand who uses probiotics and why.
This was a cross-sectional study conducted in the endoscopy suite of an academic hospital from June to October of 2019. Surveys were anonymous and contained a combination of multiple choice, free text and Likert scale questions. Participants privately completed a paper survey in English or Spanish and the results were reviewed with them by study personnel to clarify responses. Descriptive statistics were generated and multivariable logistic regression modelling was used to compare characteristics of probiotic users versus non-users.
During the 5-month study period, 600 patients were approached and 537 (90%) agreed to participate. Among participants, 89% completed at least 24 survey items and were included in the analysis. Overall, 27% of patients reported probiotic use. Bloating, rather than diarrhoea, was the main gastrointestinal symptom associated with use of probiotics (aOR 2.59, 95% CI 1.52 to 4.44 for bloating; aOR 1.03, 95% CI 0.55 to 1.94 for diarrhoea). Frequent reasons cited for taking probiotics were the beliefs that they improved overall health and longevity (54%) and that they improved gastrointestinal symptoms (45%).
Probiotic use is common among general gastroenterology patients, many of whom believe that probiotics confer general rather than specific gastrointestinal health benefits. Symptoms—especially bloating—and not sociodemographic factors seem to motivate probiotic use. By understanding patient expectations for probiotics, clinicians can better advise them.
Exclusive enteral nutrition (EEN) is a potentially effective but underused therapy for Crohn’s disease (CD) in adults. It is first-line induction treatment for paediatric patients but remains a second-line or third-line therapy in adults.
To analyse the evidence for EEN in adult patients with CD, and summarise this in a narrative review.
In April/May 2020 and July 2021, a literature search was performed in PubMed, Scopus and Cochrane using the Medical Subject Headings (MeSH) terms: ‘Crohn’s disease’, ‘CD’, ‘inflammatory bowel disease’, ‘IBD’, ‘exclusive enteral nutrition’, ‘enteral nutrition’ and ‘EEN’. Additional studies were obtained from the references of retrieved articles as well as general reading. Studies of adult patients with CD treated with EEN were selected; 79 relevant articles were found. Where data in adults were lacking, data from paediatric studies were extrapolated with care.
EEN in adult patients has been shown to improve clinical, biomarker, endoscopic and radiological measures of disease activity. EEN avoids the potential adverse effects of recurrent corticosteroid induction, such as metabolic derangements and opportunistic infections. EEN has also demonstrated benefits among adult patients with fistulising and stricturing CD, and may avoid surgery in such patients. Preoperative EEN has also been shown to reduce postoperative complications and recurrence. There appear to be benefits in combining EEN with anti-tumour necrosis factor agents; however, the benefits of combination therapy with other biologics are less clear. A major drawback of EEN therapy in adults has been poor compliance; more palatable polymeric formulations, improved patient education and dietitian support may overcome this. Evidence in adults is limited to small studies, often with suboptimal control arms and a lack of blinding. Larger studies with improved design are needed to confirm these beneficial effects.
Despite the limitations of the evidence, EEN should be considered in the treatment of adults with CD.
Capsule endoscopy (CE) is pivotal for evaluation of small bowel disease. Obscure gastrointestinal bleeding most often originates from the small bowel. CE frequently identifies a wide range of lesions with different bleeding potentials in these patients. However, reading CE examinations is a time-consuming task. Convolutional neural networks (CNNs) are highly efficient artificial intelligence tools for image analysis. This study aims to develop a CNN-based model for identification and differentiation of multiple small bowel lesions with distinct haemorrhagic potential using CE images.
We developed, trained and validated a denary CNN based on CE images. Each frame was labelled according to the type of lesion (lymphangiectasia, xanthomas, ulcers, erosions, vascular lesions, protruding lesions and blood). The haemorrhagic potential was assessed using Saurin’s classification. The entire dataset was divided into training and validation sets. The performance of the CNN was measured by the area under the receiver operating characteristic curve, sensitivity, specificity, positive predictive value (PPV) and negative predictive value (NPV).
A total of 53 555 CE images were included. The model had an overall accuracy of 99%, a sensitivity of 88%, a specificity of 99%, a PPV of 87%, and an NPV of 99% for detection of multiple small bowel abnormalities and respective classification of bleeding potential.
We developed and tested a CNN-based model for automatic detection of multiple types of small bowel lesions and classification of the respective bleeding potential. This system may improve the diagnostic yield of CE for these lesions and overall CE efficiency.
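The accuracy, sensitivity, specificity, PPV and NPV reported for the CNN each derive from a confusion matrix. A minimal one-vs-rest sketch with hypothetical counts (not the study's data) for a single lesion class:

```python
# Per-class performance measures from a binary confusion matrix.
# Counts below are hypothetical, for one lesion class vs all others.

def metrics(tp, fp, fn, tn):
    """Return accuracy, sensitivity, specificity, PPV and NPV."""
    return {
        "accuracy":    (tp + tn) / (tp + fp + fn + tn),
        "sensitivity": tp / (tp + fn),   # recall for the lesion class
        "specificity": tn / (tn + fp),
        "ppv":         tp / (tp + fp),   # precision
        "npv":         tn / (tn + fn),
    }

# Hypothetical frame counts for one class (e.g. vascular lesions).
m = metrics(tp=88, fp=13, fn=12, tn=887)
print({k: round(v, 2) for k, v in m.items()})
```

With rare lesions, accuracy and NPV are dominated by the large true-negative count, which is why a model can report 99% accuracy alongside a lower sensitivity and PPV, as here.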
To our knowledge, the current report is the first description of enteral budesonide treatment of duodenitis in a patient with COVID-19 infection, and warrants further investigation into whether budesonide might constitute a novel therapeutic strategy for the management of COVID-19-related intestinal mucosal damage.
Patients with primary biliary cholangitis (PBC) have an impaired health-related quality of life (HRQoL). Practice guidelines recommend evaluating the HRQoL in all patients with PBC. The aim of this study was to assess the reliability and validity of our Dutch translation of the PBC-40, a PBC-specific measure of the HRQoL.
The PBC-40 was translated into Dutch following standardised forward–backward procedures. Participants received the Dutch PBC-40 and the RAND-36 (a validated Dutch version of the 36-Item Short Form Health Survey) through postal mail. The PBC-27 is an abridged version of the PBC-40. Internal consistency between the items within the PBC-40/PBC-27 domains was assessed by Cronbach’s alpha. In addition, score distributions were analysed on floor and ceiling effects. Construct validity was assessed by hypotheses testing using Pearson’s correlation between the PBC-40/PBC-27 domains and RAND-36 scales.
177 patients with PBC were included. The mean age was 61.1 (SD 9.9) years and the majority of patients were female (n=164, 92.7%). Of the 7080 PBC-40 items, 61 (0.9%) were missing and 342 (4.8%) were answered with the ‘does not apply’ option. Each PBC-40 domain had a Cronbach’s α of >0.70, the highest being in the fatigue domain (0.95). For the PBC-27, the lowest Cronbach’s α was 0.69. Floor effects were present in three domains (cognition 19.3%, itch 27.0% and social 25.0% (PBC-27 only)). No ceiling effects were observed. All domains were significantly correlated with the corresponding RAND-36 scale(s) (p<0.001 for all). The strongest correlation was between the PBC-40 fatigue domain and the RAND-36 vitality scale (r=–0.834).
Our findings demonstrate the reliability and validity of the Dutch PBC-40 and PBC-27 for the assessment of the HRQoL in patients with PBC. This PBC-specific measure can be used in Dutch-speaking patients with PBC for both research and clinical purposes.
Clostridioides difficile infection (CDI) is a major cause of healthcare-associated diarrhoea with high mortality. There is a lack of validated predictors for severe outcomes in CDI. The aim of this study was to derive and validate a clinical prediction tool for CDI in-hospital mortality using a large critical care database.
The demographics, clinical parameters, laboratory results and mortality of CDI were extracted from the Medical Information Mart for Intensive Care-III (MIMIC-III) database. We subsequently trained three machine learning models: logistic regression (LR), random forest (RF) and gradient boosting machine (GBM) to predict in-hospital mortality. The individual performances of the models were compared against current severity scores (Clostridioides difficile Associated Risk of Death Score (CARDS) and ATLAS (Age, Treatment with systemic antibiotics, Leukocyte count, Albumin and Serum creatinine as a measure of renal function)) by calculating the area under the receiver operating characteristic curve (AUROC). We identified factors associated with higher mortality risk in each model.
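The AUROC used to compare the models can be read as the probability that a randomly chosen death is assigned a higher predicted risk than a randomly chosen survivor. A minimal sketch of that Mann-Whitney formulation, with hypothetical labels and scores rather than MIMIC-III data:

```python
# AUROC via pairwise comparison: fraction of (death, survivor) pairs in which
# the death has the higher predicted risk; ties count 0.5.

def auroc(y_true, y_score):
    pos = [s for y, s in zip(y_true, y_score) if y == 1]
    neg = [s for y, s in zip(y_true, y_score) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical labels (1 = in-hospital death) and model risk scores
y = [1, 0, 1, 0, 0, 1, 0]
scores = [0.9, 0.3, 0.6, 0.4, 0.2, 0.8, 0.6]
print(auroc(y, scores))
```

An AUROC of 0.5 corresponds to chance; the gap between the machine learning models (0.69-0.72) and CARDS (0.57) is a gap in exactly this pairwise discrimination.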
From 61 532 intensive care unit stays in the MIMIC-III database, there were 1315 CDI cases. The mortality rate for CDI in the study cohort was 18.33%. AUROC was 0.69 (95% CI, 0.60 to 0.76) for LR, 0.71 (95% CI, 0.62 to 0.77) for RF and 0.72 (95% CI, 0.64 to 0.78) for GBM, compared with 0.57 (95% CI, 0.51 to 0.65) for CARDS and 0.63 (95% CI, 0.54 to 0.70) for ATLAS. Albumin, lactate and bicarbonate were significant mortality factors in all the models. Free calcium, potassium, white blood cell count, urea, platelet count and mean blood pressure were present in at least two of the three models.
Our machine learning-derived CDI in-hospital mortality prediction model identified pertinent factors that can assist critical care clinicians in identifying patients at high risk of dying from CDI.
Gastroenteropancreatic neuroendocrine tumours (GEP-NETs) encompass a diverse group of neoplasms that vary in their secretory products and in their location within the gastrointestinal tract. Their prevalence in the USA is increasing among all adult age groups.
To identify the possible derivation of GEP-NETs using genome-wide analyses to distinguish small intestinal neuroendocrine tumours, specifically duodenal gastrinomas (DGASTs), from pancreatic neuroendocrine tumours.
Whole exome sequencing and RNA-sequencing were performed on surgically resected GEP-NETs (discovery cohort). RNA transcript profiles available in the Gene Expression Omnibus were analysed using R integrated software (validation cohort). Digital spatial profiling (DSP) was used to analyse paraffin-embedded GEP-NETs. Human duodenal organoids were treated with 5 or 10 ng/mL of tumour necrosis factor alpha (TNFα) prior to qPCR and western blot analysis of neuroendocrine cell specification genes.
In both the discovery and validation cohorts, small intestinal neuroendocrine tumours showed induced expression of mesenchymal and calcium signalling pathways coincident with a decrease in intestine-specific genes. In particular, calcium-related, smooth muscle and cytoskeletal genes were increased in DGASTs, but this did not correlate with MEN1 mutation status. Interleukin 17 (IL-17) and TNFα signalling pathways were elevated in the DGAST RNA-sequencing data. However, DSP analysis confirmed a paucity of immune cells in DGASTs compared with the adjacent tumour-associated Brunner’s glands. Immunofluorescence analysis showed production of these proinflammatory cytokines and phosphorylated signal transducer and activator of transcription 3 (pSTAT3) by the tumours and stroma. Treatment of human duodenal organoids with TNFα induced the neuroendocrine tumour genes SYP, CHGA and NKX6.3.
Stromal–epithelial interactions induce proinflammatory cytokines that promote Brunner’s gland reprogramming.
Lung complications occur in 0.5% of the millions of blind feeding tube placements, representing a major health burden. Use of a Kangaroo feeding tube with an ‘integrated real-time imaging system’ (‘IRIS’ tube) may pre-empt such complications. We aimed to produce a preliminary operator guide to IRIS tube placement and interpretation of position.
In a single centre, IRIS tubes were prospectively placed in intensive care unit patients. Characteristics of tube placement and visualised anatomy were recorded in each organ to produce a guide.
Of 45 patients undergoing a single tube placement, 3 placements were aborted owing to patient refusal (n=1) or inability to enter the oesophagus (n=2). Of 43 tubes placed beyond 30 cm, 12 (28%) initially entered the respiratory tract but all were withdrawn before reaching the main carina. We identified anatomical markers for the nasal or oral cavity (97.8%), respiratory tract (100%), oesophagus (97.6%), stomach (100%) and intestine (100%). Organ differentiation was possible in 100%: trachea-oesophagus, oesophagus-stomach and stomach-intestine. Gastric tube position was confirmed by aspiration of fluid with a pH
By permitting real-time confirmation of tube position, direct vision may reduce risk of lung complications. The preliminary operator guide requires validation in larger studies.
During COVID-19 pandemic, the safety of medical therapies for inflammatory bowel disease (IBD) in relation to COVID-19 has emerged as an area of concern. This study aimed to evaluate the association between IBD therapies and severe COVID-19 outcomes.
We performed a systematic review and meta-analysis of all published studies from December 2019 to August 2021 to identify studies that reported severe COVID-19 outcomes in patients on current IBD therapies including 5-aminosalicylic acid (5-ASA), immunomodulators, corticosteroids, biologics, combination therapy, or tofacitinib.
Twenty-two studies were identified. Corticosteroids (risk ratio (RR) 1.91 (95% CI 1.25 to 2.91, p=0.003)) and 5-ASA (RR 1.50 (95% CI 1.17 to 1.93, p=0.001)) were associated with increased risk of severe COVID-19 outcomes in patients with IBD. However, possible confounders for 5-ASA use were not controlled for. Sub-analysis showed that corticosteroids increased the risk of intensive care unit (ICU) admission but not mortality. Immunomodulators alone (RR 1.18 (95% CI 0.87 to 1.59, p=0.28)) or in combination with anti-TNFs (RR 0.96 (95% CI 0.80 to 1.15, p=0.63)), tofacitinib (RR 0.81 (95% CI 0.49 to 1.33, p=0.40)) and vedolizumab (RR 1.02 (95% CI 0.79 to 1.31, p=0.89)) were not associated with severe disease. Anti-TNFs (RR 0.47 (95% CI 0.40 to 0.54, p<0.00001)) and ustekinumab (RR 0.55 (95% CI 0.43 to 0.72, p<0.00001)) were associated with decreased risk of severe COVID-19.
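The per-study quantity pooled in a meta-analysis of this kind is a risk ratio from a 2×2 table, with its confidence interval constructed on the log scale. A minimal sketch with hypothetical counts (not drawn from any of the included studies):

```python
# Risk ratio from a 2x2 table: a/b = exposed with/without the outcome,
# c/d = unexposed with/without the outcome. The 95% CI uses the standard
# error of log(RR) and exponentiates back.
import math

def risk_ratio(a, b, c, d):
    rr = (a / (a + b)) / (c / (c + d))
    se = math.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo, hi = (math.exp(math.log(rr) + z * se) for z in (-1.96, 1.96))
    return rr, lo, hi

# Hypothetical: 30/100 exposed vs 20/200 unexposed had severe outcomes
rr, lo, hi = risk_ratio(30, 70, 20, 180)
print(round(rr, 2), round(lo, 2), round(hi, 2))
```

A CI excluding 1.0, as for corticosteroids (1.25 to 2.91) or anti-TNFs (0.40 to 0.54) above, is what distinguishes the significant associations from the null ones.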
In patients with IBD, the risk of severe COVID-19 is higher among patients receiving corticosteroids. Corticosteroid use was associated with ICU admission but not mortality. The risk is also higher among patients receiving 5-ASAs. However, patient-level data were lacking and insufficient data existed for meta-regression analyses to adjust for confounding. Vedolizumab, tofacitinib, and immunomodulators alone or in combination with anti-TNF were not associated with severe disease. Anti-TNFs and ustekinumab were associated with favourable outcomes.
Limited literature has examined the epidemiology of non-alcoholic fatty liver disease (NAFLD) and fibrosis among young adults in Egypt, a country with one of the highest obesity rates globally. We assessed the prevalence of steatosis and fibrosis among college students in Egypt.
In this cross-sectional study, we recruited students unaware of having fatty liver via a call-for-participation at a private university in the Dakahlia governorate of Egypt. Primary outcomes were the prevalence of steatosis as determined by the controlled attenuation parameter component of transient elastography and fibrosis as determined by the liver stiffness measurement component of transient elastography. Secondary outcomes were clinical parameters and socioeconomic factors associated with the presence and severity of steatosis and fibrosis.
Of 132 participants evaluated for the study, 120 (91%) were included (median (IQR) age, 20 (19–21) years; 65 (54.2%) female). A total of 38 participants (31.6%) had steatosis, among whom 22 (57.9%) had S3 (severe) steatosis. There was a higher risk for steatosis in persons with overweight (adjusted OR 9.67, 95% CI 2.94 to 31.7, p<0.0001) and obesity (adjusted OR 13.87, 95% CI 4.41 to 43.6, p<0.0001) compared with lean persons. Moreover, a higher level of parental education was associated with higher steatosis stages (S1–S3). Six (5%) participants had transient elastography values equivalent to F2–F3 fibrosis (four with F2 fibrosis (≥7.9 kPa), and two with F3 fibrosis (≥8.8 kPa)).
In this cohort of college students in Egypt, around 1 in 3 had steatosis, and 1 in 20 had moderate-to-advanced fibrosis, an established risk factor for hepatic and extrahepatic morbidity and mortality. These data underscore the urgency to address the silent epidemic of NAFLD among young adults in the Middle East-North Africa region.
Patients with Crohn’s disease (CD) may develop fibrostenotic strictures. No currently available therapies prevent or treat fibrostenotic CD (FCD), making this a critical unmet need.
To compare health outcomes and resource utilisation between CD patients with and without fibrostenotic disease.
Patients aged ≥18 years with FCD and non-FCD between 30 October 2015 and 30 September 2018 were identified in the Truven MarketScan Commercial Claims and Encounters Database. We conducted 1:3 nearest neighbour propensity score matching on age, sex, malnutrition, payer type, anti-tumour necrosis factor use, and Charlson Comorbidity Index score. Primary outcomes up to 1 year from the index claim were ≥1 hospitalisation, ≥1 procedure, ≥1 surgery, and steroid dependency (>100 day supply). Associations between FCD diagnosis and outcomes were estimated with a multivariable logistic regression model. This study was exempt from institutional review board approval.
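The 1:3 nearest-neighbour matching described above pairs each FCD patient with the three non-FCD patients whose estimated propensity scores are closest. As a minimal sketch, greedy matching without replacement on hypothetical, pre-estimated propensity scores (the study's actual scores came from the covariates listed):

```python
# Greedy 1:3 nearest-neighbour propensity score matching without replacement.
# treated/controls map patient id -> propensity score; matched controls are
# removed from the pool so they cannot be reused.

def match_1_to_k(treated, controls, k=3):
    pool = dict(controls)
    matches = {}
    for tid, ps in sorted(treated.items(), key=lambda kv: kv[1]):
        nearest = sorted(pool, key=lambda cid: abs(pool[cid] - ps))[:k]
        matches[tid] = nearest
        for cid in nearest:
            del pool[cid]
    return matches

# Hypothetical scores for 2 FCD and 8 non-FCD patients
fcd = {"t1": 0.40, "t2": 0.70}
non_fcd = {"c1": 0.38, "c2": 0.41, "c3": 0.55, "c4": 0.68,
           "c5": 0.72, "c6": 0.74, "c7": 0.10, "c8": 0.43}
print(match_1_to_k(fcd, non_fcd))
```

Real implementations typically also enforce a calliper (a maximum allowed score distance) so that poorly matched controls are discarded rather than paired.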
Propensity score matching yielded 11 022 patients. Compared with patients without FCD, those with FCD had an increased likelihood of hospitalisations (17.1% vs 52.4%; p<0.001), endoscopic procedures (4.4% vs 8.6%; p<0.001), IBD-related surgeries (4.7% vs 9.1%; p<0.001), steroid dependency (10.0% vs 15.7%; p<0.001), and greater mean annual costs per patient ($47 575 vs $77 609; p<0.001). FCD was a significant risk factor for ≥1 hospitalisation (adjusted OR (aOR), 6.1), ≥1 procedure (aOR, 2.1), ≥1 surgery (aOR, 2.0), and steroid dependency (aOR, 1.7).
FCD was associated with higher risk for hospitalisation, procedures, abdominal surgery, and steroid dependency. Patients with FCD had a greater mean annual cost per patient. FCD represents an ongoing unmet medical need.
Prevalent type 2 diabetes (T2D) is associated with an increased risk of colorectal cancer and could impair the quality of bowel preparation for colonoscopy. This may in turn increase the risk of overlooked precancerous polyps and subsequent risk of post-colonoscopy colorectal cancer (PCCRC). We investigated whether patients with T2D are at increased risk of PCCRC compared with patients without T2D.
We conducted a population-based cohort study of patients with T2D and without T2D undergoing colonoscopy in Denmark (1995–2015). We investigated the risk of PCCRC by calculating >6 to 36 months cumulative incidence proportions (CIPs), treating death and colectomy as competing risks. Using Cox proportional-hazards regression analyses, we also computed HRs of PCCRC, comparing patients with and without T2D. According to the World Endoscopy Organization guidelines, we calculated PCCRC 3-year rates to estimate the proportions of patients with CRC, with and without T2D, experiencing PCCRC.
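Treating death and colectomy as competing risks means the CIP cannot be computed as one minus a Kaplan-Meier curve; instead each PCCRC event contributes mass weighted by the probability of still being event-free. A minimal sketch of that (Aalen-Johansen-style) estimator on hypothetical follow-up data, with time in months:

```python
# Cumulative incidence of cause 1 (PCCRC) with competing risks.
# Each record is (time, event): event 0 = censored, 1 = PCCRC,
# 2 = competing event (death or colectomy).

def cumulative_incidence(records, t, cause=1):
    times = sorted({tm for tm, ev in records if ev != 0 and tm <= t})
    surv, cif = 1.0, 0.0                          # event-free survival, CIF
    for tm in times:
        at_risk = sum(1 for r_tm, _ in records if r_tm >= tm)
        d_cause = sum(1 for r_tm, ev in records if r_tm == tm and ev == cause)
        d_any = sum(1 for r_tm, ev in records if r_tm == tm and ev != 0)
        cif += surv * d_cause / at_risk           # mass assigned to the cause
        surv *= 1 - d_any / at_risk               # update overall survival
    return cif

# Hypothetical cohort of 6 patients followed to 36 months
data = [(8, 1), (12, 2), (15, 0), (20, 1), (30, 0), (36, 2)]
print(round(cumulative_incidence(data, 36), 3))
```

Censoring the competing events instead would overstate the CIP, because patients who die or undergo colectomy can no longer develop PCCRC.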
We identified 29 031 patients with T2D and 333 232 patients without T2D undergoing colonoscopy. We observed 250 PCCRCs among patients with T2D and 1658 PCCRCs among patients without T2D. The >6 to 36 months CIP after a first-time colonoscopy was 0.64% (95% CI 0.55% to 0.74%) for T2D and 0.36% (95% CI 0.34% to 0.38%) for patients without T2D. The HRs of PCCRC were 1.43 (95% CI 1.21 to 1.72) after a first-time colonoscopy and 1.18 (95% CI 0.75 to 1.85) after a second-time colonoscopy. The PCCRC 3-year rate was 7.9% for patients with T2D and 7.4% for patients without T2D.
T2D may be associated with an increased risk of PCCRC.
Infliximab is an efficacious therapy for inflammatory bowel disease and may play a role in management of some extraintestinal manifestations. While higher trough levels of infliximab are associated with higher rates of disease remission, the association between trough levels of infliximab and arthralgia activity characterised as an extraintestinal manifestation has yet to be defined.
We aimed to assess the association between serum trough levels of infliximab and peripheral arthralgia activity in patients with inflammatory bowel disease.
In this cross-sectional study, we identified patients with inflammatory bowel disease on infliximab therapy with known history of arthralgias attributed to an extraintestinal manifestation. Collected variables included disease phenotype, medications (such as thiopurines or methotrexate), Harvey Bradshaw Index, partial Mayo score, C reactive protein, trough levels of infliximab and anti-infliximab antibodies. The primary outcome was active patient-reported arthralgia.
Out of 267 patients included, 65 (24.4%) had active arthralgias at the time the trough level of infliximab was measured. No significant differences in trough levels were seen between those patients with and without arthralgias. Patients on combination therapy with methotrexate or thiopurines or those with detectable anti-infliximab antibodies were not more likely to have inactive arthralgias (OR 0.99, 95% CI 0.57 to 1.74, p=0.99 and OR 1.94, 95% CI 0.9 to 4.1, p=0.09, respectively).
This study suggests that although therapeutic drug monitoring of infliximab can have a role in the management of Crohn’s disease and ulcerative colitis, it does not seem to be useful in managing arthralgias associated with inflammatory bowel disease.
Poor sleep is common in inflammatory bowel disease (IBD), associated with worse overall disease course and predominantly attributable to insomnia. While cognitive–behavioural therapy for insomnia (CBT-I) is the recommended first-line treatment for chronic insomnia, it is untested in IBD. It is unclear if CBT-I will be as effective in this group given the extent of night-time symptoms people with IBD experience. Thus, we evaluated the feasibility and preliminary efficacy of CBT-I in IBD.
We comprehensively assessed sleep in people with mild-to-moderately active IBD using questionnaires, daily diaries and actigraphy. People with significant insomnia symptoms were allocated to a single-arm, uncontrolled pilot feasibility study of gold-standard CBT-I treatment. They were then reassessed post-treatment.
20 participants with IBD completed a baseline assessment. 10 were experiencing insomnia and were allocated to CBT-I. All participants who were offered CBT-I elected to complete it, and all completed 5/5 sessions. Participants rated treatment acceptability highly and daily diary and actigraphy completion rates were
CBT-I was feasible and acceptable and demonstrated a signal for efficacy in the treatment of insomnia in IBD. Importantly, the improvements in sleep continuity were consistent with the extant literature. Future fully powered randomised controlled studies should evaluate whether treatment of insomnia can improve other aspects of IBD, including pain and inflammation.
The COVID-19 pandemic significantly impacted the provision of oesophageal physiology investigations. During the recovery phase, triaging tools were empirically recommended by national bodies for prioritisation of referrals amidst rising waiting lists and reduced capacity. We evaluated the performance of an enhanced triage process (ETP) consisting of telephone triage combined with the hierarchical ‘traffic light system’ recommended in the UK for prioritising oesophageal physiology referrals.
In a cross-sectional study of patients referred for oesophageal physiology studies at a tertiary centre, data were compared between patients who underwent oesophageal physiology studies 6 months prior to the COVID-19 pandemic and those who were investigated within 6 months after service resumption with implementation of the ETP.
Adjusted time from referral to investigation; non-attendance rates; the detection of Chicago Classification (CC) oesophageal motility disorders on oesophageal manometry and severity of acid reflux on 24 hours pH/impedance monitoring.
Following service resumption, the ETP reduced non-attendance rates from 9.1% to 2.8% (p=0.021). Use of the ‘traffic light system’ identified a higher proportion of patients with CC oesophageal motility disorders in the ‘amber’ and ‘red’ triage categories, compared with the ‘green’ category (p=0.011). ETP also reduced the time to test for those who were subsequently found to have a major CC oesophageal motility diagnosis compared with those with minor CC disorders and normal motility (p=0.004). The ETP did not affect the yield or timing of acid reflux studies.
ETPs can effectively prioritise patients with oesophageal motility disorders and may therefore have a role beyond the current pandemic.
Indigo naturalis (IN) is a herbal medicine that has been used for ulcerative colitis with an unclear mechanism of action. Indigo and indirubin, its main constituents, are ligands of the aryl hydrocarbon receptor (AhR). We assessed the safety, efficacy and colon AhR activity of IN given orally to patients with treatment-refractory ulcerative colitis. The role of AhR in the benefit of IN was further evaluated with an AhR antagonist in a murine colitis model.
This open-label, dose-escalation study sequentially treated 11 patients with ulcerative colitis with either IN 500 mg/day or 1.5 g/day for 8 weeks, followed by a 4-week non-treatment period. The primary efficacy endpoint was clinical response at week 8, assessed by total Mayo score. Secondary endpoints included clinical remission, Ulcerative Colitis Endoscopic Index of Severity, quality of life, and colon AhR activity measured by cytochrome P450 1A1 (CYP1A1) RNA expression.
Ten of 11 (91%) patients, including 8/9 (89%) with moderate-to-severe disease, achieved a clinical response. All 10 of these patients had failed treatment with 5-aminosalicylic acid, 8 had failed a tumour necrosis factor (TNF)-alpha inhibitor, and 6 had failed both a TNF-alpha inhibitor and vedolizumab. Five patients were corticosteroid dependent. Clinical response was observed in all five patients who had been recommended for colectomy. Three patients achieved clinical remission. All patients experienced improved endoscopic severity and quality of life. Four weeks after treatment completion, six patients had worsened partial Mayo scores. Four patients progressed to colectomy after study completion. Colon CYP1A1 RNA expression increased 12 557-fold at week 8 among the six patients evaluated. No patient discontinued IN due to an adverse event. Concomitant administration of 3-methoxy-4-nitroflavone, an AhR antagonist, in a murine colitis model abrogated the benefit of IN.
IN is a potentially effective therapy for patients with treatment-refractory ulcerative colitis. This benefit is likely through AhR activation.
Transjugular intrahepatic portosystemic shunt (TIPS) placement is a well-established but technically challenging procedure for the management of sequelae of end-stage liver disease. Performed essentially blindly, traditional fluoroscopically guided TIPS placement requires multiple needle passes and prolonged radiation exposure to achieve successful portal venous access, thus increasing procedure time and the risk of periprocedural complications. Several advanced image-guided portal access techniques, including intracardiac echocardiography (ICE)-guided access, cone-beam CT (CBCT)-guided access and wire-targeting access techniques, can serve as alternatives to traditional CO2 portography-based TIPS creation.
A literature search was performed on the electronic databases including MEDLINE and Embase, from 2000 to the present to identify all relevant studies. The reference list also included studies identified manually, and studies referenced for other purposes.
The main benefit of these advanced access techniques is that they allow the operator to avoid essentially blind portal punctures and to visualise the target, thus reducing the number of required needle passes. Research has shown that ICE-guided access can decrease the radiation exposure, procedure time and complication rate in patients undergoing TIPS placement. This technique is particularly useful in patients with challenging portal venous anatomy. However, ICE-guided access requires additional equipment and possibly a second operator. Other studies have shown that CBCT-guided access, when compared with traditional fluoroscopy-guided access, provides superior visualisation of the anatomy with a similar amount of radiation exposure and procedure time. The wire-targeting technique, on the other hand, appears to offer reductions in procedure time and radiation exposure by enabling real-time guidance. However, this technique necessitates percutaneous injury to the liver parenchyma in order to place the target wire.
Advanced portal access techniques have certain advantages over the traditional fluoroscopically guided TIPS access. To date, few studies have compared these advanced guided access options, and further research is required.