Search results for: sensitivity analyses
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5185


4855 Multicenter Evaluation of the ACCESS HBsAg and ACCESS HBsAg Confirmatory Assays on the DxI 9000 ACCESS Immunoassay Analyzer, for the Detection of Hepatitis B Surface Antigen

Authors: Vanessa Roulet, Marc Turini, Juliane Hey, Stéphanie Bord-Romeu, Emilie Bonzom, Mahmoud Badawi, Mohammed-Amine Chakir, Valérie Simon, Vanessa Viotti, Jérémie Gautier, Françoise Le Boulaire, Catherine Coignard, Claire Vincent, Sandrine Greaume, Isabelle Voisin

Abstract:

Background: Beckman Coulter, Inc. has recently developed fully automated assays for the detection of HBsAg on a new immunoassay platform. The objective of this European multicenter study was to evaluate the performance of the ACCESS HBsAg and ACCESS HBsAg Confirmatory assays† on the recently CE-marked DxI 9000 ACCESS Immunoassay Analyzer. Methods: The clinical specificity of the ACCESS HBsAg and HBsAg Confirmatory assays was determined using HBsAg-negative samples from blood donors and hospitalized patients. The clinical sensitivity was determined using presumed HBsAg-positive samples. Sample HBsAg status was determined using a CE-marked HBsAg assay (Abbott ARCHITECT HBsAg Qualitative II, Roche Elecsys HBsAg II, or Abbott PRISM HBsAg assay) and a CE-marked HBsAg confirmatory assay (Abbott ARCHITECT HBsAg Qualitative II Confirmatory or Abbott PRISM HBsAg Confirmatory assay) according to manufacturer package inserts and pre-determined testing algorithms. False initial reactive rate was determined on fresh hospitalized patient samples. The sensitivity for the early detection of HBV infection was assessed internally on thirty (30) seroconversion panels. Results: Clinical specificity was 99.95% (95% CI, 99.86 – 99.99%) on 6047 blood donors and 99.71% (95%CI, 99.15 – 99.94%) on 1023 hospitalized patient samples. A total of six (6) samples were found false positive with the ACCESS HBsAg assay. None were confirmed for the presence of HBsAg with the ACCESS HBsAg Confirmatory assay. Clinical sensitivity on 455 HBsAg-positive samples was 100.00% (95% CI, 99.19 – 100.00%) for the ACCESS HBsAg assay alone and for the ACCESS HBsAg Confirmatory assay. The false initial reactive rate on 821 fresh hospitalized patient samples was 0.24% (95% CI, 0.03 – 0.87%). Results obtained on 30 seroconversion panels demonstrated that the ACCESS HBsAg assay had equivalent sensitivity performances compared to the Abbott ARCHITECT HBsAg Qualitative II assay with an average bleed difference since first reactive bleed of 0.13. All bleeds found reactive in ACCESS HBsAg assay were confirmed in ACCESS HBsAg Confirmatory assay. Conclusion: The newly developed ACCESS HBsAg and ACCESS HBsAg Confirmatory assays from Beckman Coulter have demonstrated high clinical sensitivity and specificity, equivalent to currently marketed HBsAg assays, as well as a low false initial reactive rate. †Pending achievement of CE compliance; not yet available for in vitro diagnostic use. 2023-11317 Beckman Coulter and the Beckman Coulter product and service marks mentioned herein are trademarks or registered trademarks of Beckman Coulter, Inc. in the United States and other countries. All other trademarks are the property of their respective owners.

Keywords: dxi 9000 access immunoassay analyzer, hbsag, hbv, hepatitis b surface antigen, hepatitis b virus, immunoassay

Procedia PDF Downloads 72
4854 Failure Load Investigations in Adhesively Bonded Single-Strap Joints of Dissimilar Materials Using Cohesive Zone Model

Authors: B. Paygozar, S.A. Dizaji

Abstract:

Adhesive bonding is a highly valued method of fastening mechanical parts in complex structures, where joining simple components is always needed. This method offers several merits, such as uniform stress distribution, appropriate bonding strength, good fatigue performance, and lightness, thereby outweighing other bonding methods. This study investigates the failure load of adhesive single-strap joints comprising adherends of different sizes and materials. This kind of adhesive joint is very practical in different industries, especially when repairing existing joints or attaching substrates of dissimilar materials. In this research, experimentally validated numerical analyses carried out in a commercial finite element package, ABAQUS, are utilized to extract the failure loads of the joints based on the cohesive zone model. In addition, stress analyses of the substrates are performed in order to determine the effects of lowering the thickness of the substrates on the stress distribution inside them, so as to avoid designs suffering from necking or failure of the adherends. It was found that this method of bonding is feasible for joining dissimilar materials and can be utilized in a variety of applications. Moreover, the stress analyses indicated the minimum thickness for the adherends required to avoid their failure.

Keywords: cohesive zone model, dissimilar materials, failure load, single strap joint

Procedia PDF Downloads 111
4853 A Proposed Optimized and Efficient Intrusion Detection System for Wireless Sensor Network

Authors: Abdulaziz Alsadhan, Naveed Khan

Abstract:

In recent years, intrusions on computer networks have become a major security threat. Hence, it is important to impede such intrusions. The hindrance of such intrusions entirely relies on their detection, which is the primary concern of any security tool like an Intrusion Detection System (IDS). Therefore, it is imperative to accurately detect network attacks. Numerous intrusion detection techniques are available, but the main issue is their performance. The performance of an IDS can be improved by increasing the accurate detection rate and reducing false positives. Existing intrusion detection techniques have the limitation of using the raw data set for classification: the classifier may be confused by redundancy, which results in incorrect classification. To minimize this problem, Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), and Local Binary Pattern (LBP) can be applied to transform the raw features into a principal feature space and select features based on their sensitivity. Eigenvalues can be used to determine the sensitivity. To further refine the selected features, greedy search, backward elimination, and Particle Swarm Optimization (PSO) can be used to obtain a subset of features with optimal sensitivity and the highest discriminatory power. This optimal feature subset is then used to perform classification. For classification, Support Vector Machine (SVM) and Multilayer Perceptron (MLP) are used due to their proven ability in classification. The Knowledge Discovery and Data Mining (KDD'99) cup dataset was considered as a benchmark for evaluating security detection mechanisms. The proposed approach can provide an optimal intrusion detection mechanism that outperforms existing approaches and has the capability to minimize the number of features and maximize the detection rates.
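
As an illustration of the pipeline described above (feature reduction into a principal space followed by classification), the following is a minimal sketch using scikit-learn; the data, feature counts, and parameter values are placeholders rather than the authors' actual KDD'99 configuration, and the PSO-based feature selection step is omitted.

```python
# Illustrative sketch: PCA-based feature reduction followed by SVM classification.
# X and y are random placeholders standing in for a KDD'99-style feature matrix and labels.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 41))    # placeholder for the 41 KDD'99 features
y = rng.integers(0, 2, size=1000)  # placeholder labels: 0 = normal, 1 = attack

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

scaler = StandardScaler().fit(X_train)
pca = PCA(n_components=10).fit(scaler.transform(X_train))  # project raw features into a principal space

Z_train = pca.transform(scaler.transform(X_train))
Z_test = pca.transform(scaler.transform(X_test))

clf = SVC(kernel="rbf", C=1.0).fit(Z_train, y_train)       # SVM classifier on the reduced features
print(classification_report(y_test, clf.predict(Z_test)))
```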

Keywords: Particle Swarm Optimization (PSO), Principal Component Analysis (PCA), Linear Discriminant Analysis (LDA), Local Binary Pattern (LBP), Support Vector Machine (SVM), Multilayer Perceptron (MLP)

Procedia PDF Downloads 352
4852 Obtaining Constants of Johnson-Cook Material Model Using a Combined Experimental, Numerical Simulation and Optimization Method

Authors: F. Rahimi Dehgolan, M. Behzadi, J. Fathi Sola

Abstract:

In this article, the constants of the Johnson-Cook material model for structural steel ST.37 have been determined by a method which integrates experimental tests, numerical simulation, and optimization. In the first step, a quasi-static test was carried out on a plain specimen. Next, the constants were calculated for it by minimizing the difference between the results acquired from the experiment and the numerical simulation. Then, a quasi-static tension test was performed on three notched specimens with different notch radii. Finally, in order to verify the results, they were used in numerical simulations of the notched specimens, and it was observed that the experimental and simulation results are in good agreement. The change in diameter of the plain specimen in the necking area was set as the objective function in the optimization step. For final validation of the proposed method, the diameter variation was considered as a parameter, its sensitivity to a change in any of the model constants was examined, and the results were fully corroborating.
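
For context, the five constants (A, B, n, C, m) identified by such a procedure enter the standard Johnson-Cook flow-stress relation, written below in its usual form (a textbook statement of the model, not an equation reproduced from this abstract):

```latex
\sigma = \left(A + B\,\varepsilon_p^{\,n}\right)
         \left(1 + C\,\ln\frac{\dot{\varepsilon}}{\dot{\varepsilon}_0}\right)
         \left(1 - \left(\frac{T - T_{\mathrm{room}}}{T_{\mathrm{melt}} - T_{\mathrm{room}}}\right)^{m}\right)
```

Here A, B and n describe strain hardening, C the strain-rate sensitivity, and m the thermal softening; in a quasi-static, room-temperature test the rate and temperature brackets are effectively unity, which is consistent with identifying the hardening constants from the plain-specimen test first.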

Keywords: constants, Johnson-Cook material model, notched specimens, quasi-static test, sensitivity

Procedia PDF Downloads 292
4851 Effect of Environmental Parameters on the Water Solubility of the Polycyclic Aromatic Hydrocarbons and Derivatives using Taguchi Experimental Design Methodology

Authors: Pranudda Pimsee, Caroline Sablayrolles, Pascale De Caro, Julien Guyomarch, Nicolas Lesage, Mireille Montréjaud-Vignoles

Abstract:

The MIGR’HYCAR research project was initiated to provide decisional tools for risks connected to oil spill drifts in continental waters. These tools aim to serve in the decision-making process once oil spill pollution occurs and/or as reference tools to study scenarios of potential impacts of pollution on a given site. This paper focuses on the study of the distribution of polycyclic aromatic hydrocarbons (PAHs) and derivatives from oil spills in water as a function of environmental parameters. Eight petroleum oils covering a representative range of commercially available products were tested. 41 polycyclic aromatic hydrocarbons (PAHs) and derivatives, among them 16 EPA priority pollutants, were studied by dynamic tests at laboratory scale. The chemical profile of the water-soluble fraction was different from the parent oil profile due to the varying water solubilities of the oil components. Semi-volatile compounds (naphthalenes) constitute the major part of the water-soluble fraction. A large variation in the composition of the water-soluble fraction was highlighted depending on oil type. Moreover, four environmental parameters (temperature, suspended solid quantity, salinity, and oil:water surface ratio) were investigated with the Taguchi experimental design methodology. The results showed that the oils are divided into three groups: the solubility of domestic fuel and Jet A1 presented high sensitivity to the parameters studied, meaning they must be taken into account; for gasoline (SP95-E10) and diesel fuel, medium sensitivity to the parameters was observed; and the four other oils showed low sensitivity to the parameters studied. Finally, three parameters were found to be significant with respect to the water-soluble fraction.
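
A minimal sketch of a Taguchi-style screening of four two-level factors is given below; the orthogonal array is a standard 2^(4-1) fractional factorial, and the factor names mirror the abstract, but the levels and responses are invented placeholders, not the study's data.

```python
# Illustrative Taguchi-style main-effects screening of the four environmental parameters.
# Design, levels and responses are placeholders for the study's actual experiments.
import numpy as np

factors = ["temperature", "suspended solids", "salinity", "oil:water ratio"]

# Two-level orthogonal array (a 2^(4-1) fractional factorial, 8 runs), levels coded 1 and 2.
design = np.array([
    [1, 1, 1, 1],
    [1, 1, 2, 2],
    [1, 2, 1, 2],
    [1, 2, 2, 1],
    [2, 1, 1, 2],
    [2, 1, 2, 1],
    [2, 2, 1, 1],
    [2, 2, 2, 2],
])

# Hypothetical measured responses (e.g., total water-soluble PAH concentration per run).
response = np.array([5.1, 6.3, 4.8, 5.5, 7.9, 7.2, 6.6, 8.1])

# Main effect of each factor = mean response at level 2 minus mean response at level 1.
for j, name in enumerate(factors):
    effect = response[design[:, j] == 2].mean() - response[design[:, j] == 1].mean()
    print(f"{name}: main effect = {effect:+.2f}")
```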

Keywords: monitoring, PAHs, water soluble fraction, SBSE, Taguchi experimental design

Procedia PDF Downloads 309
4850 A Qualitative Review and Meta-Analyses of Published Literature Exploring Rates and Reasons Behind the Choice of Elective Caesarean Section in Pregnant Women With No Contraindication to Trial of Labor After One Previous Caesarean Section

Authors: Risheka Suthantirakumar, Eilish Pearson, Jacqueline Woodman

Abstract:

Background: Previous research has found a variety of rates and reasons for choosing medically unindicated elective repeat cesarean section (ERCS). Understanding the frequency and reasoning behind ERCS, especially when unwarranted, could help healthcare professionals better tailor their advice and service. Therefore, our study conducted meta-analyses and qualitative analyses to identify the reasons and rates worldwide for choosing this procedure over trial of labor after cesarean (TOLAC), also referred to in published literature as vaginal birth after cesarean (VBAC). Methods: We conducted a systematic review of published literature available on PubMed, EMBASE, and science.gov and conducted a blinded peer review process to assess eligibility. Search terms were created in collaboration with experts in the field. Inclusion and exclusion criteria were established prior to reviewing the articles. Included studies were limited to those published in English due to author constraints, although no international boundaries were used in the search. No time limit was applied to the search, in order to portray changes over time. Results: Our qualitative analyses found five consistent themes across international studies: socioeconomic and cultural differences, previous cesarean experience, perceptions of risk with vaginal birth, patients' perceptions of future benefits, and medical advice and information. Our meta-analyses found variable rates of ERCS across international borders and within national populations. The average rate across all studies was 44% (95% CI 36-51). Discussion: The studies included in our qualitative analysis demonstrated similar recurring themes, which lends validity to the findings across the included studies. We consider the rate variation across and within national populations to be partially a result of differing inclusion and eligibility assessment between studies, and we argue that a proforma be utilized so that future research is comparable.

Keywords: elective cesarean section, VBAC, TOLAC, maternal choice

Procedia PDF Downloads 105
4849 Assessment of Seismic Behavior of Masonry Minarets by Discrete Element Method

Authors: Ozden Saygili, Eser Cakti

Abstract:

Mosques and minarets can be severely damaged as a result of earthquakes. The non-linear behavior of the minarets of the Mihrimah Sultan and Süleymaniye Mosques and the minaret of St. Sophia is analyzed to investigate the seismic response, damage and failure mechanisms of minarets during an earthquake. The selected minarets have different heights and diameters. The discrete element method was used to create the numerical minaret models. Analyses were performed using sine waves. Two parameters were used for evaluating the results: the maximum relative dislocation of adjacent drums and the maximum displacement at the top of the minaret. Both parameters were normalized by the drum diameter. The effects of minaret geometry on seismic behavior were evaluated by comparing the results of the analyses.

Keywords: discrete element method, earthquake safety, nonlinear analysis, masonry structures

Procedia PDF Downloads 300
4848 Resilient Analysis as an Alternative to Conventional Seismic Analysis Methods for the Maintenance of a Socioeconomical Functionality of Structures

Authors: Sara Muhammad Elqudah, Vigh László Gergely

Abstract:

Catastrophic events, such as earthquakes, are sudden, short, and devastating, threatening lives, demolishing futures, and causing huge economic losses. Current seismic analyses and design standards are based on life-safety levels, where only some residual strength and stiffness are left in the structure, leaving it beyond economical repair. Consequently, it has become necessary to introduce and implement the concept of resilient design. Resilient design is about designing for ductility over time by resisting, absorbing, and recovering from the effects of a hazard in an appropriate and efficient time manner while maintaining the functionality of the structure in the aftermath of the incident. Resilient analysis is mainly based on the fragility, vulnerability, and functionality curves, from which a resilience index is eventually generated; the higher this index is, the better the performance of the structure. In this paper, the seismic performance of a simple two-story reinforced concrete building, located in a moderate seismic region, has been evaluated using the conventional seismic analysis methods, which are the linear static analysis, the response spectrum analysis, and the pushover analysis, and the results of these analysis methods are compared to those of the resilient analysis. Results highlight that the resilient analysis was the most suitable method for generating a more ductile and functional structure from a socio-economic perspective, in comparison to the standard seismic analysis methods.

Keywords: conventional analysis methods, functionality, resilient analysis, seismic performance

Procedia PDF Downloads 92
4847 Assessment of Predictive Confounders for the Prevalence of Breast Cancer among Iraqi Population: A Retrospective Study from Baghdad, Iraq

Authors: Nadia H. Mohammed, Anmar Al-Taie, Fadia H. Al-Sultany

Abstract:

Although breast cancer prevalence continues to increase, mortality has been decreasing as a result of early detection and improvements in adjuvant systemic therapy. Nevertheless, this disease requires further efforts to understand and identify the associated potential risk factors that could play a role in the prevalence of this malignancy among Iraqi women. The objective of this study was to assess the impact of certain predictive risk factors on the prevalence of breast cancer types among a sample of Iraqi women diagnosed with breast cancer. This was a retrospective observational study carried out at the National Cancer Research Center, College of Medicine, Baghdad University, from November 2017 to January 2018. Data of 100 patients with breast cancer whose biopsies were examined in the National Cancer Research Center were included in this study. Data were collected to structure a detailed assessment of the patients' demographic, medical and cancer records. The majority of study participants (94%) suffered from ductal breast cancer, with a mean age of 49.57 years. Among those women, 48.9% were obese with a body mass index (BMI) of 35 kg/m2, 68.1% had a positive family history of breast cancer, and 66% had low parity. 40.4% had stage II ductal breast cancer, followed by 25.5% with stage III. It was found that 59.6% and 68.1% had positive oestrogen receptor sensitivity and positive human epidermal growth factor (HER2/neu) receptor sensitivity, respectively. With regard to the impact of certain predictive variables on the incidence of ductal breast cancer, a positive family history of breast cancer (P < 0.0001), low parity (P < 0.0001), stage I and II breast cancer (P = 0.02) and positive HER2/neu status (P < 0.0001) were significant predictive factors among the study participants. The results from this study provide relevant evidence for a significant positive association between certain risk factors and the prevalence of breast cancer among Iraqi women.

Keywords: Ductal Breast Cancer, Hormone Sensitivity, Iraq, Risk Factors

Procedia PDF Downloads 121
4846 Design and Characterization of CMOS Readout Circuit for ISFET and ISE Based Sensors

Authors: Yuzman Yusoff, Siti Noor Harun, Noor Shelida Salleh, Tan Kong Yew

Abstract:

This paper presents the design and characterization of analog readout interface circuits for ion-sensitive field effect transistor (ISFET) and ion-selective electrode (ISE) based sensors. These interface circuits are implemented using MIMOS's 0.35um CMOS technology and experimentally characterized in a 24-lead QFN package. The characterization evaluates the circuits' functionality, output sensitivity and output linearity. Commercial sensors for both ISFET and ISE are employed together with a glass reference electrode during testing. The test results show that the designed interface circuits manage to read out the signals produced by both sensors, with measured sensitivities of the ISFET and ISE sensors of 54 mV/pH and 62 mV/decade, respectively. The characterized output linearity for both circuits is above 0.999 (R-squared). The readout has also demonstrated reliable operation by passing all qualifications in the reliability test plan.

Keywords: readout interface circuit (ROIC), analog interface circuit, ion sensitive field effect transistor (ISFET), ion selective electrode (ISE), ion sensor electronics

Procedia PDF Downloads 306
4845 The Influence of Bentonite on the Rheology of Geothermal Grouts

Authors: A. N. Ghafar, O. A. Chaudhari, W. Oettel, P. Fontana

Abstract:

This study is a part of the EU project GEOCOND - Advanced materials and processes to improve performance and cost-efficiency of shallow geothermal systems and underground thermal storage. In heat exchange boreholes, to improve the heat transfer between the pipes and the surrounding ground, the space between the pipes and the borehole wall is normally filled with geothermal grout. Traditionally, bentonite has been a crucial component in most commercially available geothermal grouts to ensure the required stability and impermeability. The investigations conducted in the early stage of this project, during benchmarking tests on some commercial grouts, showed considerable sensitivity of the rheological properties of the tested grouts to the mixing parameters, i.e., mixing time and velocity. Further studies on this matter showed that bentonite, which has been one of the important constituents in most grout mixes, was probably responsible for such behavior. Apparently, a proper amount of shear should be applied during the mixing process to sufficiently activate the bentonite; the higher the applied shear, the greater the activation of the bentonite, resulting in changes in the grout rheology. This explains why, occasionally in field applications, the flow properties of commercially available geothermal grouts mixed under different conditions (mixer type, mixing time, mixing velocity) are completely different from what is expected. A series of tests were conducted on grout mixes, with and without bentonite, using different mixing protocols. The aim was to eliminate or reduce the sensitivity of the rheological properties of the geothermal grouts to the mixing parameters by replacing bentonite with polymeric (non-clay) stabilizers. The results showed that by replacing bentonite with a proper polymeric stabilizer, the sensitivity of the grout mix to mixing time and velocity was to a great extent diminished. This offers developers and producers of geothermal grouts an alternative for providing enhanced materials with less uncertainty in the results obtained in field applications.

Keywords: flow properties, geothermal grout, mixing time, mixing velocity, rheological properties

Procedia PDF Downloads 113
4844 Comparison between Deterministic and Probabilistic Stability Analysis, Featuring Consequent Risk Assessment

Authors: Isabela Moreira Queiroz

Abstract:

Slope stability analyses are largely carried out by deterministic methods and evaluated through a single safety factor. Although it is known that geotechnical parameters can present great dispersion, such analyses treat them as fixed and known. The probabilistic methods, in turn, incorporate the variability of key input parameters (random variables), resulting in a range of values of the safety factor and thus enabling the determination of the probability of failure, which is an essential parameter in the calculation of risk (probability multiplied by the consequence of the event). Among the probabilistic methods, three are frequently used in the geotechnical community: FOSM (First-Order, Second-Moment), Rosenblueth (Point Estimates) and Monte Carlo. This paper presents a comparison between the results from deterministic and probabilistic analyses (FOSM, Rosenblueth and Monte Carlo methods) applied to a hypothetical slope. The aim was to evaluate the behavior of the slope and to carry out the consequent risk analysis, which is used to calculate the risk and analyze mitigation and control solutions. It can be observed that the results obtained by the three probabilistic methods were quite close. It should be noted that the calculation of the risk makes it possible to prioritize the implementation of mitigation measures. Therefore, it is recommended to make a good assessment of the geological-geotechnical model, incorporating the uncertainty in viability, design, construction, operation and closure by means of risk management.
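
To make the deterministic/probabilistic contrast concrete, a minimal Monte Carlo sketch of the probabilistic approach is given below; the safety-factor expression, parameter distributions and values are purely illustrative and do not reproduce the hypothetical slope analysed in the paper.

```python
# Minimal Monte Carlo sketch of a probabilistic stability check.
# The safety-factor expression and the parameter distributions below are illustrative only.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Random geotechnical inputs (hypothetical means / standard deviations).
cohesion = rng.normal(25.0, 5.0, n)        # kPa
friction_angle = rng.normal(30.0, 3.0, n)  # degrees
normal_stress = rng.normal(60.0, 6.0, n)   # kPa
shear_stress = rng.normal(45.0, 5.0, n)    # kPa (driving)

# Toy Mohr-Coulomb style safety factor: resisting shear strength / driving shear stress.
fs = (cohesion + normal_stress * np.tan(np.radians(friction_angle))) / shear_stress

prob_failure = np.mean(fs < 1.0)
print(f"Mean FS = {fs.mean():.2f}, probability of failure = {prob_failure:.4f}")
# Risk = probability of failure x consequence of the event (e.g., monetary loss), as noted above.
```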

Keywords: probabilistic methods, risk assessment, risk management, slope stability

Procedia PDF Downloads 373
4843 Assessing an Instrument Usability: Response Interpolation and Scale Sensitivity

Authors: Betsy Ng, Seng Chee Tan, Choon Lang Quek, Peter Looker, Jaime Koh

Abstract:

The purpose of the present study was to determine the particular scale rating that stands out for an instrument. The instrument was designed to assess student perceptions of various learning environments, namely face-to-face, online and blended. The original instrument had 5-point Likert items (1 = strongly disagree and 5 = strongly agree). Alternate versions were modified with a 6-point Likert scale and a bar scale rating. Participants, consisting of undergraduates at a local university, were involved in the usability testing of the instrument in an electronic setting. They were presented with the 5-point, 6-point and percentage-bar (100-point) scale ratings in response to their perceptions of learning environments. The 5-point and 6-point Likert scales were presented in the form of radio button controls for each number, while the percentage-bar scale was presented with a sliding selection. Among these responses, the 6-point Likert scale emerged as the best overall. When participants were confronted with the 5-point items, they either chose 3 or 4, suggesting that data loss could occur due to the insensitivity of the instrument. The insensitivity of the instrument could be due to the discrete options, as evidenced by response interpolation. To avoid the constraint of discrete options, the percentage-bar scale rating was tested, but the participant responses were not well interpolated. The bar scale might have provided a variety of responses without the constraint of a set of categorical options, but it seemed to reflect a lack of perceived and objective accuracy. The 6-point Likert scale was more likely to reflect a respondent's perceived and objective accuracy as well as higher sensitivity. This finding supported the conclusion that 6-point Likert items provide a more accurate measure of the participant's evaluation. The 5-point and bar scale ratings might not be accurately measuring the participants' responses. This study highlighted the importance of the respondent's perception of accuracy, the respondent's true evaluation, and the scale's ease of use. Implications and limitations of this study were also discussed.

Keywords: usability, interpolation, sensitivity, Likert scales, accuracy

Procedia PDF Downloads 399
4842 Evaluation of Two DNA Extraction Methods for Minimal Porcine (Pork) Detection in Halal Food Sample Mixture Using Taqman Real-time PCR Technique

Authors: Duaa Mughal, Syeda Areeba Nadeem, Shakil Ahmed, Ishtiaq Ahmed Khan

Abstract:

The identification of porcine DNA in Halal food items is critical to ensuring compliance with dietary restrictions and religious beliefs. In Islam, pork is prohibited, as clearly mentioned in the Quran (Surah Al-Baqarah, Ayah 173). The purpose of this study was to compare two DNA extraction procedures for detecting 0.001% porcine DNA in processed Halal food sample mixtures containing chicken, camel, veal, turkey and goat meat using the TaqMan Real-Time PCR technology. In this research, two different commercial kit protocols were compared. The processed sample mixtures were prepared by spiking a known concentration of porcine DNA into non-porcine food matrices. Afterwards, the TaqMan Real-Time PCR technique was used to target a particular porcine gene in the extracted DNA samples, which were quantified after extraction. The amplification results were evaluated for sensitivity, specificity, and reproducibility. The results of the study demonstrated that both DNA extraction techniques can detect 0.01% porcine DNA in Halal food sample mixtures. However, compared to the alternative approach, the Eurofins | GeneScan GeneSpin DNA Isolation kit showed more effective sensitivity and specificity. Furthermore, the commercial kit-based approach showed great repeatability with minimal variance across repeats. Quantification of DNA was done using a fluorometric assay. In conclusion, the comparison of DNA extraction methods for detecting porcine DNA in Halal food sample mixes using the TaqMan Real-Time PCR technology reveals that the commercial kit-based approach outperforms the other method in terms of sensitivity, specificity, and repeatability. This research helps to promote the development of reliable and standardized techniques for detecting porcine DNA in Halal food items, supporting religious conformity and nutritional assurance.

Keywords: real time PCR (qPCR), DNA extraction, porcine DNA, halal food authentication, religious conformity

Procedia PDF Downloads 60
4841 Suitability Evaluation of Human Settlements Using a Global Sensitivity Analysis Method: A Case Study of China

Authors: Feifei Wu, Pius Babuna, Xiaohua Yang

Abstract:

The suitability evaluation of human settlements over time and space is essential to track potential challenges towards suitable human settlements and to provide references for policy-makers. This study established a theoretical framework of human settlements based on the nature, human, economy, society and residence subsystems. Evaluation indicators were determined with consideration of the coupling effect among subsystems. Based on the extended Fourier amplitude sensitivity test algorithm, a global sensitivity analysis that considered the coupling effect among indicators was used to determine the weights of indicators. The human settlement suitability was evaluated at both the subsystem and comprehensive system levels in 30 provinces of China between 2000 and 2016. The findings were as follows: (1) Human settlements suitability index (HSSI) values increased significantly in all 30 provinces from 2000 to 2016. Among the five subsystems, the suitability index of the residence subsystem in China exhibited the fastest growth, followed by the society and economy subsystems. (2) HSSI in eastern provinces with a developed economy was higher than that in western provinces with an underdeveloped economy. In contrast, the growth rate of HSSI in eastern provinces was significantly higher than that in western provinces. (3) The inter-provincial difference in HSSI decreased from 2000 to 2016. For the subsystems, it decreased for the residence subsystem, whereas it increased for the economy subsystem. (4) The suitability of the nature subsystem has become a limiting factor for the improvement of human settlement suitability, especially in economically developed provinces such as Beijing, Shanghai, and Guangdong. The results can be helpful in supporting decision-making and policy for improving the quality of human settlements in a broad nature, human, economy, society and residence context.
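
A minimal sketch of how extended-FAST sensitivity indices can be computed in Python is shown below, using the SALib library; the evaluation model, indicator names and bounds are placeholders, and normalising the total-order indices into indicator weights is one plausible reading of the weighting step rather than the authors' exact procedure.

```python
# Sketch of an extended-FAST (eFAST) sensitivity analysis used to derive indicator weights.
# Assumes the SALib package; the model and the indicator definitions are placeholders.
import numpy as np
from SALib.sample import fast_sampler
from SALib.analyze import fast

problem = {
    "num_vars": 3,
    "names": ["nature", "economy", "society"],   # placeholder indicator groups
    "bounds": [[0.0, 1.0]] * 3,
}

def suitability(x):
    # Placeholder aggregation with a coupling (interaction) term between indicators.
    return 0.5 * x[:, 0] + 0.3 * x[:, 1] + 0.2 * x[:, 2] + 0.4 * x[:, 0] * x[:, 1]

X = fast_sampler.sample(problem, 1000)   # eFAST sample
Y = suitability(X)
Si = fast.analyze(problem, Y)            # first-order (S1) and total-order (ST) indices

weights = np.array(Si["ST"]) / np.sum(Si["ST"])   # one way to normalise indices into weights
print(dict(zip(problem["names"], np.round(weights, 3))))
```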

Keywords: human settlements, suitability evaluation, extended fourier amplitude, human settlement suitability

Procedia PDF Downloads 68
4840 Development of a Highly Flexible, Sensitive and Stretchable Polymer Nanocomposite for Strain Sensing

Authors: Shaghayegh Shajari, Mehdi Mahmoodi, Mahmood Rajabian, Uttandaraman Sundararaj, Les J. Sudak

Abstract:

Although several strain sensors based on carbon nanotubes (CNTs) have been reported, the stretchability and sensitivity of these sensors have remained a challenge. Highly stretchable and sensitive strain sensors are in great demand for human motion monitoring and human-machine interfaces. This paper reports the fabrication and characterization of a new type of strain sensor based on a stretchable fluoropolymer/CNT nanocomposite system made via a melt-mixing technique. Electrical and mechanical characterizations were carried out. The results showed that this nanocomposite sensor has high stretchability, up to 280% strain, at an optimum level of filler concentration. The piezoresistive properties and the strain-sensing mechanism of the sensor were investigated using Electrochemical Impedance Spectroscopy (EIS). High sensitivity was obtained (gauge factor as large as 12000 under 120% applied strain), in particular at concentrations above the percolation threshold. Due to the tunneling effect, non-linear piezoresistivity was observed at high concentrations of CNT loading. The nanocomposites, with their good conductivity and light weight, could be a promising candidate for strain sensing applications.
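
For reference, the gauge factor quoted above follows the conventional definition relating relative resistance change to applied strain (a standard definition, not taken from this abstract):

```latex
GF = \frac{\Delta R / R_{0}}{\varepsilon}
```

So a gauge factor of 12000 at 120% applied strain corresponds to a relative resistance change ΔR/R0 of roughly 1.44 × 10^4.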

Keywords: carbon nanotubes, fluoropolymer, piezoresistive, strain sensor

Procedia PDF Downloads 285
4839 Cardiothoracic Ratio in Postmortem Computed Tomography: A Tool for the Diagnosis of Cardiomegaly

Authors: Alex Eldo Simon, Abhishek Yadav

Abstract:

This study aimed to evaluate the utility of postmortem computed tomography (CT) and heart weight measurements in the assessment of cardiomegaly in cases of sudden death of cardiac origin by comparing the results of these two diagnostic methods. The study retrospectively analyzed postmortem computed tomography (PMCT) data from 54 cases of sudden natural death and compared the findings with those of the autopsy. The study involved measuring the cardiothoracic ratio (CTR) from coronal CT images and determining the actual cardiac weight by weighing the heart during the autopsy. The inclusion criteria for the study were cases of sudden death suspected to be caused by cardiac pathology, while exclusion criteria included death due to unnatural causes such as trauma or poisoning, diagnosed natural causes of death related to organs other than the heart, and cases of decomposition. Sensitivity, specificity, and diagnostic accuracy were calculated, and receiver operating characteristic (ROC) curves were generated to evaluate the accuracy of using the CTR to detect an enlarged heart. The CTR is a radiological tool used to assess cardiomegaly by measuring the maximum cardiac diameter in relation to the maximum transverse diameter of the chest wall. The clinically used CTR criterion has been modified from 0.50 to 0.57 for use in postmortem settings, where abnormalities can be detected by comparing CTR values to this threshold. A CTR value of 0.57 or higher is suggestive of hypertrophy but not conclusive. Similarly, heart weight is measured during the traditional autopsy, and a cardiac weight greater than 450 grams is defined as hypertrophy. Of the 54 cases evaluated, 22 (40.7%) had a CTR ranging from above 0.50 to 0.57, and 12 cases (22.2%) had a CTR greater than 0.57, which was defined as hypertrophy. The mean CTR was calculated as 0.52 ± 0.06. Among the 54 cases evaluated, the weight of the heart was measured, and the mean was calculated as 369.4 ± 99.9 grams. Of the 54 cases evaluated, 12 were found to have hypertrophy as defined by PMCT, while only 9 cases were identified with hypertrophy at traditional autopsy. The sensitivity of the hypertrophy test was found to be 55.56% (95% CI: 26.66, 81.12), the specificity was 84.44% (95% CI: 71.22, 92.25), and the diagnostic accuracy was 79.63% (95% CI: 67.1, 88.23). A limitation of the study was the low sample size of only 54 cases, which may limit the generalizability of the findings. The comparison of the cardiothoracic ratio with heart weight in this study suggests that PMCT may serve as a screening tool for medico-legal autopsies when performed by forensic pathologists. However, it should be noted that the low sensitivity of the test (55.56%) may limit its diagnostic accuracy, and therefore further studies with larger sample sizes and more diverse populations are needed to validate these findings.
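
The sensitivity, specificity, and diagnostic accuracy quoted above follow from a standard 2x2 comparison of PMCT against autopsy; the sketch below reproduces those figures from confusion-matrix cell counts that are back-calculated to match the abstract's totals (12 PMCT-positive, 9 autopsy-positive, 54 cases), so the individual cell counts are inferred rather than reported directly.

```python
# Back-calculated 2x2 table: PMCT hypertrophy (CTR >= 0.57) vs autopsy hypertrophy (> 450 g).
# Cell counts are inferred so that totals match the abstract (12 PMCT+, 9 autopsy+, n = 54).
tp, fp = 5, 7    # PMCT positive: 12 cases
fn, tn = 4, 38   # PMCT negative: 42 cases

sensitivity = tp / (tp + fn)                # 5/9   = 55.56%
specificity = tn / (tn + fp)                # 38/45 = 84.44%
accuracy = (tp + tn) / (tp + tn + fp + fn)  # 43/54 = 79.63%

print(f"Sensitivity = {sensitivity:.2%}, Specificity = {specificity:.2%}, Accuracy = {accuracy:.2%}")
```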

Keywords: PMCT, virtopsy, CTR, cardiothoracic ratio

Procedia PDF Downloads 71
4838 Treating Voxels as Words: Word-to-Vector Methods for fMRI Meta-Analyses

Authors: Matthew Baucum

Abstract:

With the increasing popularity of fMRI as an experimental method, psychology and neuroscience can greatly benefit from advanced techniques for summarizing and synthesizing large amounts of data from brain imaging studies. One promising avenue is automated meta-analyses, in which natural language processing methods are used to identify the brain regions consistently associated with certain semantic concepts (e.g., “social”, “reward”) across large corpora of studies. This study builds on this approach by demonstrating how, in fMRI meta-analyses, individual voxels can be treated as vectors in a semantic space and evaluated for their “proximity” to terms of interest. In this technique, a low-dimensional semantic space is built from brain imaging study texts, allowing words in each text to be represented as vectors (where words that frequently appear together are near each other in the semantic space). Consequently, each voxel in a brain mask can be represented as a normalized vector sum of all of the words in the studies that showed activation in that voxel. The entire brain mask can then be visualized in terms of each voxel’s proximity to a given term of interest (e.g., “vision”, “decision making”) or collection of terms (e.g., “theory of mind”, “social”, “agent”), as measured by the cosine similarity between the voxel’s vector and the term vector (or the average of multiple term vectors). Analysis can also proceed in the opposite direction, allowing word cloud visualizations of the nearest semantic neighbors for a given brain region. This approach allows for continuous, fine-grained metrics of voxel-term associations, and relies on state-of-the-art “open vocabulary” methods that go beyond mere word-counts. An analysis of over 11,000 neuroimaging studies from an existing meta-analytic fMRI database demonstrates that this technique can be used to recover known neural bases for multiple psychological functions, suggesting this method’s utility for efficient, high-level meta-analyses of localized brain function. While automated text analytic methods are no replacement for deliberate, manual meta-analyses, they seem to show promise for the efficient aggregation of large bodies of scientific knowledge, at least on a relatively general level.
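
The core computation described here, representing a voxel as the normalised sum of word vectors from the studies activating it and scoring it against a term vector by cosine similarity, can be sketched as follows; the embedding lookup, vocabulary and study words are invented placeholders for a trained semantic space.

```python
# Sketch of voxel-term scoring: a voxel vector is the normalised sum of word vectors
# from the studies reporting activation in that voxel; proximity to a term of interest
# is the cosine similarity between the voxel vector and the term vector.
# The word-embedding lookup below is a random placeholder for a trained semantic space.
import numpy as np

rng = np.random.default_rng(1)
dim = 50
vocab = ["reward", "social", "vision", "decision", "memory", "agent", "mind"]
embedding = {w: rng.normal(size=dim) for w in vocab}   # placeholder low-dimensional space

def normalize(v):
    return v / np.linalg.norm(v)

def voxel_vector(words_in_activating_studies):
    return normalize(sum(embedding[w] for w in words_in_activating_studies))

def cosine(a, b):
    return float(np.dot(normalize(a), normalize(b)))

# Hypothetical voxel whose activating studies mention these words:
voxel = voxel_vector(["social", "agent", "mind", "decision"])

# Proximity to a single term, and to an averaged collection of terms ("theory of mind"-like).
print(cosine(voxel, embedding["social"]))
print(cosine(voxel, np.mean([embedding[w] for w in ["social", "agent", "mind"]], axis=0)))
```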

Keywords: FMRI, machine learning, meta-analysis, text analysis

Procedia PDF Downloads 435
4837 Uncovering the Relationship between EFL Students' Self-Concept and Their Willingness to Communicate in Language Classes

Authors: Seyedeh Khadijeh Amirian, Seyed Mohammad Reza Amirian, Narges Hekmati

Abstract:

The current study aims at examining the relationship between English as a foreign language (EFL) students' self-concept and their willingness to communicate (WTC) in EFL classes. To this effect, two questionnaires, namely 'Willingness to Communicate' (MacIntyre et al., 2001) and 'Self-Concept Scale' (Liu and Wang, 2005), were distributed among 174 (45 males and 129 females) Iranian EFL university students. Correlation and regression analyses were conducted to examine the relationship between the two variables. The results indicated that there was a significantly positive correlation between EFL students' self-concept and their WTC in EFL classes (p < .05). Moreover, regression analyses indicated that self-concept has a significantly positive influence on students' WTC in language classes (B = .302, p < .05) and explains 30.2% of the variance in the dependent variable (WTC). The results are discussed with regard to individual differences in educational contexts, and implications are offered.

Keywords: EFL students, language classes, willingness to communicate, self-concept

Procedia PDF Downloads 111
4836 Feasibility of Battery Electric Vehicles in Saudi Arabia: Cost and Sensitivity Analysis

Authors: Tawfiq Albishri, Abdulmajeed Alqahtani

Abstract:

Battery electric vehicles (BEVs) are increasingly seen as a sustainable alternative to internal combustion engine (ICE) vehicles, primarily due to their environmental and economic benefits. Saudi Arabia's interest in investing in renewable energy and reducing greenhouse gas emissions presents significant potential for the widespread adoption of BEVs in the country. However, several factors have hindered the adoption of BEVs in Saudi Arabia, with high ownership costs being the most prominent barrier. This cost discrepancy is primarily due to the lack of localized production of BEVs and their components, leading to increased import costs, as well as the high initial cost of BEVs compared to ICE vehicles. This paper aims to evaluate the feasibility of BEVs compared to ICE vehicles in Saudi Arabia by conducting a cost of ownership analysis. Furthermore, a sensitivity analysis will be conducted to determine the most significant contributor to the ownership costs of BEVs that, if changed, could expedite their adoption in Saudi Arabia.
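
A simplified sketch of the cost-of-ownership comparison and one-at-a-time sensitivity step described above is given below; all cost figures and parameter names are placeholders, not the study's Saudi-specific inputs.

```python
# Illustrative total-cost-of-ownership (TCO) comparison with a one-at-a-time sensitivity check.
# Every number below is a placeholder; the study's actual Saudi Arabian inputs are not reproduced.
def tco(purchase_price, annual_energy_cost, annual_maintenance, years=10, resale_fraction=0.3):
    return (purchase_price * (1 - resale_fraction)
            + years * (annual_energy_cost + annual_maintenance))

bev_base = dict(purchase_price=180_000, annual_energy_cost=1_500, annual_maintenance=1_000)
ice_base = dict(purchase_price=100_000, annual_energy_cost=4_500, annual_maintenance=2_500)

print("BEV TCO:", tco(**bev_base), " ICE TCO:", tco(**ice_base))

# One-at-a-time sensitivity: vary each BEV parameter by +/-20% and record the TCO swing.
for key in bev_base:
    low = {**bev_base, key: bev_base[key] * 0.8}
    high = {**bev_base, key: bev_base[key] * 1.2}
    print(f"{key}: TCO range {tco(**low):,.0f} - {tco(**high):,.0f}")
```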

Keywords: battery electric vehicles, internal combustion engine, renewable energy, greenhouse gas emissions, total cost of ownership

Procedia PDF Downloads 72
4835 Short and Long Term Effects of an Attachment-Based Intervention on Child Behaviors

Authors: Claire Baudry, Jessica Pearson, Laura-Emilie Savage, George Tarbulsy

Abstract:

Over the last fifty years, maternal sensitivity and child development among vulnerable families have been a priority for researchers. For this reason, attachment-based interventions have been implemented and shown to be effective in enhancing child development. Most of the time, child outcomes are measured shortly after the intervention. Objectives: The goal of the study was to investigate the effects of an attachment-based intervention on child development shortly after the intervention ended and one year post-intervention. Methods: Of the seventy-two mother-child dyads referred by Child Protective Services in the province of Québec, Canada, forty-two were included in this study: 24 dyads who received 6 to 8 intervention sessions and 18 dyads who did not. Intervention and non-intervention dyads were matched for the following variables: duration of child protective services, reason for involvement with child protection, age, sex, and family status. Internalizing and externalizing behaviors were measured 3 and 12 months after the end of the intervention, when the average ages of the children were 45 and 54 months, respectively. Findings: Independent-sample t-tests were conducted to compare scores between the two groups and the two data collection times. In general, of the differences observed between the two groups three months after the intervention ended, only a few were still present nine months later. Conclusions: This first set of analyses suggests that most of the effects of the attachment-based intervention observed three months following the intervention do not last. These results highlight the importance of considering the possibility of offering more attachment-based intervention sessions to these highly vulnerable families.

Keywords: attachment-based intervention, child behaviors, child protective services, highly vulnerable families

Procedia PDF Downloads 126
4834 Prevalence and Antibiotic Susceptibility of Bacterial Isolates from Mastitis Milk of Cow and Buffalo in Udaipur, India

Authors: Hardik Goswami, Gayatri Swarnakar

Abstract:

Mastitis has been known as one of the most costly diseases of dairy cattle and is observed as an inflammatory disease of the cow and buffalo udder. Mastitis adversely affects animal health, milk quality and the economics of milk production, causing great economic loss. Bacteria represent the most common etiological agents of mastitis. The antibiotic sensitivity test is important for attaining accurate treatment of mastitis. The aim of the present research work was to explore the prevalence and antibiotic susceptibility pattern of bacterial isolates recovered from cow and buffalo clinical mastitis milk samples. During the period of April 2010 to April 2014, a total of 1487 clinical mastitis milk samples of cows and buffaloes were tested to determine the prevalence of mastitis-causing bacterial isolates. Milk samples were collected aseptically from the udder at the time of morning milking. The most prevalent bacterial isolates were Staphylococcus aureus (24.34%), followed by coliform bacteria (15.87%), coagulase-negative Staphylococcus aureus (13.85%), non-coliform bacteria (13.05%), mixed infection (12.51%) and Streptococcus spp. (10.96%). Out of 1487 mastitis milk samples, 140 (9.42%) showed no growth on culture media. Identification of bacteria was made on the basis of standard microbial features and procedures. Antibiotic susceptibility of the bacterial isolates was investigated by the Kirby-Bauer disk diffusion method. The in vitro antibiotic susceptibility test of the bacterial isolates revealed higher sensitivity to Gentamicin (74.6%), Ciprofloxacin (62.1%) and Amikacin (59.4%). Lower susceptibility was shown to Amoxicillin (21.6%), Erythromycin (26.4%) and Ceftizoxime (29.9%). The antibiotic sensitivity pattern revealed that Gentamicin is a possibly effective antibiotic against the major prevalent mastitis pathogens. The present research work would be helpful in increasing the production, quality and quantity of milk, increasing the annual income of dairy owners and improving the health of cows and buffaloes.

Keywords: antibiotic, buffalo, cow, mastitis, prevalence

Procedia PDF Downloads 389
4833 Determination of Nanomolar Mercury (II) by Using Multi-Walled Carbon Nanotubes Modified Carbon Zinc/Aluminum Layered Double Hydroxide – 3 (4-Methoxyphenyl) Propionate Nanocomposite Paste Electrode

Authors: Illyas Md Isa, Sharifah Norain Mohd Sharif, Norhayati Hashima

Abstract:

A mercury(II) sensor was developed using a multi-walled carbon nanotube (MWCNT) paste electrode modified with Zn/Al layered double hydroxide-3(4-methoxyphenyl)propionate nanocomposite (Zn/Al-HMPP). The optimum conditions by cyclic voltammetry were observed at an electrode composition of 2.5% (w/w) Zn/Al-HMPP/MWCNTs, 0.4 M potassium chloride, pH 4.0, and a scan rate of 100 mVs-1. The sensor exhibited a wide linear range, from 1x10-3 M to 1x10-7 M Hg2+ and 1x10-7 M to 1x10-9 M Hg2+, with a detection limit of 1x10-10 M Hg2+. The high sensitivity of the proposed electrode towards Hg(II) was confirmed by double potential-step chronocoulometry, which indicated the following values: diffusion coefficient 1.5445 x 10-9 cm2 s-1, surface charge 524.5 µC s-½ and surface coverage 4.41 x 10-2 mol cm-2. The presence of 25-fold concentrations of most metal ions had no influence on the anodic peak current. With characteristics such as high sensitivity, selectivity and repeatability, the electrode was proposed as an appropriate alternative for the determination of mercury(II).

Keywords: cyclic voltammetry, mercury(II), modified carbon paste electrode, nanocomposite

Procedia PDF Downloads 298
4832 From Data Processing to Experimental Design and Back Again: A Parameter Identification Problem Based on FRAP Images

Authors: Stepan Papacek, Jiri Jablonsky, Radek Kana, Ctirad Matonoha, Stefan Kindermann

Abstract:

FRAP (Fluorescence Recovery After Photobleaching) is a widely used measurement technique to determine the mobility of fluorescent molecules within living cells. While the experimental setup and protocol for FRAP experiments are usually fixed, the data processing part is still under development. In this paper, we formulate and solve the problem of data selection which enhances the processing of FRAP images. We introduce the concept of the irrelevant data set, i.e., the data which contribute almost nothing to reducing the confidence interval of the estimated parameters and thus could be neglected. Based on sensitivity analysis, we both solve the problem of optimal data space selection and find specific conditions for optimizing an important experimental design factor, e.g., the radius of the bleach spot. Finally, a theorem establishing the lower precision of the integrated data approach compared to the full data case is proven; i.e., we claim that the data set represented by the FRAP recovery curve leads to a larger confidence interval compared to the spatio-temporal (full) data.

Keywords: FRAP, inverse problem, parameter identification, sensitivity analysis, optimal experimental design

Procedia PDF Downloads 264
4831 Frequency of Hepatitis C Virus in Diagnosed Tuberculosis Cases

Authors: Muhammad Farooq Baig, Saleem Qadeer

Abstract:

Background: The frequency of hepatitis C virus infection along with tuberculosis has not been widely investigated, and there are very few statistics on rates of hepatitis C virus co-infection in tuberculosis patients. Hepatotoxicity is the major side effect of anti-tuberculosis therapy, and HCV-related liver disease elevates the chances of hepatotoxicity up to five-fold. Objectives & Aim: To determine the frequency of hepatitis C virus infection among people with tuberculosis diagnosed using the GeneXpert technique, to evaluate the factors associated with HCV infection in patients with MTB tuberculosis, and to determine the sensitivity and specificity of the tests. Study design: Comparative analytical study. Methodology: Three hundred and thirteen patients with tuberculosis diagnosed by GeneXpert were included; hepatitis C virus was tested using the immunochromatography rapid test technique and the enzyme-linked immunosorbent assay method, with the polymerase chain reaction test for confirmation. Results: There was a higher frequency of tuberculosis infection in males (57.8%), 42.5% of patients were between 20-39 years of age, and 22% of tuberculosis patients had hepatitis C virus infection. The sensitivity of the rapid test and the enzyme-linked immunosorbent assay was 79% and 96%, respectively, while the specificity of the rapid test and the enzyme-linked immunosorbent assay was 91% and 99%, respectively.

Keywords: Mycobacterium tuberculosis, PCR, GeneXpert, hepatitis C virus

Procedia PDF Downloads 58
4830 Real-Time Quantitative Polymerase Chain Reaction Assay for the Detection of microRNAs Using Bi-Directional Extension Sequences

Authors: Kyung Jin Kim, Jiwon Kwak, Jae-Hoon Lee, Soo Suk Lee

Abstract:

MicroRNAs (miRNA) are a class of endogenous, single-stranded, small, non-protein-coding RNA molecules typically 20-25 nucleotides long. They are thought to regulate the expression of a broad range of other genes by binding to the 3’-untranslated regions (3’-UTRs) of specific mRNAs. The detection of miRNAs is very important for understanding the function of these molecules and for the diagnosis of a variety of human diseases. However, detection of miRNAs is very challenging because of their short length and the high sequence similarities within miRNA families. So, a simple-to-use, low-cost, and highly sensitive method for the detection of miRNAs is desirable. In this study, we demonstrate a novel bi-directional extension (BDE) assay. In the first step, a specific linear RT primer is hybridized to 6-10 base pairs from the 3’-end of a target miRNA molecule and then reverse transcribed to generate a cDNA strand. After reverse transcription, the cDNA is hybridized at its 3’-end to the BDE sequence, which serves as the PCR template. The PCR template was amplified in a SYBR green-based quantitative real-time PCR. To prove the concept, we used human brain total RNA. It could be detected quantitatively over a range of seven orders of magnitude with excellent linearity and reproducibility. To evaluate the performance of the BDE assay, we contrasted the sensitivity and specificity of the BDE assay against a commercially available poly(A) tailing method using the miRNA let-7e extracted from A549 human epithelial lung cancer cells. The BDE assay displayed good performance compared with the poly(A) tailing method in terms of specificity and sensitivity; the CT values differed by 2.5, and the melting curve was sharper than that of the poly(A) tailing method. We have demonstrated an innovative, cost-effective BDE assay that allows improved sensitivity and specificity in the detection of miRNAs. The dynamic range of the SYBR green-based RT-qPCR for miR-145 could be represented quantitatively over a range of 7 orders of magnitude, from 0.1 pg to 1.0 μg of human brain total RNA. Finally, the BDE assay for the detection of miRNA species such as let-7e shows good performance compared with the poly(A) tailing method in terms of specificity and sensitivity. Thus, BDE proves to be a simple, low-cost, and highly sensitive assay for various miRNAs and should provide significant contributions to research on miRNA biology and the application of disease diagnostics with miRNAs as targets.

Keywords: bi-directional extension (BDE), microRNA (miRNA), poly (A) tailing assay, reverse transcription, RT-qPCR

Procedia PDF Downloads 155
4829 Modeling Heat-Related Mortality Based on Greenhouse Emissions in OECD Countries

Authors: Anderson Ngowa Chembe, John Olukuru

Abstract:

Greenhouse gas emissions from human activities are known to irreversibly increase global temperatures through the greenhouse effect. This study proposes a mortality model with sensitivity to heat-change effects as one of its underlying parameters. As such, the study sought to establish the relationship between greenhouse emissions and mortality indices in five OECD countries (USA, UK, Japan, Canada & Germany). Upon establishing the relationship using correlation analysis, an additional parameter that accounts for the sensitivity of mortality rates to heat changes was incorporated into the Lee-Carter model. Based on the proposed model, new parameter estimates were calculated using iterative optimization algorithms. Finally, the goodness of fit of the original Lee-Carter model and of the proposed model was compared using deviance comparison. The proposed model provides a better fit to mortality rates, especially in the USA, UK and Germany, where the mortality indices have a strong positive correlation with the level of greenhouse emissions. The results of this study are of particular importance to actuaries, demographers and climate-risk experts who seek to use better mortality-modeling techniques in the wake of heat effects caused by increased greenhouse emissions.
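
For reference, the baseline structure being extended here is the standard Lee-Carter decomposition of log-mortality; the added heat-sensitivity term is written below only as a generic illustration of this kind of extension, not as the authors' exact specification:

```latex
\ln m_{x,t} = a_x + b_x k_t + \varepsilon_{x,t}
\quad\longrightarrow\quad
\ln m_{x,t} = a_x + b_x k_t + g_x h_t + \varepsilon_{x,t}
```

Here m(x,t) is the central death rate at age x in year t, a_x and b_x are the age-specific level and sensitivity terms, k_t is the period mortality index, and h_t (with age-specific loading g_x) stands for a hypothetical heat- or emissions-related covariate.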

Keywords: climate risk, greenhouse emissions, Lee-Carter model, OECD

Procedia PDF Downloads 330
4828 Proposed Model to Assess E-Government Readiness in Jordan

Authors: Hadeel Abdulatif, Maha Alkhaffaf

Abstract:

E-government is the use of Information and Communication Technology to enrich the access to and delivery of government services to citizens, business partners and employees. Policy makers and regulatory bodies have to be cognizant of the degree of readiness of a populace in order to design and implement efficient e-government programs. This paper aims to provide a transparent situation analysis for the case of the official e-government website in Jordan; it focuses on assessing e-government in Jordan through website assessment using international criteria for assessing e-government websites. In addition, the study analyses the environmental factor, consisting of cultural and business environment factors. By reviewing the literature, the researchers found that a government's efforts towards e-government may vary according to the country's readiness and other key implementation factors, which leads to diverse e-government experiences; thus, there is a need to study the impact of key factors on implementing e-government in Jordan.

Keywords: e-government, environmental factors, website assessment, readiness

Procedia PDF Downloads 283
4827 Disclosure on Adherence of the King Code's Audit Committee Guidance: Cluster Analyses to Determine Strengths and Weaknesses

Authors: Philna Coetzee, Clara Msiza

Abstract:

In modern society, audit committees are seen as the custodians of accountability and the conscience of management and the board. But who holds the audit committee accountable for its actions or non-actions, and how do we know what it is supposed to be doing and what it is actually doing? The purpose of this article is to provide greater insight into the latter part of this problem, namely, to determine what the best practices for audit committees are and what the disclosed realities are. In countries where governance is well established, the roles and responsibilities of the audit committee are mostly clearly guided by legislation and/or guidance documents, with countries increasingly providing guidance on this topic. With the high cost involved in adhering to governance guidelines, the public (for public organisations) and shareholders (for private organisations) expect to see the value of their ‘investment’. For audit committees, the dividends on the investment should be reflected in fewer fraudulent activities, less corruption, higher efficiency and effectiveness, improved social and environmental impact, and increased profits, to name a few. If this is not the case (which is reflected in the number of fraudulent activities in both the private and the public sector), stakeholders have the right to ask: where was the audit committee? Therefore, the objective of this article is to contribute to the body of knowledge by comparing the adherence of audit committees to the best-practice guidelines stipulated in the King Report across public listed companies, national and provincial government departments, state-owned enterprises and local municipalities. After constructs were formed based on the literature, factor analyses were conducted to reduce the number of variables in each construct. Thereafter, cluster analyses, an explorative analysis technique that classifies a set of objects in such a way that more similar objects are grouped together, were conducted. The SPSS TwoStep Clustering Component was used, as it is capable of handling both continuous and categorical variables. In the first step, a pre-clustering procedure clusters the objects into small sub-clusters, after which it clusters these sub-clusters into the desired number of clusters. The cluster analyses were conducted for each construct, and the measure, namely the audit opinion as listed in the external audit report, was included. Analysing 228 organisations' information, the results indicate that there is a clear distinction between the four spheres of business included in the analyses, indicating certain strengths and certain weaknesses within each sphere. The results may provide the overseers of audit committees with insight into where a specific sector's strengths and weaknesses lie. Audit committee chairs will be able to improve the areas where their audit committee is lagging behind. The strengthening of audit committees should result in improved accountability of boards, leading to less fraud and corruption.
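
As an illustration of the two-stage approach (factor-style reduction of construct scores followed by clustering of mixed-type data), the sketch below uses scikit-learn with one-hot encoding of a categorical audit-opinion variable as a stand-in for the SPSS TwoStep procedure; the variables and data are invented for illustration and only the sample size mirrors the abstract.

```python
# Illustrative stand-in for the analysis pipeline: reduction of continuous construct scores,
# then clustering with the categorical audit opinion included via one-hot encoding.
# (SPSS TwoStep handles mixed variables natively; this sklearn sketch only approximates that.)
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler, OneHotEncoder
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
n = 228  # number of organisations analysed in the abstract

df = pd.DataFrame({
    "construct_1": rng.normal(size=n),                 # placeholder adherence scores
    "construct_2": rng.normal(size=n),
    "construct_3": rng.normal(size=n),
    "audit_opinion": rng.choice(["unqualified", "qualified", "adverse"], size=n),
})

# Reduce the continuous construct scores (stand-in for the factor analysis step).
scores = PCA(n_components=2).fit_transform(
    StandardScaler().fit_transform(df[["construct_1", "construct_2", "construct_3"]])
)

# One-hot encode the categorical audit opinion and cluster the combined matrix.
opinion = OneHotEncoder().fit_transform(df[["audit_opinion"]]).toarray()
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(np.hstack([scores, opinion]))
print(pd.Series(labels).value_counts())
```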

Keywords: audit committee disclosure, cluster analyses, governance best practices, strengths and weaknesses

Procedia PDF Downloads 150
4826 A Novel Nano-Chip Card Assay as Rapid Test for Diagnosis of Lymphatic Filariasis Compared to Nano-Based Enzyme Linked Immunosorbent Assay

Authors: Ibrahim Aly, Manal Ahmed, Mahmoud M. El-Shall

Abstract:

Filariasis is a parasitic disease caused by small roundworms. The filarial worms are transmitted and spread by blood-feeding black flies and mosquitoes. Lymphatic filariasis (elephantiasis) is caused by Wuchereria bancrofti, Brugia malayi, and Brugia timori. Elimination of lymphatic filariasis creates an increasing demand for valid, reliable, and rapid diagnostic kits. Nanodiagnostics involve the use of nanotechnology in clinical diagnosis to meet the demands for increased sensitivity, specificity, and early detection in less time. The aim of this study was to evaluate a nano-based enzyme-linked immunosorbent assay (ELISA) and a novel nano-chip card as a rapid test for the detection of filarial antigen in serum samples of human filariasis, in comparison with traditional ELISA. Serum samples were collected from humans infected with filariasis across Egypt's governorates. After receiving informed consent, a total of 45 blood samples from infected individuals residing in different villages in Gharbea governorate, which is a nonendemic region for bancroftian filariasis, samples from healthy persons living in nonendemic locations (20 persons), as well as sera from 20 patients affected by other parasites, were collected. Microfilariae were checked in thick smears of 20 µl night blood samples collected between 20:00 and 22:00 hrs. All of these individuals underwent the following procedures: history taking, clinical examination, and laboratory investigations, which included examination of blood samples for microfilariae using thick blood films and serological tests for detection of the circulating filarial antigen using polyclonal antibody ELISA, nano-based ELISA, and the nano-chip card. In the present study, a recently reported polyclonal antibody specific to tegumental filarial antigen was used in developing the nano-chip card and nano-ELISA, compared to traditional ELISA, for the detection of circulating filarial antigen in sera of patients with bancroftian filariasis. The performance of the ELISA was evaluated using 45 serum samples. The ELISA was positive with sera from microfilaremic bancroftian filariasis patients (n = 36), with a sensitivity of 80%. Circulating filarial antigen was detected in 39/45 patients using nano-ELISA, with a sensitivity of 86.6%. On the other hand, 42 out of 45 patients were positive for circulating filarial antigen using the nano-chip card, with a sensitivity of 93.3%. In conclusion, the novel nano-chip assay could potentially be a promising alternative antigen detection test for bancroftian filariasis.

Keywords: lymphatic filariasis, nanotechnology, rapid diagnosis, elisa technique

Procedia PDF Downloads 104