Search results for: sensory processing sensitivity
1794 Bacteriological Safety of Sachet Drinking Water Sold in Benin City, Nigeria
Authors: Stephen Olusanmi Akintayo
Abstract:
Access to safe drinking water remains a major challenge in Nigeria, and where available, the quality of the water is often in doubt. An alternative to the inadequate supply of clean drinking water has been found in treated drinking water packaged in electrically heat-sealed nylon, commonly referred to as “sachet water”. “Sachet water” is common in Nigeria, as its selling price is within the reach of members of the low socio-economic class and setting up a production unit does not require a huge capital input. The bacteriological quality of selected “sachet water” stored at room temperature over a period of 56 days was determined to evaluate the safety of the sachet drinking water. A test for the detection of coliform bacteria was performed, and the result showed no coliform bacteria, indicating the absence of fecal contamination throughout the 56 days. Heterotrophic plate count (HPC) was done at intervals of 14 days, and the samples showed HPC between 0 cfu/mL and 64 cfu/mL. The highest count was observed on day 1. The count decreased between days 1 and 28, while no growth was observed between days 42 and 56. The decrease in HPC suggested the presence of residual disinfectant in the water. The organisms isolated were identified as Staphylococcus epidermidis and S. aureus. The presence of these microorganisms in sachet water is indicative of contamination during processing and handling.
Keywords: coliform, heterotrophic plate count, sachet water, Staphylococcus aureus, Staphylococcus epidermidis
Procedia PDF Downloads 341
1793 Carthage-Burned and Rome-Reiterative: Mirrored Distortions of Imperial Trauma and Historiography
Authors: Sarah H. Davies
Abstract:
In the year 146 BCE, the Roman general Scipio Aemilianus – soon to be ‘anointed,’ via mass-spilling of blood-on-land, as “(‘triumphal’) Africanus” – stood atop a hill, overlooking the city of Carthage, as its urban-scape was burned and people killed, violated, captured… ‘poetically’ consumed. From an ineffable-seeming distance – constructed, in imperial fascination – the scene was (and is, apparently) painted in a disturbingly ‘romantic’ light. Such a snap-shot vista, projected before a mind’s-eye in panorama, and in (ongoing) construction, has seeped across ancient and modern lines, with multiple, interwoven iterations. This study conducts a reading, both ‘postcolonial’ and anti-imperial, in interruption of an ongoing (re)iteration of imperial violence, mirrored in distortion between “ancient” and “modern” forms that are physical, ideological, and ontological. Using an analysis of ancient literary works, from the historiographical (Polybius’ Histories) to the epic-poetic (Vergil’s Aeneid), placed in juxtaposition with a range of modern material, both literary-historical (e.g., Gibbon’s Decline & Fall of the Roman Empire) and visual (Cole’s The Course of Empire), this study destabilizes ongoing formations. Such formations attempt to inflict ‘an assumed’ repetition, engaged in normalizing a city violently destroyed as somehow ‘natural’ and/or ‘inevitable,’ and by extension, ‘tragically necessary.’ The reiterations – across media and contexts – create a distorted aesthetic (itself an act of profound violence) that fetishizes and even produces sensory, illusory pleasures (of co-complicit harm, within and across communities) regarding ‘period-shifting events’ of mass-murder and cultural erasure. ‘The vista over Carthage burning’ was/is (but does not ever have to be) thereby a manufactured stage-set, a commodity for imperial reproduction. Such a projection frames an overly-simplistic, ‘safe’-seeming (and yet incredibly dangerous) binary regarding (caricatured) “victims” and “victors.” At the same time, the projection renders an epistemological frame whereby ‘The One’ and ‘The Other’ are asserted as inherently antagonistic categories of being, in which One ‘must’ replace Other – the latter portrayed in gendered, exoticized, and time-distorted ways, as a scripted-object. All the while, a very particular subset of narrative is woven, whereby Carthage (elided in ‘victim’ status) specifically is/was Troy (again, elided), is/was every ‘destroyed city’ (also elided), and is/was yet another essential marking-point of “History,” twisted into ‘becoming’ a ‘reset’ point in a ‘cyclical pattern,’ inscribed as a tragic plot or lifetime repeated. The script itself entails pervasive violence. And yet, there always remains a trip-wire written into the constructed-cyclical. In part, this realization comes from a deconstruction of the tiered violences of an over-worn trope. The realization then also comes from a revelation of erased realities of human-experiences, in which ‘victim’ and ‘victor’ suffer, in fractured differences of ongoing, system(at)ic (re)trauma. The contours and silences of the historical records contain all the ongoing scars. This study therefore unravels the intersectional tableaux of ‘Carthage-burning’ and ‘Rome-reiterative,’ providing a collective investigation into conceptual formations, fractured across millennia. 
Ultimately, perhaps, such a re-reading – occurring via a commodified past – will echo words from the Aeneid: “perhaps, once upon a time, to have remembered even these things, it will have been healing.”
Keywords: antiquity, carthage, empire, historiography, rome, ruination
Procedia PDF Downloads 18
1792 Algorithm Development of Individual Lumped Parameter Modelling for Blood Circulatory System: An Optimization Study
Authors: Bao Li, Aike Qiao, Gaoyang Li, Youjun Liu
Abstract:
Background: The lumped parameter model (LPM) is a common numerical model for hemodynamic calculation. The LPM uses circuit elements to simulate the human blood circulatory system. Physiological indicators and characteristics can be acquired through the model. However, because physiological indicators differ between individuals, the parameters in the LPM should be personalized in order to obtain convincing calculated results that reflect individual physiological information. This study aimed to develop an automatic and effective optimization method to personalize the parameters in the LPM of the blood circulatory system, which is of great significance to the numerical simulation of individual hemodynamics. Methods: A closed-loop LPM of the human blood circulatory system that is applicable to most persons was established based on anatomical structures and physiological parameters. The patient-specific physiological data of 5 volunteers were non-invasively collected as personalization objectives for the individual LPM. In this study, the blood pressure and flow rate of the heart, brain, and limbs were the main concerns. The collected systolic blood pressure, diastolic blood pressure, cardiac output, and heart rate were set as objective data, and the waveforms of carotid artery flow and ankle pressure were set as objective waveforms. With respect to the collected data and waveforms, a sensitivity analysis of each parameter in the LPM was conducted to determine the sensitive parameters that have an obvious influence on the objectives. Simulated annealing was adopted to iteratively optimize the sensitive parameters, and the objective function during optimization was the root mean square error between the collected and simulated waveforms and data. Each parameter in the LPM was optimized 500 times. Results: In this study, the sensitive parameters in the LPM were optimized according to the collected data of 5 individuals. The results show only a slight error between the collected and simulated data. The average relative root mean square errors of all optimization objectives for the 5 samples were 2.21%, 3.59%, 4.75%, 4.24%, and 3.56%, respectively. Conclusions: The slight errors demonstrate the good effect of the optimization. The individual modeling algorithm developed in this study can effectively achieve the individualization of the LPM for the blood circulatory system. The LPM with individual parameters can output the individual physiological indicators after optimization, which are applicable to the numerical simulation of patient-specific hemodynamics.
Keywords: blood circulatory system, individual physiological indicators, lumped parameter model, optimization algorithm
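As a rough illustration of the personalization step described in this abstract, the Python sketch below fits two parameters of a toy two-element Windkessel model (a stand-in for the authors' full closed-loop LPM) to target systolic/diastolic pressures by simulated annealing with an RMSE objective; the toy model, parameter names and values are illustrative assumptions, not the published implementation.

```python
import numpy as np

def windkessel_pressures(params, heart_rate=75.0, stroke_volume=70.0):
    """Toy 2-element Windkessel used as a stand-in for the closed-loop LPM:
    returns (systolic, diastolic) pressure in mmHg for peripheral resistance
    R [mmHg*s/mL] and arterial compliance C [mL/mmHg] (both hypothetical)."""
    R, C = params
    period = 60.0 / heart_rate
    dt = 1e-3
    t = np.arange(0.0, 8 * period, dt)
    phase, t_ej = t % period, period / 3.0
    # Half-sine ejection flow whose integral over each beat equals the stroke volume
    q = np.where(phase < t_ej,
                 np.pi * stroke_volume / (2 * t_ej) * np.sin(np.pi * phase / t_ej), 0.0)
    p = np.empty_like(t)
    p[0] = 80.0
    for i in range(1, t.size):          # explicit Euler integration of C*dp/dt = q - p/R
        p[i] = p[i - 1] + dt * (q[i - 1] - p[i - 1] / R) / C
    last_beat = t >= t[-1] - period
    return p[last_beat].max(), p[last_beat].min()

def simulated_annealing(target, x0, n_iter=300, t0=5.0, cooling=0.98, seed=0):
    """Minimise the RMSE between simulated and measured (systolic, diastolic) pressures."""
    rng = np.random.default_rng(seed)
    rmse = lambda x: np.sqrt(np.mean((np.array(windkessel_pressures(x)) - target) ** 2))
    current = np.array(x0, dtype=float)
    current_cost = rmse(current)
    best, best_cost, temp = current.copy(), current_cost, t0
    for _ in range(n_iter):
        candidate = current * (1.0 + 0.1 * rng.standard_normal(current.size))
        cost = rmse(candidate)
        # Always accept improvements; accept worse moves with Boltzmann probability
        if cost < current_cost or rng.random() < np.exp((current_cost - cost) / temp):
            current, current_cost = candidate, cost
            if cost < best_cost:
                best, best_cost = candidate.copy(), cost
        temp *= cooling
    return best, best_cost

params, err = simulated_annealing(target=np.array([120.0, 80.0]), x0=[1.0, 1.5])
print(f"fitted R = {params[0]:.2f} mmHg*s/mL, C = {params[1]:.2f} mL/mmHg, RMSE = {err:.2f} mmHg")
```

In the full model, the same loop would perturb only the sensitive parameters identified by the sensitivity analysis and would compare complete carotid-flow and ankle-pressure waveforms rather than two scalar targets.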
Procedia PDF Downloads 137
1791 Effect of Cuminum Cyminum L. Essential Oil on Staphylococcus Aureus during the Manufacture, Ripening and Storage of White Brined Cheese
Authors: Ali Misaghi, Afshin Akhondzadeh Basti, Ehsan Sadeghi
Abstract:
Staphylococcus aureus is a pathogen of major concern for clinical infection and food-borne illness. Humans and most domesticated animals harbor S. aureus, and so we may expect staphylococci to be present in food products of animal origin or in those handled directly by humans, unless heat processing is applied to destroy them. Cuminum cyminum L. has been the topic of several recent studies, in addition to its well-documented traditional usage for the treatment of toothache, dyspepsia, diarrhea, epilepsy and jaundice. The air-dried seed of the plant was completely immersed in water and subjected to hydrodistillation for 3 h, using a Clevenger-type apparatus. In this study, the effect of Cuminum cyminum L. essential oil (EO) on the growth of Staphylococcus aureus in white brined cheese was evaluated. The experiment included different levels of EO (0, 7.5, 15 and 30 mL/100 mL milk) to assess their effects on the S. aureus count during the manufacture, ripening and storage of Iranian white brined cheese for up to 75 days. Significant (P < 0.05) inhibitory effects of the EO (even at its lowest concentration) on this organism were observed. The significant (P < 0.05) inhibitory effect of the EO on S. aureus shown in this study may broaden the scope of EO applications in the food industry.
Keywords: cuminum cyminum L. essential oil, staphylococcus aureus, white brined cheese
Procedia PDF Downloads 389
1790 Performance Comparison of Thread-Based and Event-Based Web Servers
Authors: Aikaterini Kentroti, Theodore H. Kaskalis
Abstract:
Today, web servers are expected to serve thousands of client requests concurrently within stringent response time limits. In this paper, we experimentally evaluate and compare the performance as well as the resource utilization of popular web servers, which differ in their approach to handling concurrency. More specifically, Central Processing Unit (CPU)- and I/O-intensive tests were conducted against the thread-based Apache and Go as well as the event-based Nginx and Node.js under increasing concurrent load. The tests involved concurrent users requesting a term of the Fibonacci sequence (the 10th, 20th, or 30th) and the content of a table from a database. The results show that Go achieved the best performance in all benchmark tests. For example, Go reached two times higher throughput than Node.js and five times higher than Apache and Nginx in the 20th Fibonacci term test. In addition, Go had the smallest memory footprint and demonstrated the most efficient resource utilization in terms of CPU usage. In contrast, Node.js had by far the largest memory footprint, consuming up to 90% more memory than Nginx and Apache. Regarding the performance of Apache and Nginx, our findings indicate that Hypertext Preprocessor (PHP) becomes a bottleneck when the servers are requested to respond by performing CPU-intensive tasks under increasing concurrent load.
Keywords: apache, Go, Nginx, node.js, web server benchmarking
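A minimal Python load-generation sketch of the kind of concurrency test described above is shown below; the endpoint URL, user count and request count are hypothetical placeholders, and the script assumes the server under test exposes a Fibonacci endpoint.

```python
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URL = "http://localhost:8080/fib/20"   # hypothetical endpoint returning the 20th Fibonacci term
CONCURRENCY = 100                      # simulated concurrent users
REQUESTS = 2000                        # total requests per run

def one_request(_):
    """Issue a single GET request and return its latency in seconds."""
    start = time.perf_counter()
    with urlopen(URL) as response:
        response.read()
    return time.perf_counter() - start

def run_benchmark():
    t0 = time.perf_counter()
    with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
        latencies = sorted(pool.map(one_request, range(REQUESTS)))
    elapsed = time.perf_counter() - t0
    print(f"throughput      : {REQUESTS / elapsed:.1f} req/s")
    print(f"median latency  : {1000 * latencies[len(latencies) // 2]:.1f} ms")
    print(f"95th percentile : {1000 * latencies[int(0.95 * len(latencies))]:.1f} ms")

if __name__ == "__main__":
    run_benchmark()
```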
Procedia PDF Downloads 97
1789 Low Light Image Enhancement with Multi-Stage Interconnected Autoencoders Integration in Pix to Pix GAN
Authors: Muhammad Atif, Cang Yan
Abstract:
The enhancement of low-light images is a significant area of study aimed at improving the quality of images captured in challenging lighting environments. Recently, methods based on convolutional neural networks (CNN) have gained prominence, as they offer state-of-the-art performance. However, many CNN-based approaches rely on increasing the size and complexity of the neural network. In this study, we propose an alternative method for improving low-light images using an autoencoder-based multiscale knowledge transfer model. Our method leverages the power of three autoencoders, where the encoders of the first two autoencoders are directly connected to the decoder of the third autoencoder. Additionally, the decoders of the first two autoencoders are connected to the encoder of the third autoencoder. This architecture enables effective knowledge transfer, allowing the third autoencoder to learn and benefit from the enhanced knowledge extracted by the first two autoencoders. We further integrate the proposed model into the Pix2Pix GAN framework. By integrating our proposed model as the generator in the GAN framework, we aim to produce enhanced images that not only exhibit improved visual quality but also possess a more authentic and realistic appearance. The experimental results, both qualitative and quantitative, show that our method outperforms state-of-the-art methodologies.
Keywords: low light image enhancement, deep learning, convolutional neural network, image processing
Procedia PDF Downloads 80
1788 Laser Based Microfabrication of a Microheater Chip for Cell Culture
Authors: Daniel Nieto, Ramiro Couceiro
Abstract:
Microfluidic chips have demonstrated significant application potential in microbiological processing and chemical reactions, with the goal of developing monolithic and compact chip-sized multifunctional systems. Heat generation and thermal control are critical in some of these biochemical processes. This paper presents a laser direct-write technique for the rapid prototyping and manufacturing of microheater chips and its applicability to perfusion cell culture outside a cell incubator. The aim of the microheater is to take over the role of conventional incubators for cell culture, facilitating microscopic observation or other online monitoring activities during cell culture and providing portability of the cell culture operation. Microheaters (5 mm × 5 mm) have been successfully fabricated on soda-lime glass substrates covered with an aluminum layer of 120 nm thickness. Experimental results show that the microheaters exhibit good performance in temperature rise and decay characteristics, with localized heating at targeted spatial domains. These microheaters were suitable for a maximum long-term operation temperature of 120 °C and were validated for long-term operation at 37 °C for 24 hours. Results demonstrated that the physiology of the SW480 colon adenocarcinoma cell line cultured on the developed microheater chip was consistent with that in an incubator.
Keywords: laser microfabrication, microheater, bioengineering, cell culture
Procedia PDF Downloads 297
1787 Wear Assessment of SS316l-Al2O3 Composites for Heavy Wear Applications
Authors: Catherine Kuforiji, Michel Nganbe
Abstract:
The abrasive wear of composite materials is a major challenge in highly demanding wear applications. Therefore, this study focuses on fabricating, testing and assessing the properties of 50 wt.% SS316L stainless steel–50 wt.% Al2O3 particle composites. Composite samples were fabricated using the powder metallurgy route. The effects of the powder metallurgy processing parameters and hard particle reinforcement were studied. The microstructure, density, hardness and toughness were characterized. The wear behaviour was studied using pin-on-disc testing under dry sliding conditions. The highest hardness of 1085.2 HV, the highest theoretical density of 94.7% and the lowest wear rate of 0.00397 mm³/m were obtained at a milling speed of 720 rpm, a compaction pressure of 794.4 MPa and sintering at 1400 °C in an argon atmosphere. Compared to commercial SS316 and the fabricated SS316L, the composites had a 7.4 times and 11 times lower wear rate, respectively. However, the commercial 90WC-10Co showed a 2.2 times lower wear rate compared to the fabricated SS316L-Al2O3 composites, primarily due to the higher ceramic content of 90 wt.% in the reference WC-Co. Nevertheless, eliminating the relatively high porosity of about 5 vol.% using processes such as hot isostatic pressing (HIP) and hot pressing can be expected to lead to further substantial improvements in the composites' wear resistance.
Keywords: SS316L, Al2O3, powder metallurgy, wear characterization
Procedia PDF Downloads 304
1786 Futuristic Black Box Design Considerations and Global Networking for Real Time Monitoring of Flight Performance Parameters
Authors: K. Parandhama Gowd
Abstract:
The aim of this research paper is to conceptualize, discuss, analyze and propose alternate design methodologies for a futuristic black box for flight safety. The proposal also includes global networking concepts for real-time surveillance and monitoring of flight performance parameters, including GPS parameters. It is expected that this proposal will serve as a failsafe diagnostic tool for accident investigation and the location of debris in real time. In this paper, an attempt is made to improve the existing flight data recording techniques and the design considerations for a futuristic FDR, to overcome the trauma of not being able to locate the black box. Since modern-day communication and information technologies with large bandwidth are available, coupled with faster computer processing techniques, the attempt made in this paper to develop a failsafe recording technique is feasible. Furthermore, data fusion/data warehousing technologies are available for exploitation.
Keywords: flight data recorder (FDR), black box, diagnostic tool, global networking, cockpit voice and data recorder (CVDR), air traffic control (ATC), air traffic, telemetry, tracking and control centers (ATTTCC)
Procedia PDF Downloads 572
1785 Molecular Epidemiology of Egyptian Biomphalaria Snail: The Identification of Species, Diagnostic of the Parasite in Snails and Host Parasite Relationship
Authors: Hanaa M. Abu El Einin, Ahmed T. Sharaf El- Din
Abstract:
Biomphalaria snails play an integral role in the transmission of Schistosoma mansoni, the causative agent of human schistosomiasis. Two species of Biomphalaria were reported from Egypt, Biomphalaria alexandrina and Biomphalaria glabrata, and later on a hybrid of B. alexandrina and B. glabrata was reported in streams of the Nile Delta. All were known to be excellent hosts of S. mansoni. The host-parasite relationship can be viewed in terms of snail susceptibility and parasite infectivity. This study highlights the progress that has been made in using molecular approaches for the correct identification of the snail species participating in the transmission of schistosomiasis, the rapid diagnosis of infection, and the characterization of susceptible and resistant phenotypes. Snails were identified using molecular methods involving Randomly Amplified Polymorphic DNA (RAPD) PCR, Polymerase Chain Reaction–Restriction Fragment Length Polymorphism (PCR-RFLP) and species-specific PCR. The molecular approaches used to diagnose the parasite in snails from Egypt included a nested PCR assay targeting the small subunit (SSU) rRNA gene, and RAPD-PCR was used to study the susceptible and resistant phenotypes. The RAPD-PCR, PCR-RFLP and species-specific PCR techniques confirmed that there is no evidence for the presence of B. glabrata in Egypt: all Biomphalaria snails collected were identified as B. alexandrina, i.e., B. alexandrina is the common species, and there is no evidence of hybridization with B. glabrata. The adopted specific nested PCR assay revealed much higher sensitivity, enabling the detection of S. mansoni-infected snails as early as 3 days post infection. The nested PCR method for the detection of infected snails uses S. mansoni fructose-1,6-bisphosphate aldolase (SMALDO) primers; these primers are specific for S. mansoni and are not cross-reactive with other schistosomes or molluscan aldolases, and nested PCR for this gene is sensitive enough to detect a single cercaria. Analysis of genetic variation between B. alexandrina strains that are susceptible and resistant to Schistosoma infection using RAPD-PCR showed that 39.8% of the examined snails collected from the field were resistant, while 60.2% of these snails showed high infection rates. In conclusion, the genetics of the intermediate host plays an important role in the epidemiological control of schistosomiasis.
Keywords: biomphalaria, molecular differentiation, parasite detection, schistosomiasis
Procedia PDF Downloads 198
1784 Adaptive Swarm Balancing Algorithms for Rare-Event Prediction in Imbalanced Healthcare Data
Authors: Jinyan Li, Simon Fong, Raymond Wong, Mohammed Sabah, Fiaidhi Jinan
Abstract:
Clinical data analysis and forecasting have made great contributions to disease control, prevention and detection. However, such data usually suffer from highly unbalanced class distributions. In this paper, we target binary imbalanced datasets, where the positive samples make up only a small minority. We investigate two different meta-heuristic algorithms, particle swarm optimization and the bat-inspired algorithm, and combine both of them with the synthetic minority over-sampling technique (SMOTE) for processing the datasets. One approach is to process the full dataset as a whole. The other is to split up the dataset and adaptively process it one segment at a time. The experimental results reveal that while the performance improvements obtained by the former method do not scale to larger data sizes, the latter, which we call Adaptive Swarm Balancing Algorithms, leads to significant efficiency and effectiveness improvements on large datasets. We also find it more consistent with the practice of handling typical large imbalanced medical datasets. We further use the meta-heuristic algorithms to optimize two key parameters of SMOTE, leading to more credible classifier performance and a shorter running time compared with the brute-force method.
Keywords: imbalanced dataset, meta-heuristic algorithm, SMOTE, big data
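The sketch below illustrates, in Python, the idea of tuning SMOTE's two key parameters against a downstream classifier; a plain random search and a synthetic dataset stand in for the swarm (PSO/bat) optimizer and the clinical data, and the classifier and AUC metric are assumptions rather than the authors' exact setup.

```python
import numpy as np
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a rare-event clinical dataset (about 5% positive cases)
X, y = make_classification(n_samples=5000, n_features=20, weights=[0.95, 0.05], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

def evaluate(ratio, k):
    """Oversample the minority class with the given SMOTE parameters, then score a classifier."""
    Xs, ys = SMOTE(sampling_strategy=ratio, k_neighbors=k, random_state=0).fit_resample(X_tr, y_tr)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xs, ys)
    return roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])

# Plain random search over the two SMOTE parameters, standing in for the PSO / bat optimizer
rng = np.random.default_rng(0)
candidates = zip(rng.uniform(0.2, 1.0, 12), rng.integers(3, 11, 12))
best = max(((evaluate(ratio, int(k)), ratio, int(k)) for ratio, k in candidates), key=lambda c: c[0])
print(f"best AUC = {best[0]:.3f} at sampling_strategy = {best[1]:.2f}, k_neighbors = {best[2]}")
```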
Procedia PDF Downloads 441
1783 The Reflexive Interaction in Group Formal Practices: The Question of Criteria and Instruments for the Character-Skills Evaluation
Authors: Sara Nosari
Abstract:
In the research field on adult education, the learning development project followed different itineraries: recently it has promoted adult transformation by practices focused on the reflexive oriented interaction. This perspective, that connects life stories and life-based methods, characterizes a transformative space between formal and informal education. Within this framework, in the Nursing Degree Courses of Turin University, it has been discussed and realized a formal reflexive path on the care work professional identity through group practices. This path compared the future care professionals with possible experiences staged by texts used with the function of a pre-tests: these texts, setting up real or believable professional situations, had the task to start a reflection on the different 'elements' of care work professional life (relationship, educational character of relationship, relationship between different care roles; or even human identity, aims and ultimate aim of care, …). The learning transformative aspect of this kind of experience-test is that it is impossible to anticipate the process or the conclusion of reflexion because they depend on two main conditions: the personal sensitivity and the specific situation. The narrated experience is not a device, it does not include any tricks to understand the answering advance; the text is not aimed at deepening the knowledge, but at being an active and creative force which takes the group to compare with problematic figures. In fact, the experience-text does not have the purpose to explain but to problematize: it creates a space of suspension to live for questioning, for discussing, for researching, for deciding. It creates a space 'open' and 'in connection' where each one, in comparing with others, has the possibility to build his/her position. In this space, everyone has to possibility to expose his/her own argumentations and to be aware of the others emerged points of view, aiming to research and find the own personal position. However, to define his/her position, it is necessary to learn to exercise character skills (conscientiousness, motivation, creativity, critical thinking, …): if these not-cognitive skills have an undisputed evidence, less evident is how to value them. The paper will reflect on the epistemological limits and possibility to 'measure' character skills, suggesting some evaluation criteria.Keywords: transformative learning, educational role, formal/informal education, character-skills
Procedia PDF Downloads 193
1782 Factors Influencing the Logistics Services Providers' Performance: A Literature Overview
Authors: A. Aguezzoul
Abstract:
The selection and performance of Logistics Services Providers (LSPs) is a strategic decision that affects the overall performance of any company as well as its supply chain. It is a complex process, which takes into account various conflicting quantitative and qualitative factors, as well as the outsourced logistics activities. This article focuses on the evolution of the weights associated with these factors over recent years in order to better understand the change in the importance that logistics professionals place on these criteria when choosing their LSPs. For that, an analysis of 17 main studies published during the 2014-2017 period was carried out, and the results are compared to those of a previous literature review on this subject. Our analysis allowed us to deduce the following observations: 1) LSP selection is a multi-criteria process; 2) the majority of studies are empirical in character, conducted particularly in Asian countries; 3) the importance of the criteria has undergone significant changes following the emergence of information technologies that have favored close collaboration and partnership between LSPs and their customers, even on a worldwide scale; 4) the cost criterion is relatively less important than in the past; and finally 5) with the development of sustainable supply chains, the factors associated with the logistics activities of returns and waste processing (reverse logistics) are becoming increasingly important in this multi-criteria process of selecting LSPs and evaluating their performance.
Keywords: logistics outsourcing, logistics providers, multi-criteria decision making, performance
Procedia PDF Downloads 154
1781 Analysis of Airborne Data Using Range Migration Algorithm for the Spotlight Mode of Synthetic Aperture Radar
Authors: Peter Joseph Basil Morris, Chhabi Nigam, S. Ramakrishnan, P. Radhakrishna
Abstract:
This paper presents the analysis of airborne Synthetic Aperture Radar (SAR) data using the Range Migration Algorithm (RMA) for the spotlight mode of operation. Unlike the polar format algorithm (PFA), space-variant defocusing and geometric distortion effects are mitigated in the RMA since it does not assume that the illuminating wave-fronts are planar. This facilitates the use of the RMA for imaging scenarios involving severe differential range curvatures, enabling the imaging of larger scenes at fine resolution and at shorter ranges with low center frequencies. The RMA for the spotlight mode of SAR is analyzed in this paper using airborne data. Pre-processing operations, viz. range de-skew and motion compensation to a line, are performed on the raw data before it is fed to the RMA component. The various stages of the RMA, viz. 2D matched filtering, along-track Fourier transform and Stolt interpolation, are analyzed to find the performance limits and the dependence of the resolution of the final image on the imaging geometry. The ability of the RMA to compensate for severe differential range curvatures in the two-dimensional spatial frequency domain is also illustrated in this paper.
Keywords: range migration algorithm, spotlight SAR, synthetic aperture radar, matched filtering, Stolt interpolation
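For orientation, a schematic Python (NumPy) skeleton of the RMA stages named above is sketched below; the ordering follows the common omega-k formulation, the reference range and wavenumber axes are assumed inputs, and this is not the processing chain applied to the airborne data set.

```python
import numpy as np

def range_migration(phase_history, kr, ku, r_ref=1000.0):
    """Schematic RMA for de-skewed, motion-compensated spotlight data.
    phase_history: 2-D array (along-track pulses x range-frequency bins);
    kr, ku: range and along-track wavenumber axes (assumed known, with |kr| > |ku|)."""
    # 1) Along-track Fourier transform into the (ku, kr) wavenumber domain
    data_k = np.fft.fftshift(np.fft.fft(phase_history, axis=0), axes=0)

    # 2) Matched filter referenced to the scene-centre range r_ref
    kx = np.sqrt(np.maximum(kr[None, :] ** 2 - ku[:, None] ** 2, 0.0))
    data_k *= np.exp(1j * r_ref * kx)

    # 3) Stolt interpolation: resample each pulse from its curved kx track onto a uniform kx grid
    kx_uniform = np.linspace(kx.min(), kx.max(), kr.size)
    stolt = np.zeros_like(data_k)
    for i in range(data_k.shape[0]):
        stolt[i] = (np.interp(kx_uniform, kx[i], data_k[i].real)
                    + 1j * np.interp(kx_uniform, kx[i], data_k[i].imag))

    # 4) 2-D inverse FFT back to the image domain
    return np.fft.ifft2(stolt)
```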
Procedia PDF Downloads 241
1780 Drug Susceptibility and Genotypic Assessment of Mycobacterial Isolates from Pulmonary Tuberculosis Patients in North East Ethiopia
Authors: Minwuyelet Maru, Solomon Habtemariam, Endalamaw Gadissa, Abraham Aseffa
Abstract:
Background: Tuberculosis is a major public health problem in Ethiopia. The burden of TB is aggravated by the emergence and expansion of drug-resistant tuberculosis, and different lineages of Mycobacterium tuberculosis (M. tuberculosis) have been reported in many parts of the country. Describing the strains of mycobacterial isolates and their drug susceptibility patterns is therefore necessary. Method: Sputum samples were collected from smear-positive pulmonary TB patients aged ≥ 7 years between October 1, 2012 and September 30, 2013, and mycobacterial strains were isolated on Löwenstein-Jensen (LJ) media. Each strain was characterized by deletion typing and spoligotyping. Drug sensitivity testing was performed with the indirect proportion method using Middlebrook 7H10 media, and associations with possible risk factors for drug resistance were examined. Result: A total of 144 smear-positive pulmonary tuberculosis patients were enrolled. The age of participants ranged from 7 to 78, with a mean age of 29.22 (±10.77) years. In this study, 82.2% (n=97) of the isolates were sensitive to the four first-line anti-tuberculosis drugs, and resistance to any of the four drugs tested was 17.8% (n=21). The highest frequency of resistance was observed for isoniazid, 13.6% (n=16), followed by streptomycin, 11.8% (n=14). No significant association of isoniazid resistance with HIV, sex or history of previous TB treatment was observed, but there was a significant association with age, being highest between 31 and 35 years of age (p=0.01). The majority of participants, 88.9% (n=128), were new cases, and only 11.1% (n=16) had a history of previous TB treatment. No MDR-TB was isolated from new cases, while 2 MDR-TB isolates (13.3%) were obtained from re-treatment cases, which was significantly associated with previous TB treatment (p<0.01). Thirty-two different spoligotype patterns were identified, and 74.1% of isolates were grouped into 13 clusters. The dominant strains were SIT 25, 18.1% (n=21), SIT 53, 17.2% (n=20) and SIT 149, 8.6% (n=10). Lineage 4 was the predominant lineage, followed by lineage 3 and lineage 7, comprising 65.5% (n=76), 28.4% (n=33) and 6% (n=7), respectively. The majority of strains from lineages 3 and 4 were SIT 25 (63.6%) and SIT 53 (26.3%), whereas SIT 343 was the dominant strain from lineage 7 (71.4%). Conclusion: The wide spread of the modern lineages 3 and 4 and the high number of strain clusters indicate high ongoing transmission. The high proportion of resistance to any of the first-line anti-tuberculosis drugs may be a potential source of the emergence of MDR-TB. The wide spread of SIT 25 and SIT 53, which have a tendency to transmit easily, and the presence of higher isoniazid resistance in the working and mobile age group of 31-35 years may increase the risk of transmission of drug-resistant strains.
Keywords: tuberculosis, drug susceptibility, strain diversity, lineage, Ethiopia, spoligotyping
Procedia PDF Downloads 375
1779 Decision-Making in Higher Education: Case Studies Demonstrating the Value of Institutional Effectiveness Tools
Authors: Carolinda Douglass
Abstract:
Institutional Effectiveness (IE) is the purposeful integration of functions that foster student success and support institutional performance. IE is growing rapidly within higher education as it is increasingly viewed by higher education administrators as a beneficial approach for promoting data-informed decision-making in campus-wide strategic planning and execution of strategic initiatives. Specific IE tools, including, but not limited to, project management; impactful collaboration and communication; commitment to continuous quality improvement; and accountability through rigorous evaluation; are gaining momentum under the auspices of IE. This research utilizes a case study approach to examine the use of these IE tools, highlight successes of this use, and identify areas for improvement in the implementation of IE tools within higher education. The research includes three case studies: (1) improving upon academic program review processes including the assessment of student learning outcomes as a core component of program quality; (2) revising an institutional vision, mission, and core values; and (3) successfully navigating an institution-wide re-accreditation process. Several methods of data collection are embedded within the case studies, including surveys, focus groups, interviews, and document analyses. Subjects of these methods include higher education administrators, faculty, and staff. Key findings from the research include areas of success and areas for improvement in the use of IE tools associated with specific case studies as well as aggregated results across case studies. For example, the use of case management proved useful in all of the case studies, while rigorous evaluation did not uniformly provide the value-added that was expected by higher education decision-makers. The use of multiple IE tools was shown to be consistently useful in decision-making when applied with appropriate awareness of and sensitivity to core institutional culture (for example, institutional mission, local environments and communities, disciplinary distinctions, and labor relations). As IE gains a stronger foothold in higher education, leaders in higher education can make judicious use of IE tools to promote better decision-making and secure improved outcomes of strategic planning and the execution of strategic initiatives.Keywords: accreditation, data-informed decision-making, higher education management, institutional effectiveness tools, institutional mission, program review, strategic planning
Procedia PDF Downloads 116
1778 Neuroblastoma in Children and the Potential Involvement of Viruses in Its Pathogenesis
Authors: Ugo Rovigatti
Abstract:
Neuroblastoma (NBL) has epitomized for at least 40 years our understanding of cancer cellular and molecular biology and its potential applications to novel therapeutic strategies. This includes the discovery of the very first oncogene aberrations and tumorigenesis suppression by differentiation in the 80s; the potential role of suppressor genes in the 90s; the relevance of immunotherapy in the millennium first, and the discovery of additional mutations by NGS technology in the millennium second decade. Similar discoveries were achieved in the majority of human cancers, and similar therapeutic interventions were obtained subsequently to NBL discoveries. Unfortunately, targeted therapies suggested by specific mutations (such as MYCN amplification –MNA- present in ¼ or 1/5 of cases) have not elicited therapeutic successes in aggressive NBL, where the prognosis is still dismal. The reasons appear to be linked to Tumor Heterogeneity, which is particularly evident in NBL but also a clear hallmark of aggressive human cancers generally. The new avenue of cancer immunotherapy (CIT) provided new hopes for cancer patients, but we still ignore the cellular or molecular targets. CIT is emblematic of high-risk disease (HR-NBL) since the mentioned GD2 passive immunotherapy is still providing better survival. We recently critically reviewed and evaluated the literature depicting the genomic landscapes of HR-NBL, coming to the qualified conclusion that among hundreds of affected genes, potential targets, or chromosomal sites, none correlated with anti-GD2 sensitivity. A better explanation is provided by the Micro-Foci inducing Virus (MFV) model, which predicts that neuroblasts infection with the MFV, an RNA virus isolated from a cancer-cluster (space-time association) of HR-NBL cases, elicits the appearance of MNA and additional genomic aberrations with mechanisms resembling chromothripsis. Neuroblasts infected with low titers of MFV amplified MYCN up to 100 folds and became highly transformed and malignant, thus causing neuroblastoma in young rat pups of strains SD and Fisher-344 and larger tumor masses in nu/nu mice. An association was discovered with GD2 since this glycosphingolipid is also the receptor for the family of MFV virus (dsRNA viruses). It is concluded that a dsRNA virus, MFV, appears to provide better explicatory mechanisms for the genesis of i) specific genomic aberrations such as MNA; ii) extensive tumor heterogeneity and chromothripsis; iii) the effects of passive immunotherapy with anti-GD2 monoclonals and that this and similar models should be further investigated in both pediatric and adult cancers.Keywords: neuroblastoma, MYCN, amplification, viruses, GD2
Procedia PDF Downloads 100
1777 Optimization of Temperature Coefficients for MEMS Based Piezoresistive Pressure Sensor
Authors: Vijay Kumar, Jaspreet Singh, Manoj Wadhwa
Abstract:
Piezo-resistive pressure sensors were among the first microelectromechanical system (MEMS) devices developed, and they still display significant growth prompted by advancements in micromachining techniques and material technology. In MEMS-based piezo-resistive pressure sensors, temperature can be considered the main environmental condition affecting system performance. The study of the thermal behavior of these sensors is essential to define the parameters that cause the output characteristics to drift. In this work, a study on the effects of temperature and doping concentration in a boron-implanted piezoresistor for a silicon-based pressure sensor is discussed. We have optimized the temperature coefficient of resistance (TCR) and temperature coefficient of sensitivity (TCS) values to determine the effect of temperature drift on the sensor performance. To be more precise, in order to reduce the temperature drift, a high doping concentration is needed. It is well known that the Wheatstone bridge in a pressure sensor is supplied with either a constant voltage or a constant current input supply. With a constant voltage supply, the thermal drift can be compensated with an external compensation circuit, whereas with a constant current supply the thermal drift can be directly compensated by the bridge itself. However, it would be beneficial to also compensate the temperature coefficients of the piezoresistors so as to further reduce the temperature drift. Thus, with a current supply, the TCS is dependent on both TCπ and TCR. As TCπ is a negative quantity and TCR is a positive quantity, it is possible to choose an appropriate doping concentration at which the two cancel each other. An exact cancellation of the TCR and TCπ values is not readily attainable; therefore, an adjustable approach is generally used in practical applications. Thus, one goal of this work has been to better understand the origin of temperature drift in pressure sensor devices so that the temperature effects can be minimized or eliminated. This paper describes the optimum doping levels for the piezoresistors at which the TCS of the pressure transducers will be zero due to the cancellation of the TCR and TCπ values. The fabrication and characterization of the pressure sensor are also carried out. The optimized TCR value obtained for the fabricated die is 2300 ± 100 ppm/°C, for which the piezoresistors are implanted at a doping concentration of 5E13 ions/cm³, and a TCS value of -2100 ppm/°C is achieved. Therefore, the desired TCR and TCS values are achieved; since they are approximately equal in magnitude and opposite in sign, the thermal effects are considerably reduced. Finally, we have calculated the effect of temperature and doping concentration on the output characteristics of the sensor. This study allows us to predict the sensor behavior against temperature and to minimize this effect by optimizing the doping concentration.
Keywords: piezo-resistive, pressure sensor, doping concentration, TCR, TCS
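The arithmetic behind the cancellation argument can be checked with the values quoted above (a minimal sketch; treating the reported TCS as the sensitivity coefficient that simply adds to TCR under a constant-current supply is an assumption):

```python
# Constant-current bridge: the net output drift is roughly the sum of the positive TCR
# and the negative sensitivity coefficient, so equal magnitudes of opposite sign cancel.
tcr = 2300e-6    # +2300 ppm/degC, reported for the fabricated die
tcs = -2100e-6   # -2100 ppm/degC, reported for the fabricated die
print(f"residual output drift ~ {(tcr + tcs) * 1e6:+.0f} ppm/degC")  # ~ +200 ppm/degC
```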
Procedia PDF Downloads 181
1776 Nagabhasma Preparation and Its Effect on Kidneys: A Histopathological Study
Authors: Lydia Andrade, Kumar M. R. Bhat
Abstract:
Heavy metals, especially lead, are considered to be multi-organ toxicants. However, such heavy metals are used in the preparation of traditional medicines. Nagabhasma is one such traditional medicine, and lead is the metal used in its preparation. Lead is converted into a health-beneficial organometallic compound when subjected to various traditional methods of purification. Therefore, this study was designed to evaluate the effect of such processed lead, at the various stages of traditionally prepared Nagabhasma, on the histological structure of the kidneys. Using human-equivalent doses of Nagabhasma, the various stages of its preparation were fed orally to rats for 30 days and 60 days (short term and long term). The treated and untreated rats were then sacrificed for the collection of the kidneys. The kidneys were processed for histopathological study. The results show severe changes in the histological structure of the kidneys. The animals treated with lead acetate showed changes in the epithelial cells lining Bowman's capsule. The proximal and distal convoluted tubules were dilated, leading to atrophy of their epithelial cells. The amount of inflammatory infiltrate was greater in this group. A few groups also showed pockets of inter-tubular hemorrhage. These changes, however, were minimized as the preparation progressed from stage 1 to stage 4 of Nagabhasma. Therefore, it is necessary to stringently monitor the processing of lead acetate during the preparation of Nagabhasma.
Keywords: heavy metals, kidneys, lead acetate, Nagabhasma
Procedia PDF Downloads 146
1775 3D Human Reconstruction over Cloud Based Image Data via AI and Machine Learning
Authors: Kaushik Sathupadi, Sandesh Achar
Abstract:
Human action recognition modeling is a critical task in machine learning. These systems require better techniques for recognizing body parts and selecting optimal features based on vision sensors to identify complex action patterns efficiently. Still, there are considerable gaps and challenges between images and videos, such as brightness, motion variation, and random clutter. This paper proposes a robust approach for classifying human actions over cloud-based image data. First, we apply pre-processing and detection, namely human and outer-shape detection techniques. Next, we extract valuable information in terms of cues. We extract two distinct features: fuzzy local binary patterns and a sequence representation. Then, we apply a greedy randomized adaptive search procedure for data optimization and dimension reduction, and for classification, we use a random forest. We tested our model on two benchmark datasets, the AAMAZ and the KTH multi-view football datasets. Our HMR framework significantly outperforms the other state-of-the-art approaches and achieves better recognition rates of 91% and 89.6% on the AAMAZ and KTH multi-view football datasets, respectively.
Keywords: computer vision, human motion analysis, random forest, machine learning
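A compact Python sketch of the feature-selection-plus-random-forest end of such a pipeline is given below; the synthetic feature matrix stands in for the fuzzy-LBP and sequence descriptors, and a simple univariate filter stands in for the greedy randomized adaptive search step, so none of the dimensions or scores correspond to the reported results.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Synthetic stand-in: one row per clip, columns = concatenated fuzzy-LBP histogram
# and sequence-representation features (dimensions and class count are hypothetical)
rng = np.random.default_rng(0)
X = rng.random((400, 120))
y = rng.integers(0, 6, 400)

# A univariate filter stands in for the greedy randomized adaptive search (dimension reduction),
# followed by the random forest classifier used for the final decision.
model = make_pipeline(SelectKBest(f_classif, k=40),
                      RandomForestClassifier(n_estimators=200, random_state=0))
scores = cross_val_score(model, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```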
Procedia PDF Downloads 36
1774 Automatic Generating CNC-Code for Milling Machine
Authors: Chalakorn Chitsaart, Suchada Rianmora, Mann Rattana-Areeyagon, Wutichai Namjaiprasert
Abstract:
G-code is the principal means by which a computer numerical control (CNC) machine's tool paths are controlled and the profile of the object's features is generated. To obtain high accuracy of the surface finish, non-stop operation of the CNC machine is required. Recently, a product design strategy has been introduced that favors changes with a low impact on the business and low resource consumption. The cost and time of designing minor changes can be reduced, since the traditional geometric details of the existing models are reused. In order to support this strategy as an alternative channel for machining operations, this research proposes the automatic generation of codes for CNC milling operations. Using this technique can assist the manufacturer in easily changing the size and the geometric shape of the product during the operation, while the time spent setting up or processing the machine is reduced. The algorithm, implemented on the MATLAB platform, is developed by analyzing and evaluating the geometric information of the part. Codes are created rapidly to control the operations of the machine. Compared to the codes obtained from CAM software, the developed algorithm can quickly generate and simulate the cutting profile of the part.
Keywords: geometric shapes, milling operation, minor changes, CNC Machine, G-code, cutting parameters
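To make the idea of programmatic G-code generation concrete, here is a small Python sketch (the work above is implemented in MATLAB; this stand-alone example, its pocket geometry and its feed/speed values are all illustrative assumptions) that emits a simple zig-zag roughing program for a rectangular pocket:

```python
def rectangular_pocket_gcode(width, height, depth, tool_diameter,
                             step_down=1.0, feed=300, spindle=1200, safe_z=5.0):
    """Emit a simplified zig-zag G-code program for a rectangular pocket (dimensions in mm).
    No lead-in moves, finishing pass or cutter compensation: purely illustrative."""
    step_over = 0.6 * tool_diameter
    lines = ["G21 G90", f"S{spindle} M03", f"G00 Z{safe_z:.3f}"]
    z = 0.0
    while z > -depth:
        z = max(z - step_down, -depth)
        lines += ["G00 X0.000 Y0.000", f"G01 Z{z:.3f} F{feed // 2}"]
        y, direction = 0.0, 1
        while y <= height:
            x_end = width if direction > 0 else 0.0
            lines.append(f"G01 X{x_end:.3f} Y{y:.3f} F{feed}")      # cut across the pocket
            y += step_over
            if y <= height:
                lines.append(f"G01 X{x_end:.3f} Y{y:.3f} F{feed}")  # step over to the next pass
            direction *= -1
        lines.append(f"G00 Z{safe_z:.3f}")
    lines += ["M05", "M30"]
    return "\n".join(lines)

print(rectangular_pocket_gcode(width=40, height=20, depth=3, tool_diameter=6))
```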
Procedia PDF Downloads 349
1773 Information Theoretic Approach for Beamforming in Wireless Communications
Authors: Syed Khurram Mahmud, Athar Naveed, Shoaib Arif
Abstract:
Beamforming is a signal processing technique extensively utilized in wireless communications and radar for intensifying the desired signal and minimizing interference through spatial selectivity. In this paper, we present a method for calculating optimal weight vectors for a smart antenna array, to achieve a directive pattern during transmission and selective reception in an interference-prone environment. In the proposed scheme, Mutual Information (MI) extrema are evaluated through an energy-constrained objective function, which is based on a priori information about the interference source and the desired array factor. Signal to Interference plus Noise Ratio (SINR) performance is evaluated for both transmission and reception. In our scheme, MI is presented as an index to identify the trade-off between information gain, SINR, illumination time and spatial selectivity in an energy-constrained optimization problem. The employed method yields lower computational complexity, which is demonstrated through a comparative analysis with conventional methods in use. MI-based beamforming offers enhanced signal integrity in degraded environments while reducing computational intricacy and correlating key performance indicators.
Keywords: beamforming, interference, mutual information, wireless communications
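For context, the Python sketch below computes the classical SINR-maximising (MVDR-type) weights for a uniform linear array with one known interferer; this closed-form baseline is not the MI-based, energy-constrained optimisation proposed above, and the array geometry, powers and angles are assumed values.

```python
import numpy as np

def steering_vector(n_elements, spacing_wavelengths, theta):
    """Steering vector of a uniform linear array; theta in radians from broadside."""
    k = np.arange(n_elements)
    return np.exp(-2j * np.pi * spacing_wavelengths * k * np.sin(theta))

def sinr_max_weights(a_desired, a_interf, noise_power=1.0, interf_power=10.0):
    """Classical SINR-maximising (MVDR-type) weights: w ~ R_in^{-1} a_desired,
    normalised for unit gain towards the desired direction."""
    n = a_desired.size
    r_in = interf_power * np.outer(a_interf, a_interf.conj()) + noise_power * np.eye(n)
    w = np.linalg.solve(r_in, a_desired)
    return w / (a_desired.conj() @ w)

a_s = steering_vector(8, 0.5, np.deg2rad(0.0))    # desired look direction (assumed)
a_i = steering_vector(8, 0.5, np.deg2rad(30.0))   # a priori interference direction (assumed)
w = sinr_max_weights(a_s, a_i)
gain = lambda theta: abs(w.conj() @ steering_vector(8, 0.5, theta))
print(f"gain towards desired: {gain(np.deg2rad(0.0)):.2f}, towards interferer: {gain(np.deg2rad(30.0)):.4f}")
```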
Procedia PDF Downloads 280
1772 Predicting Open Chromatin Regions in Cell-Free DNA Whole Genome Sequencing Data by Correlation Clustering
Authors: Fahimeh Palizban, Farshad Noravesh, Amir Hossein Saeidian, Mahya Mehrmohamadi
Abstract:
In the past decade, the emergence of liquid biopsy has significantly improved cancer monitoring and detection. Dying cells, including those originating from tumors, shed their DNA into the blood and contribute to a pool of circulating fragments called cell-free DNA. Accordingly, identifying the tissue origin of these DNA fragments from the plasma can result in faster and more accurate disease diagnosis and more precise treatment protocols. Open chromatin regions are important epigenetic features of DNA that reflect the cell type of origin. Profiling these features by DNase-seq, ATAC-seq, and histone ChIP-seq provides insights into tissue-specific and disease-specific regulatory mechanisms. There have been several studies in the area of cancer liquid biopsy that integrate distinct genomic and epigenomic features for early cancer detection along with tissue-of-origin detection. However, multimodal analysis requires several types of experiments to cover the genomic and epigenomic aspects of a single sample, which leads to a large cost in time and money. To overcome these limitations, the idea of predicting open chromatin regions (OCRs) from whole genome sequencing (WGS) data is of particular importance. In this regard, we propose a computational approach for predicting OCRs, as an important epigenetic feature, from cell-free DNA whole genome sequencing data. To fulfill this objective, the local sequencing depth is fed to the proposed algorithm, and the most probable open chromatin regions are predicted from the whole genome sequencing data. Our method integrates a signal processing approach with sequencing depth data and includes count normalization, Discrete Fourier Transform conversion, graph construction, graph cut optimization by linear programming, and clustering. To validate the proposed method, we compared the output of the clustering (open chromatin region+, open chromatin region-) with previously validated open chromatin regions related to human blood samples in the ATAC-DB database. The percentage of overlap between the predicted open chromatin regions and the experimentally validated regions obtained by ATAC-seq in ATAC-DB is greater than 67%, which indicates a meaningful prediction. As is well established, OCRs are mostly located at the transcription start sites (TSS) of genes. In this regard, we compared the concordance between the predicted OCRs and the human gene TSS regions obtained from refTSS, which showed agreement of around 52.04% with all genes and ~78% with the housekeeping genes. Accurately detecting open chromatin regions from plasma cell-free DNA-seq data is a very challenging computational problem due to the existence of several confounding factors, such as technical and biological variations. Although this approach is in its infancy, there has already been an attempt to apply it, leading to a tool named OCRDetector, with some restrictions such as the need for high-depth cfDNA WGS data, prior information about the OCR distribution, and the consideration of multiple features. In contrast, we implemented a graph signal clustering approach based on a single depth feature in an unsupervised learning manner, which resulted in faster performance and decent accuracy. Overall, we have investigated the epigenomic pattern of a cell-free DNA sample from a new computational perspective that can be used along with other tools to investigate the genetic and epigenetic aspects of a single whole genome sequencing dataset for efficient liquid biopsy-related analysis.
Keywords: open chromatin regions, cancer, cell-free DNA, epigenomics, graph signal processing, correlation clustering
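A toy Python version of the depth-to-clusters pipeline is sketched below; spectral clustering over a correlation affinity stands in for the linear-programming graph cut described above, and the synthetic depth track, window size and "OCR-like" periodic signal are purely illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import SpectralClustering

def classify_windows(depth, window=200):
    """Bin per-base depth into windows, normalise, take DFT-magnitude features,
    build a correlation graph between windows and split it into two clusters
    (candidate OCR+ / OCR- groups). Spectral clustering stands in for the
    linear-programming graph cut used in the actual pipeline."""
    n = (depth.size // window) * window
    w = depth[:n].reshape(-1, window)
    w = (w - w.mean(axis=1, keepdims=True)) / (w.std(axis=1, keepdims=True) + 1e-9)
    spectra = np.abs(np.fft.rfft(w, axis=1))             # frequency-domain features
    affinity = np.clip(np.corrcoef(spectra), 0.0, 1.0)   # correlation graph, negative edges dropped
    return SpectralClustering(n_clusters=2, affinity="precomputed",
                              random_state=0).fit_predict(affinity)

# Synthetic depth track: a periodic, nucleosome-like oscillation marks the "open" stretch
rng = np.random.default_rng(0)
depth = rng.poisson(30, 20000).astype(float)
depth[5000:7000] += 10 * np.sin(np.arange(2000) * 2 * np.pi / 190)
print(np.bincount(classify_windows(depth)))              # window counts per cluster
```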
Procedia PDF Downloads 150
1771 Non-Contact Measurement of Soil Deformation in a Cyclic Triaxial Test
Authors: Erica Elice Uy, Toshihiro Noda, Kentaro Nakai, Jonathan Dungca
Abstract:
Deformation in a conventional cyclic triaxial test is normally measured by using point-wise measuring devices. In this study, a non-contact measurement technique was applied in order to monitor and measure the occurrence of non-homogeneous behavior of the soil under cyclic loading. Non-contact measurement is executed through image processing. Two-dimensional measurements were performed using the Lucas-Kanade optical flow algorithm, implemented in LabVIEW. In this technique, the non-homogeneous deformation was monitored using a mirrorless camera. A mirrorless camera was used because it is economical and has the capacity to take pictures at a fast rate. The camera was first calibrated to remove the distortion introduced by the lens and by the testing environment. Calibration was divided into two phases. The first phase was the calibration of the camera parameters and of the distortion caused by the lens. The second phase was for eliminating the distortion introduced by the triaxial plexiglass; a correction factor was established from this phase. A series of consolidated undrained cyclic triaxial tests was performed using a coarse soil. The results from the non-contact measurement technique were compared to the deformation measured by the linear variable displacement transducer. It was observed that deformation was higher in the area where failure occurred.
Keywords: cyclic loading, non-contact measurement, non-homogeneous, optical flow
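A minimal Python (OpenCV) sketch of the Lucas-Kanade tracking step is shown below; the study's implementation is in LabVIEW, so this is only an illustrative equivalent, and the synthetic shifted-texture check, feature counts and window sizes are assumptions.

```python
import cv2
import numpy as np

def track_displacements(frame_a, frame_b, max_corners=200):
    """Track texture points between two grayscale frames of the specimen with
    pyramidal Lucas-Kanade optical flow; returns per-point displacements in pixels.
    A pixel-to-millimetre factor from the calibration step would convert these to strains."""
    pts = cv2.goodFeaturesToTrack(frame_a, maxCorners=max_corners,
                                  qualityLevel=0.01, minDistance=7)
    new_pts, status, _ = cv2.calcOpticalFlowPyrLK(frame_a, frame_b, pts, None,
                                                  winSize=(21, 21), maxLevel=3)
    good = status.ravel() == 1
    return (new_pts[good] - pts[good]).reshape(-1, 2)

# Synthetic check: a blurred random texture shifted by (2, 3) pixels
rng = np.random.default_rng(0)
img = cv2.GaussianBlur((rng.random((240, 320)) * 255).astype(np.uint8), (5, 5), 0)
shifted = np.roll(np.roll(img, 3, axis=0), 2, axis=1)
print(track_displacements(img, shifted).mean(axis=0))    # approximately [2, 3]
```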
Procedia PDF Downloads 301
1770 Stabilizing Additively Manufactured Superalloys at High Temperatures
Authors: Keivan Davami, Michael Munther, Lloyd Hackel
Abstract:
The control of properties and material behavior by implementing thermal-mechanical processes is based on mechanical deformation and annealing according to a precise schedule that will produce a unique and stable combination of grain structure, dislocation substructure, texture, and dispersion of precipitated phases. The authors recently developed a thermal-mechanical technique to stabilize the microstructure of additively manufactured nickel-based superalloys even after exposure to high temperatures. However, the mechanism(s) that controls this stability is still under investigation. Laser peening (LP), also called laser shock peening (LSP), is a shock-based (50 ns duration) post-processing technique used for extending performance levels and improving the service life of critical components by developing deep levels of plastic deformation, thereby generating a high density of dislocations and inducing compressive residual stresses in the surface and deep subsurface of components. These compressive residual stresses are usually accompanied by an increase in hardness and enhance the material's resistance to surface-related failures such as creep, fatigue, contact damage, and stress corrosion cracking. While the LP process enhances the life span and durability of the material, the induced compressive residual stresses relax at high temperatures (>0.5Tm, where Tm is the absolute melting temperature), limiting the applicability of the technology. At temperatures above 0.5Tm, the compressive residual stresses relax, and the yield strength begins to drop dramatically. The principal reason is the increasing rate of solid-state diffusion, which affects both the dislocations and the microstructural barriers. Dislocation configurations commonly recover by mechanisms such as climb and recombination, which occur rapidly at high temperatures. Furthermore, precipitates coarsen and grains grow; virtually all of the available microstructural barriers become ineffective. Our results indicate that by using “cyclic” treatments with sequential LP and annealing steps, the compressive stresses survive and the microstructure is stable after exposure to temperatures exceeding 0.5Tm for a long period of time. When the laser peening process is combined with annealing, the dislocations formed as a result of LP and the precipitates formed during annealing have a complex interaction that provides further stability at high temperatures. From a scientific point of view, this research lays the groundwork for studying a variety of physical, materials science, and mechanical engineering concepts. This research could lead to metals operating at higher sustained temperatures, enabling improved system efficiencies. The strengthening of metals by a variety of means (alloying, work hardening, and other processes) has been of interest for a wide range of applications. However, the mechanistic understanding of the often complex interactions between dislocations, solute atoms, and precipitates during plastic deformation has largely remained scattered in the literature. In this research, the elucidation of the actual mechanisms involved in the novel cyclic LP/annealing processes is pursued through parallel studies of dislocation theory and the implementation of advanced experimental tools. The results of this research help with the validation of a novel laser processing technique for high-temperature applications. This will greatly expand the applications of the laser peening technology, which was originally devised only for temperatures lower than half of the melting temperature.
Keywords: laser shock peening, mechanical properties, indentation, high temperature stability
Procedia PDF Downloads 149
1769 The Evolution of the Human Brain from the Hind Brain to the Fore Brain: Dialectics from the African Perspective in Understanding Stunted Development in Science and Technology
Authors: Philemon Wokoma Iyagba, Obey Onenee Christie
Abstract:
From the hindbrain, which is responsible for motor activities, to the forebrain, responsible for processing information related to complex cognitive activities, the human brain has continued to evolve over the years. This evolution has been progressive, leading to advancements in science and technology. However, the development of science and technology in Africa, where ancient civilization arguably began, has been retrogressive. Dialectics was applied by dissecting different opinions on the reasons behind the stunted development of science and technology in Africa. The researchers propose that the inability to sustain the technological advancements made by early Africans is due to the poor replicability, or outright lack, of the African knowledge-based system; poor or non-existent documentation of adopted procedures; and an approval-seeking mentality that cheaply paved the way for westernization, which in turn led to the adulteration of the African way of life and education without making room for incorporating Africa's identity, properly aligning her rich cultural heritage within education, or recognizing her enormous achievements before and during the Middle Ages. This article discusses conceptual issues, with its positions based on established facts; the discussion is grounded in relevant literature, and recommendations are made accordingly.
Keywords: forebrain, hindbrain, dialectics from African perspective, development in science and technology
Procedia PDF Downloads 77
1768 Non-Targeted Adversarial Image Classification Attack-Region Modification Methods
Authors: Bandar Alahmadi, Lethia Jackson
Abstract:
Machine learning models are used today in many real-life applications. The safety and security of such models are important so that their results remain as accurate as possible. One challenge to machine learning model security is the adversarial example attack. Adversarial examples are designed by an attacker to cause a machine learning model to misclassify its input. We propose a method to generate adversarial examples that attack image classifiers. We modify successfully classified images so that a classifier misclassifies them after the modification. In our method, we do not update the whole image; instead, we detect the important region, modify it, place it back in the original image, and then run it through a classifier. The algorithm modifies the detected region using two methods. First, it adds an abstract image matrix behind the matrix of the detected region. Then, it performs a rotation attack, rotating the detected region around its axes and embedding the trace of the image in the image background. Finally, the attacked region is placed back in its original position, from which it was removed, and a smoothing filter is applied to blend the background with the foreground. We tested our method on a cascade classifier, and the algorithm is efficient: the classifier's confidence dropped to almost zero. We also tried it on a CNN (convolutional neural network) with higher settings, and the algorithm worked successfully.
Keywords: adversarial examples, attack, computer vision, image processing
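The region-rotation portion of the attack can be sketched with OpenCV as below; the Haar face cascade, the rotation angle and the Gaussian blending are stand-ins chosen for illustration, not the exact detector, transform or smoothing filter used by the authors.

```python
import cv2

def rotate_region_attack(image_bgr, angle=25):
    """Detect a salient region with a Haar cascade (frontal faces here, as a stand-in
    for the attacked object class), rotate only that region about its centre,
    paste it back and blur the seam into the background."""
    cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    attacked = image_bgr.copy()
    for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        roi = attacked[y:y + h, x:x + w]
        rotation = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)
        rotated = cv2.warpAffine(roi, rotation, (w, h), borderMode=cv2.BORDER_REFLECT)
        # Place the modified region back and smooth it into its surroundings
        attacked[y:y + h, x:x + w] = cv2.GaussianBlur(rotated, (5, 5), 0)
    return attacked

# Example usage (input file is a placeholder):
# attacked = rotate_region_attack(cv2.imread("input.jpg")); cv2.imwrite("attacked.jpg", attacked)
```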
Procedia PDF Downloads 339
1767 The Global-Local Dimension in Cognitive Control after Left Lateral Prefrontal Cortex Damage: Evidence from the Non-Verbal Domain
Authors: Eleni Peristeri, Georgia Fotiadou, Ianthi-Maria Tsimpli
Abstract:
The local-global dimension has been studied extensively in healthy controls, and a preference for globally processed stimuli has been validated in both the visual and auditory modalities. Critically, the local-global dimension has an inherent interference resolution component, a type of cognitive control, and left-prefrontal-cortex-damaged (LPFC) individuals have exhibited an inability to override habitual response behaviors in item recognition tasks that involve representational interference. Eight patients with damage in the left PFC (age range: 32;5 to 69;0; mean age: 54;6 years) and twenty age- and education-matched language-unimpaired adults (mean age: 56;7 years) participated in the study. Distinct performance patterns were found between the language-unimpaired and the LPFC-damaged groups, which mainly stemmed from the latter's difficulty with inhibiting global stimuli in incongruent trials. Overall, the local-global attentional dimension affects LPFC-damaged individuals with non-fluent aphasia in non-language domains, implicating distinct types of inhibitory processes depending on the level of processing.
Keywords: left lateral prefrontal cortex damage (LPFC), local-global non-language attention, representational interference, non-fluent aphasia
Procedia PDF Downloads 470
1766 Denoising of Motor Unit Action Potential Based on Tunable Band-Pass Filter
Authors: Khalida S. Rijab, Mohammed E. Safi, Ayad A. Ibrahim
Abstract:
When electrodes are mounted on the skin surface over a muscle, a signal is detected when the skeletal muscle undergoes contraction; this signal is known as the surface electromyographic (EMG) signal. This signal has a noise-like interference pattern resulting from the temporal and spatial summation of the action potentials (AP) of all active motor units (MU) near the detection electrodes. By appropriate processing (decomposition), the surface EMG signal may be used to give an estimate of the motor unit action potential (MUAP). In this work, a denoising technique is applied to the MUAP signals extracted from the spatial filter (IB2). A set of signals from a non-invasive two-dimensional grid of 16 electrodes, covering different subjects, muscles, and sexes, was recorded. These signals acquire noise during recording and detection. A digital fourth-order band-pass Butterworth filter is used for denoising, and a tunable pass band with different choices of cutoff frequencies is investigated, with the aim of obtaining a suitable band-pass range. Results show that an improvement of 1-3 dB in the signal-to-noise ratio (SNR) has been achieved, relative to the raw spatial filter output signals, for all cases under investigation. Furthermore, the research's goals also included the estimation and reconstruction of the mean shape of the MUAP.
Keywords: EMG, motor unit, digital filter, denoising
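A minimal Python (SciPy) sketch of a tunable fourth-order Butterworth band-pass stage, with an SNR check on a synthetic burst, is given below; the cutoff frequencies, sampling rate and the zero-phase filtfilt choice are illustrative assumptions rather than the recording parameters of the study.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_muap(signal, fs, low_hz=20.0, high_hz=450.0, order=4):
    """Zero-phase fourth-order Butterworth band-pass; the cutoffs here are illustrative,
    whereas in the study the pass band is tuned to maximise the SNR of the MUAP estimate."""
    b, a = butter(order, [low_hz / (fs / 2), high_hz / (fs / 2)], btype="bandpass")
    return filtfilt(b, a, signal)

# Synthetic check: a 120 Hz burst buried in broadband noise, sampled at 2 kHz
fs = 2000
t = np.arange(0, 1, 1 / fs)
clean = np.sin(2 * np.pi * 120 * t) * np.exp(-((t - 0.5) ** 2) / 0.002)
noisy = clean + 0.5 * np.random.default_rng(0).standard_normal(t.size)
snr = lambda x: 10 * np.log10(np.sum(clean ** 2) / np.sum((x - clean) ** 2))
print(f"SNR before: {snr(noisy):.1f} dB, after: {snr(bandpass_muap(noisy, fs)):.1f} dB")
```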
Procedia PDF Downloads 401
1765 Inflammatory Alleviation on Microglia Cells by an Apoptotic Mimicry
Authors: Yi-Feng Kao, Huey-Jine Chai, Chin-I Chang, Yi-Chen Chen, June-Ru Chen
Abstract:
Microglia are macrophage-like cells that reside in the brain, and overactive microglia may result in brain neuron damage or inflammation. In this study, phospholipids were extracted from squid skin and manufactured into liposomes (SQ liposomes) to mimic apoptotic bodies. We then evaluated the anti-inflammatory effects of SQ liposomes on the mouse microglial cell line BV-2 under lipopolysaccharide (LPS) induction. First, by HPLC-UV analysis, the major phospholipid constituents in the squid skin extract were 46.2% phosphatidylcholine, 18.4% phosphatidylethanolamine, 7.7% phosphatidylserine, 3.5% phosphatidylinositol, 4.9% lysophosphatidylcholine and 19.3% other phospholipids. The contents of eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA) in the squid skin extract were 11.8% and 28.7%, respectively. The microscopic images showed that microglial cells can engulf apoptotic cells or SQ liposomes. In cell-based studies, there was no cytotoxicity to BV-2 cells when the concentration of SQ liposomes was less than 2.5 mg/mL. The LPS-induced pro-inflammatory cytokines, including tumor necrosis factor-alpha (TNF-α) and interleukin-6 (IL-6), were significantly suppressed (P < 0.05) by pretreatment with 0.03–2.5 mg/mL SQ liposomes. Conversely, the secretion of the anti-inflammatory cytokines transforming growth factor-beta (TGF-β) and interleukin-10 (IL-10) was enhanced (P < 0.05). The results suggest that SQ liposomes possess anti-inflammatory properties in BV-2 cells and may be a good strategy against neuroinflammatory disease.
Keywords: apoptotic mimicry, neuroinflammation, microglia, squid processing by-products
Procedia PDF Downloads 483