Search results for: genotype identification
551 Developing Optical Sensors with Application of Cancer Detection by Elastic Light Scattering Spectroscopy
Authors: May Fadheel Estephan, Richard Perks
Abstract:
Context: Cancer is a serious health concern that affects millions of people worldwide. Early detection and treatment are essential for improving patient outcomes. However, current methods for cancer detection have limitations, such as low sensitivity and specificity. Research Aim: The aim of this study was to develop an optical sensor for cancer detection using elastic light scattering spectroscopy (ELSS). ELSS is a noninvasive optical technique that can be used to characterize the size and concentration of particles in a solution. Methodology: An optical probe was fabricated with a 100-μm-diameter core and a 132-μm centre-to-centre separation. The probe was used to measure the ELSS spectra of polystyrene spheres with diameters of 2, 0.8, and 0.413 μm. The spectra were then analysed to determine the size and concentration of the spheres. Findings: The results showed that the optical probe was able to differentiate between the three different sizes of polystyrene spheres. The probe was also able to detect the presence of polystyrene spheres at suspension concentrations as low as 0.01%. Theoretical Importance: The results of this study demonstrate the potential of ELSS for cancer detection. ELSS is a noninvasive technique that can be used to characterize the size and concentration of cells in a tissue sample. This information can be used to identify cancer cells and assess the stage of the disease. Data Collection: The data for this study were collected by measuring the ELSS spectra of polystyrene spheres with different diameters. The spectra were collected using a spectrometer and a computer. Analysis Procedures: The ELSS spectra were analysed using a software program to determine the size and concentration of the spheres. The software program used a mathematical algorithm to fit the spectra to a theoretical model. Question Addressed: The question addressed by this study was whether ELSS could be used to detect cancer cells. The results showed that ELSS could differentiate between particles of different sizes, suggesting that it could be used to detect cancer cells. Conclusion: The findings of this research demonstrate the utility of ELSS in the early identification of cancer. ELSS is a noninvasive method for characterizing the number and size of cells in a tissue sample. This information can be used to identify cancer cells and determine the stage of the disease. Further research is needed to evaluate the clinical performance of ELSS for cancer detection.
Keywords: elastic light scattering spectroscopy, polystyrene spheres in suspension, optical probe, fibre optics
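As an illustration of the spectrum-fitting step described above, the sketch below fits a measured elastic-scattering spectrum to a parametric model by least squares to recover sphere size and concentration. The scattering model, data and parameter values are invented placeholders (a real analysis would use full Mie theory); this is not the authors' algorithm.

```python
# Minimal sketch (not the authors' code): least-squares fit of an
# elastic-scattering spectrum to a simplified placeholder model
# (Rayleigh-like lambda^-4 scaling modulated by a size-dependent envelope).
import numpy as np
from scipy.optimize import curve_fit

def scattering_model(wavelength_nm, diameter_um, concentration):
    """Toy elastic-scattering intensity versus wavelength."""
    size_parameter = np.pi * diameter_um * 1e3 / wavelength_nm  # dimensionless
    form_factor = 1.0 / (1.0 + 0.25 * size_parameter**2)        # crude envelope
    return concentration * (wavelength_nm / 500.0) ** -4 * form_factor

# synthetic "measured" spectrum for a 0.8 um sphere suspension at 0.01 %
wavelengths = np.linspace(450, 750, 60)
measured = scattering_model(wavelengths, 0.8, 0.01)
measured += np.random.default_rng(0).normal(0, 1e-4, measured.size)

popt, _ = curve_fit(scattering_model, wavelengths, measured, p0=[1.0, 0.05])
print(f"estimated diameter ~{popt[0]:.2f} um, concentration ~{popt[1]:.3f} %")
```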
Procedia PDF Downloads 82
550 Visualization of Chinese Genealogies with Digital Technology: A Case of Genealogy of Wu Clan in the Village of Gaoqian
Authors: Huiling Feng, Jihong Liang, Xiaodong Gong, Yongjun Xu
Abstract:
Recording history is a tradition in ancient China. A record of a dynasty makes a dynastic history; a record of a locality makes a chorography, and a record of a clan makes a genealogy – the three combined together depicts a complete national history of China both macroscopically and microscopically, with genealogy serving as the foundation. Genealogy in ancient China traces back to a family tree or pedigrees in the early and medieval historical times. After Song Dynasty, the civilian society gradually emerged, and the Emperor had to allow people from the same clan to live together and hold the ancestor worship activities, thence compilation of genealogy became popular in the society. Since then, genealogies, regarded as important as ancestor and religious temples in a traditional villages even today, have played a primary role in identification of a clan and maintain local social order. Chinese genealogies are rich in their documentary materials. Take the Genealogy of Wu Clan in Gaoqian as an example. Gaoqian is a small village in Xianju County of Zhejiang Province. The Genealogy of Wu Clan in Gaoqian is composed of a whole set of materials from Foreword to Family Trees, Family Rules, Family Rituals, Family Graces and Glories, Ode to An ancestor’s Portrait, Manual for the Ancestor Temple, documents for great men in the clan, works written by learned men in the clan, the contracts concerning landed property, even notes on tombs and so on. Literally speaking, the genealogy, with detailed information from every aspect recorded in stylistic rules, is indeed the carrier of the entire culture of a clan. However, due to their scarcity in number and difficulties in reading, genealogies seldom fall into the horizons of common people. This paper, focusing on the case of the Genealogy of Wu Clan in the Village of Gaoqian, intends to reproduce a digital Genealogy by use of ICTs, through an in-depth interpretation of the literature and field investigation in Gaoqian Village. Based on this, the paper goes further to explore the general methods in transferring physical genealogies to digital ones and ways in visualizing the clanism culture embedded in the genealogies with a combination of digital technologies such as software in family trees, multimedia narratives, animation design, GIS application and e-book creators.Keywords: clanism culture, multimedia narratives, genealogy of Wu Clan, GIS
Procedia PDF Downloads 221
549 Evaluation of the CRISP-DM Business Understanding Step: An Approach for Assessing the Predictive Power of Regression versus Classification for the Quality Prediction of Hydraulic Test Results
Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter
Abstract:
Digitalisation in production technology is a driver for the application of machine learning methods. Through the application of predictive quality, the great potential for saving necessary quality control can be exploited through the data-based prediction of product quality and states. However, the serial use of machine learning applications is often prevented by various problems. Fluctuations occur in real production data sets, which are reflected in trends and systematic shifts over time. To counteract these problems, data preprocessing includes rule-based data cleaning, the application of dimensionality reduction techniques, and the identification of comparable data subsets to extract stable features. Successful process control of the target variables aims to centre the measured values around a mean and minimise variance. Competitive leaders claim to have mastered their processes. As a result, much of the real data has a relatively low variance. For the training of prediction models, the highest possible generalisability is required, which is at least made more difficult by this data availability. The implementation of a machine learning application can be interpreted as a production process. The CRoss Industry Standard Process for Data Mining (CRISP-DM) is a process model with six phases that describes the life cycle of data science. As in any process, the costs to eliminate errors increase significantly with each advancing process phase. For the quality prediction of hydraulic test steps of directional control valves, the question arises in the initial phase whether a regression or a classification is more suitable. In the context of this work, the initial phase of the CRISP-DM, the business understanding, is critically compared for the use case at Bosch Rexroth with regard to regression and classification. The use of cross-process production data along the value chain of hydraulic valves is a promising approach to predict the quality characteristics of workpieces. Suitable methods for leakage volume flow regression and classification for inspection decision are applied. Impressively, classification is clearly superior to regression and achieves promising accuracies.Keywords: classification, CRISP-DM, machine learning, predictive quality, regression
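To make the regression-versus-classification comparison concrete, here is a hedged sketch (synthetic data, invented variable names and leakage threshold, not the Bosch Rexroth pipeline) that frames the same quality target once as leakage-flow regression and once as an inspection pass/fail classification:

```python
# Sketch: compare a regression and a classification formulation of the same
# quality-prediction task, as weighed in the CRISP-DM business-understanding phase.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestRegressor, RandomForestClassifier
from sklearn.metrics import mean_absolute_error, accuracy_score

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 8))                      # process variables along the value chain
leakage = 2.0 + X[:, 0] - 0.5 * X[:, 3] + rng.normal(0, 0.2, 1000)  # leakage volume flow
threshold = 2.5                                     # hypothetical pass/fail limit
y_class = (leakage > threshold).astype(int)         # inspection decision: 1 = reject

X_tr, X_te, leak_tr, leak_te, cls_tr, cls_te = train_test_split(
    X, leakage, y_class, test_size=0.3, random_state=0)

reg = RandomForestRegressor(random_state=0).fit(X_tr, leak_tr)
clf = RandomForestClassifier(random_state=0).fit(X_tr, cls_tr)

print("regression MAE:", mean_absolute_error(leak_te, reg.predict(X_te)))
print("classification accuracy:", accuracy_score(cls_te, clf.predict(X_te)))
```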
Procedia PDF Downloads 144
548 Evaluation of the Trauma System in a District Hospital Setting in Ireland
Authors: Ahmeda Ali, Mary Codd, Susan Brundage
Abstract:
Importance: This research focuses on devising and improving Health Service Executive (HSE) policy and legislation and therefore improving patient trauma care and outcomes in Ireland. Objectives: The study measures components of the Trauma System in the district hospital setting of the Cavan/Monaghan Hospital Group (CMHG), HSE, Ireland, and uses the collected data to identify the strengths and weaknesses of the CMHG Trauma System organisation, including governance, injury data, prevention and quality improvement, scene care and facility-based care, and rehabilitation. The information will be made available to local policy makers to provide objective situational analysis to assist in future trauma service planning and service provision. Design, setting and participants: From 28 April to 28 May 2016, a cross-sectional survey using the World Health Organisation (WHO) Trauma System Assessment Tool (TSAT) was conducted among healthcare professionals directly involved in the level III trauma system of CMHG. Main outcomes: Identification of the strengths and weaknesses of the Trauma System of CMHG. Results: A high proportion of participants reported inadequate funding for pre-hospital (62.3%) and facility-based (52.5%) trauma care at CMHG. Thirty-four (55.7%) respondents reported that a national trauma registry (TARN) exists but that electronic health records are still not used in trauma care. Twenty-one respondents (34.4%) reported that there are system-wide protocols for determining patient destination and that adequate, comprehensive legislation governing the use of ambulances is enforced; however, there is a lack of a reliable advisory service. Over 40% of the respondents reported uncertainty about the injury prevention programmes available in Ireland, as well as about the government funding allocated for injury and violence prevention. Conclusions: The results of this study contributed to a comprehensive assessment of the trauma system organisation. The major findings of the study identified three fundamental areas: the inadequate funding at CMHG, the QI techniques and corrective strategies used, and the unfamiliarity with existing prevention strategies. The findings indicate the need for further research to guide future development of the trauma system at CMHG (and in Ireland as a whole) in order to maximise best practice and to improve functional and life outcomes.
Keywords: trauma, education, management, system
Procedia PDF Downloads 243
547 A Virtual Set-Up to Evaluate Augmented Reality Effect on Simulated Driving
Authors: Alicia Yanadira Nava Fuentes, Ilse Cervantes Camacho, Amadeo José Argüelles Cruz, Ana María Balboa Verduzco
Abstract:
Augmented reality promises to be present in future driving: its immersive technology can show directions and maps and identify important places with graphic elements when the car driver requires the information. On the other hand, driving is considered a multitasking activity and, for some people, a complex activity in which situations commonly occur that require the driver's immediate attention and decisions that help avoid accidents; therefore, the main aim of the project is to instrument a platform with biometric sensors that allows evaluating driving performance under the influence of augmented reality devices and detecting the drivers' level of attention, since it is important to know the effect that such devices produce. In this study, the physiological sensors EPOC X (EEG), ECG06 PRO and EMG Myoware are combined in the driving test platform with a Logitech G29 steering wheel and the simulation software City Car Driving, in which the level of traffic and the number of pedestrians within the simulation can be controlled, providing driver interaction in real mode; data acquisition and storage are achieved through an MSP430 microcontroller. The sensors produce a continuous analog signal in time that needs signal conditioning: a signal amplifier is incorporated because the acquired signals have a sensitivity range of 1.25 mm/mV, and filtering eliminates unwanted frequency bands so that the signal is interpretable and noise-free before it is converted from an analog to a digital signal for analysis of the drivers' physiological signals; these values are stored in a database. Based on this compilation, we work on the extraction of signal features and implement k-NN (k-nearest neighbour) classification methods and decision trees that enable the study of the data for pattern identification and determine, by classification, the different effects of augmented reality on drivers. The expected results of this project include a test platform instrumented with biometric sensors for data acquisition during driving and a database with the variables required to determine the effect caused by augmented reality on people in simulated driving.
Keywords: augmented reality, driving, physiological signals, test platform
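The classification step described above can be sketched as follows; the feature matrix, labels and parameter choices are placeholders, not the project's actual data or code:

```python
# Minimal sketch: classify driver condition from extracted physiological-signal
# features with k-NN and a decision tree, using synthetic placeholder data.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
# hypothetical feature columns: mean heart rate, HRV, EMG RMS, EEG band powers, ...
features = rng.normal(size=(120, 6))
labels = rng.integers(0, 2, 120)   # 0 = baseline driving, 1 = driving with AR cues

knn = KNeighborsClassifier(n_neighbors=5)
tree = DecisionTreeClassifier(max_depth=4, random_state=0)

print("k-NN CV accuracy:", cross_val_score(knn, features, labels, cv=5).mean())
print("tree CV accuracy:", cross_val_score(tree, features, labels, cv=5).mean())
```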
Procedia PDF Downloads 141
546 Analyzing the Risk Based Approach in General Data Protection Regulation: Basic Challenges Connected with Adapting the Regulation
Authors: Natalia Kalinowska
Abstract:
The adoption of the General Data Protection Regulation (GDPR) concluded four years of work by the European Commission in this area in the European Union. Considering the far-reaching changes that will be introduced by the GDPR, the European legislator envisaged a two-year transitional period. Member states and companies have to prepare for the new regulation by 25 May 2018. The idea that represents a new attitude to data protection in the European Union is the risk-based approach. So far, as a result of the implementation of Directive 95/46/EC, many European countries (including Poland) have adopted very specific regulations prescribing technical and organisational security measures; the Polish implementing rules, for example, even indicate how long a password should be. According to the new approach, from May 2018 controllers and processors will be obliged to apply security measures adequate to the level of risk associated with the specific data processing. The risk in the GDPR should be interpreted as the likelihood of a breach of the rights and freedoms of the data subject. According to Recital 76, the likelihood and severity of the risk to the rights and freedoms of the data subject should be determined by reference to the nature, scope, context and purposes of the processing. The GDPR does not indicate which security measures should be applied; the recitals give only examples, such as anonymization or encryption. It is the controller's decision what type of security measures to consider sufficient, and the controller will be responsible if these measures are not sufficient or if the identification of the risk level is incorrect. The regulation indicates a few levels of risk. Recital 76 mentions risk and high risk, but some lawyers think that there is one more category: low risk/no risk. Low-risk/no-risk data processing is processing that is unlikely to result in a risk to the rights and freedoms of natural persons. The GDPR mentions types of data processing for which a controller does not have to evaluate the level of risk because they have already been classified as 'high risk' processing, e.g. large-scale processing of special categories of data or processing using new technologies. The methodology includes analysis of legal regulations, e.g. the GDPR and the Polish Act on the Protection of Personal Data, as well as ICO guidelines and articles concerning the risk-based approach in the GDPR. The main conclusion is that an appropriate risk assessment is a key to keeping data safe and avoiding financial penalties. On the one hand, this approach seems to be more equitable, not only for controllers or processors but also for data subjects; on the other hand, it increases controllers' uncertainty in the assessment, which could lead to inadequate data protection and potential liability for infringement of the regulation.
Keywords: general data protection regulation, personal data protection, privacy protection, risk based approach
Procedia PDF Downloads 252
545 Stems of Prunus avium: An Unexplored By-product with Great Bioactive Potential
Authors: Luís R. Silva, Fábio Jesus, Catarina Bento, Ana C. Gonçalves
Abstract:
Over the last few years, traditional medicine has gained ground at the nutritional and pharmacological level. Natural products and their derivatives are of great importance in several drugs used in modern therapeutics. Plant-based systems continue to play an essential role in primary healthcare. Additionally, the utilization of plant parts such as leaves, stems and flowers as nutraceutical and pharmaceutical products can add high value in the natural products market, not only because of their nutritional value due to significant levels of phytochemicals, but also because of the high benefit for producers and manufacturers. Stems of Prunus avium L. are a by-product of cherry processing and have been consumed over the years as infusions and decoctions due to their bioactive properties, being used as a sedative, diuretic and draining agent and for the relief of renal stones, edema and hypertension. In this work, we prepared hydroethanolic and infusion extracts from stems of P. avium collected in the Fundão region (Portugal) and evaluated the phenolic profile by LC/DAD, the antioxidant capacity, the α-glucosidase inhibitory activity and the protection of human erythrocytes against oxidative damage. The LC-DAD analysis allowed the identification of 19 phenolic compounds, catechin and 3-O-caffeoylquinic acid being the main ones. In general, the hydroethanolic extract proved to be more active than the infusion. This extract had the best antioxidant activity against DPPH• (IC50=22.37 ± 0.28 µg/mL) and the superoxide radical (IC50=13.93 ± 0.30 µg/mL). Furthermore, it was the most active concerning inhibition of hemoglobin oxidation (IC50=13.73 ± 0.67 µg/mL), hemolysis (IC50=1.49 ± 0.18 µg/mL) and lipid peroxidation (IC50=26.20 ± 0.38 µg/mL) on human erythrocytes. On the other hand, the infusion proved to be more efficient in α-glucosidase inhibition (IC50=3.18 ± 0.23 µg/mL) and against the nitric oxide radical (IC50=99.99 ± 1.89 µg/mL). The sweet cherry sector is very important in the Fundão region (Portugal), and the opportunity to turn the large amount of waste produced during cherry processing into added-value products, such as food supplements, cannot be ignored. Our results demonstrate that P. avium stems possess remarkable antioxidant and free radical scavenging properties. It is therefore suggested that P. avium stems can be used as a natural antioxidant with high potential to prevent or slow the progress of human diseases mediated by oxidative stress.
Keywords: stems, Prunus avium, phenolic compounds, biological potential
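For readers unfamiliar with how IC50 values such as those above are typically obtained, the following sketch fits an invented dose-response series to a four-parameter logistic curve; the data points and parameters are illustrative only, not the study's measurements.

```python
# Illustrative IC50 estimation from a dose-response curve (made-up data).
import numpy as np
from scipy.optimize import curve_fit

def four_pl(conc, bottom, top, ic50, hill):
    # response rises from `bottom` at low concentration to `top` at high concentration
    return bottom + (top - bottom) / (1.0 + (ic50 / conc) ** hill)

conc = np.array([1, 5, 10, 25, 50, 100], dtype=float)        # ug/mL
inhibition = np.array([8, 26, 45, 70, 85, 93], dtype=float)   # % radical scavenging

popt, _ = curve_fit(four_pl, conc, inhibition, p0=[0, 100, 20, 1])
print(f"estimated IC50 ~ {popt[2]:.1f} ug/mL")
```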
Procedia PDF Downloads 297
544 Screening for Non-hallucinogenic Neuroplastogens as Drug Candidates for the Treatment of Anxiety, Depression, and Posttraumatic Stress Disorder
Authors: Jillian M. Hagel, Joseph E. Tucker, Peter J. Facchini
Abstract:
With the aim of establishing a holistic approach for the treatment of central nervous system (CNS) disorders, we are pursuing a drug development program rapidly progressing through discovery and characterization phases. The drug candidates identified in this program are referred to as neuroplastogens owing to their ability to mediate neuroplasticity, which can be beneficial to patients suffering from anxiety, depression, or posttraumatic stress disorder. These and other related neuropsychiatric conditions are associated with the onset of neuronal atrophy, which is defined as a reduction in the number and/or productivity of neurons. The stimulation of neuroplasticity results in an increase in the connectivity between neurons and promotes the restoration of healthy brain function. We have synthesized a substantial catalogue of proprietary indolethylamine derivatives based on the general structures of serotonin (5-hydroxytryptamine) and psychedelic molecules such as N,N-dimethyltryptamine (DMT) and psilocin (4-hydroxy-DMT) that function as neuroplastogens. A primary objective in our screening protocol is the identification of derivatives associated with a significant reduction in hallucination, which will allow administration of the drug at a dose that induces neuroplasticity and triggers other efficacious outcomes in the treatment of targeted CNS disorders but which does not cause a psychedelic response in the patient. Both neuroplasticity and hallucination are associated with engagement of the 5-HT2A receptor, requiring drug candidates to be differentially coupled to these two outcomes at a molecular level. We use novel and proprietary artificial intelligence algorithms to predict the mode of binding to the 5-HT2A receptor, which has been shown to correlate with the hallucinogenic response. Hallucination is tested using the mouse head-twitch response model, whereas mouse marble-burying and sucrose preference assays are used to evaluate anxiolytic and anti-depressive potential. Neuroplasticity is assessed using dendritic outgrowth assays and cell-based ELISA analysis. Pharmacokinetics and additional receptor-binding analyses also contribute to the selection of lead candidates. A summary of the program is presented.
Keywords: neuroplastogen, non-hallucinogenic, drug development, anxiety, depression, PTSD, indolethylamine derivatives, psychedelic-inspired, 5-HT2A receptor, computational chemistry, head-twitch response behavioural model, neurite outgrowth assay
Procedia PDF Downloads 138
543 Relationship of Macro-Concepts in Educational Technologies
Authors: L. R. Valencia Pérez, A. Morita Alexander, Peña A. Juan Manuel, A. Lamadrid Álvarez
Abstract:
This research shows the reflection and identification of explanatory variables and their relationships between different variables that are involved with educational technology, all of them encompassed in macro-concepts which are: cognitive inequality, economy, food and language; These will give the guideline to have a more detailed knowledge of educational systems, the communication and equipment, the physical space and the teachers; All of them interacting with each other give rise to what is called educational technology management. These elements contribute to have a very specific knowledge of the equipment of communications, networks and computer equipment, systems and content repositories. This is intended to establish the importance of knowing a global environment in the transfer of knowledge in poor countries, so that it does not diminish the capacity to be authentic and preserve their cultures, their languages or dialects, their hierarchies and real needs; In short, to respect the customs of different towns, villages or cities that are intended to be reached through the use of internationally agreed professional educational technologies. The methodology used in this research is the analytical - descriptive, which allows to explain each of the variables, which in our opinion must be taken into account, in order to achieve an optimal incorporation of the educational technology in a model that gives results in a medium term. The idea is that in an encompassing way the concepts will be integrated to others with greater coverage until reaching macro concepts that are of national coverage in the countries and that are elements of conciliation in the different federal and international reforms. At the center of the model is the educational technology which is directly related to the concepts that are contained in factors such as the educational system, communication and equipment, spaces and teachers, which are globally immersed in macro concepts Cognitive inequality, economics, food and language. One of the major contributions of this article is to leave this idea under an algorithm that allows to be as unbiased as possible when evaluating this indicator, since other indicators that are to be taken from international preference entities like the OECD in the area of education systems studied, so that they are not influenced by particular political or interest pressures. This work opens the way for a relationship between involved entities, both conceptual, procedural and human activity, to clearly identify the convergence of their impact on the problem of education and how the relationship can contribute to an improvement, but also shows possibilities of being able to reach a comprehensive education reform for all.Keywords: relationships macro-concepts, cognitive inequality, economics, alimentation and language
Procedia PDF Downloads 199
542 Tagging a Corpus of Media Interviews with Diplomats: Challenges and Solutions
Authors: Roberta Facchinetti, Sara Corrizzato, Silvia Cavalieri
Abstract:
Increasing interconnection between data digitalization and linguistic investigation has given rise to unprecedented potentialities and challenges for corpus linguists, who need to master IT tools for data analysis and text processing, as well as to develop techniques for efficient and reliable annotation in specific mark-up languages that encode documents in a format that is both human and machine-readable. In the present paper, the challenges emerging from the compilation of a linguistic corpus will be taken into consideration, focusing on the English language in particular. To do so, the case study of the InterDiplo corpus will be illustrated. The corpus, currently under development at the University of Verona (Italy), represents a novelty in terms both of the data included and of the tag set used for its annotation. The corpus covers media interviews and debates with diplomats and international operators conversing in English with journalists who do not share the same lingua-cultural background as their interviewees. To date, this appears to be the first tagged corpus of international institutional spoken discourse and will be an important database not only for linguists interested in corpus analysis but also for experts operating in international relations. In the present paper, special attention will be dedicated to the structural mark-up, parts of speech annotation, and tagging of discursive traits, that are the innovational parts of the project being the result of a thorough study to find the best solution to suit the analytical needs of the data. Several aspects will be addressed, with special attention to the tagging of the speakers’ identity, the communicative events, and anthropophagic. Prominence will be given to the annotation of question/answer exchanges to investigate the interlocutors’ choices and how such choices impact communication. Indeed, the automated identification of questions, in relation to the expected answers, is functional to understand how interviewers elicit information as well as how interviewees provide their answers to fulfill their respective communicative aims. A detailed description of the aforementioned elements will be given using the InterDiplo-Covid19 pilot corpus. The data yielded by our preliminary analysis of the data will highlight the viable solutions found in the construction of the corpus in terms of XML conversion, metadata definition, tagging system, and discursive-pragmatic annotation to be included via Oxygen.Keywords: spoken corpus, diplomats’ interviews, tagging system, discursive-pragmatic annotation, english linguistics
Procedia PDF Downloads 185
541 Multiperson Drone Control with Seamless Pilot Switching Using Onboard Camera and Openpose Real-Time Keypoint Detection
Authors: Evan Lowhorn, Rocio Alba-Flores
Abstract:
Traditional classification Convolutional Neural Networks (CNN) attempt to classify an image in its entirety. This becomes problematic when trying to perform classification with a drone’s camera in real-time due to unpredictable backgrounds. Object detectors with bounding boxes can be used to isolate individuals and other items, but the original backgrounds remain within these boxes. These basic detectors have been regularly used to determine what type of object an item is, such as “person” or “dog.” Recent advancement in computer vision, particularly with human imaging, is keypoint detection. Human keypoint detection goes beyond bounding boxes to fully isolate humans and plot points, or Regions of Interest (ROI), on their bodies within an image. ROIs can include shoulders, elbows, knees, heads, etc. These points can then be related to each other and used in deep learning methods such as pose estimation. For drone control based on human motions, poses, or signals using the onboard camera, it is important to have a simple method for pilot identification among multiple individuals while also giving the pilot fine control options for the drone. To achieve this, the OpenPose keypoint detection network was used with body and hand keypoint detection enabled. OpenPose supports the ability to combine multiple keypoint detection methods in real-time with a single network. Body keypoint detection allows simple poses to act as the pilot identifier. The hand keypoint detection with ROIs for each finger can then offer a greater variety of signal options for the pilot once identified. For this work, the individual must raise their non-control arm to be identified as the operator and send commands with the hand on their other arm. The drone ignores all other individuals in the onboard camera feed until the current operator lowers their non-control arm. When another individual wish to operate the drone, they simply raise their arm once the current operator relinquishes control, and then they can begin controlling the drone with their other hand. This is all performed mid-flight with no landing or script editing required. When using a desktop with a discrete NVIDIA GPU, the drone’s 2.4 GHz Wi-Fi connection combined with OpenPose restrictions to only body and hand allows this control method to perform as intended while maintaining the responsiveness required for practical use.Keywords: computer vision, drone control, keypoint detection, openpose
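A simplified sketch of the pilot-selection rule described above is given below; the keypoint layout, helper names and coordinate convention are assumptions for illustration, not the authors' implementation.

```python
# Sketch: a person becomes the active operator while their non-control (left)
# arm is raised, judged from OpenPose-style body keypoints. Assumes keypoints
# as named (x, y) points with the image origin at the top-left (y grows downwards).
from typing import Dict, Optional, Tuple

Keypoints = Dict[str, Tuple[float, float]]

def arm_raised(person: Keypoints, side: str) -> bool:
    """Arm counts as raised when the wrist is above the shoulder."""
    wrist, shoulder = person.get(f"{side}_wrist"), person.get(f"{side}_shoulder")
    if wrist is None or shoulder is None:
        return False
    return wrist[1] < shoulder[1]          # smaller y means higher in the image

def select_operator(people: list, current: Optional[int]) -> Optional[int]:
    """Keep the current operator while their left arm is up; otherwise hand
    control to the first person in the frame raising theirs."""
    if current is not None and current < len(people) and arm_raised(people[current], "left"):
        return current
    for idx, person in enumerate(people):
        if arm_raised(person, "left"):
            return idx
    return None

# toy frame: person 0 idle, person 1 raising the left arm -> becomes operator
frame = [{"left_wrist": (100, 400), "left_shoulder": (110, 300)},
         {"left_wrist": (500, 200), "left_shoulder": (510, 320)}]
print(select_operator(frame, current=None))   # prints 1
```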
Procedia PDF Downloads 184
540 The Survey of Relationship between Health Literacy and Knowledge of Heart Failure with Rehospitalization in Patients with Heart Failure Admitted to Heart Failure Clinic
Authors: Jaleh Mohammad Aliha, Rezvan Razazi, Nasim Naderi
Abstract:
Introduction: Despite the progress in new effective drugs for the treatment of heart failure, the disease is still accompanied by frequent hospitalization, impaired quality of life, early mortality and a significant economic burden. Patients with chronic disease, and consequently patients with heart failure, need knowledge and optimal health literacy to improve their quality of life and minimize the rate of rehospitalization. Therefore, considering the importance of knowledge and health literacy in these patients, as well as the contradictory literature, this study was conducted to investigate the relationship of health literacy and knowledge of heart failure with rehospitalization in patients with heart failure admitted to the heart failure clinic of the Rajai Heart Center in 1394. Methods: A cross-sectional design with convenience sampling was used in this study. After obtaining the necessary permissions from the ethics committee and the Shahid Rajai Heart Center, 238 patients who were older than 18 years, had an ejection fraction of 35% or less, were able to read and write, had no psychiatric, neurological or cognitive disorders, and signed the informed consent were recruited. Data collection was performed using a demographic data questionnaire, the short standard health literacy questionnaire 'Short-TOFHLA-16' and the Vanderwall (2005) knowledge of heart failure questionnaire. Reliability was assessed by the internal consistency method, and Cronbach's alpha for both questionnaires was more than 0.7. Data were then analysed in SPSS-20 with descriptive and analytical statistics such as the t-test, chi-square and ANOVA. Results: The majority of patients were male (66%), married (80%) and aged between 50 and 70 years (42%). The majority of the studied men and women had good health literacy, and about half of them had adequate knowledge about heart failure. Fisher's exact test showed a statistically significant association between health literacy and knowledge about heart failure. In other words, higher health literacy was associated with more knowledge about their condition. The findings also showed no statistically significant association between health literacy and knowledge about heart failure and the frequency of CCU and emergency admissions. Conclusion: The study results showed that higher health literacy was associated with greater knowledge about heart failure and with patients' perception of care recommendations and disease outcomes. Therefore, knowledge about heart failure and the factors related to the severity of the disease is an important issue for problem identification, treatment and the reduction of rehospitalization.
Keywords: health literacy, heart failure, knowledge, rehospitalization
Procedia PDF Downloads 401
539 Virtual Metrology for Copper Clad Laminate Manufacturing
Authors: Misuk Kim, Seokho Kang, Jehyuk Lee, Hyunchang Cho, Sungzoon Cho
Abstract:
In semiconductor manufacturing, virtual metrology (VM) refers to methods to predict properties of a wafer based on machine parameters and sensor data of the production equipment, without performing the (costly) physical measurement of the wafer properties (Wikipedia). Additional benefits include avoidance of human bias and identification of important factors affecting the quality of the process which allow improving the process quality in the future. It is however rare to find VM applied to other areas of manufacturing. In this work, we propose to use VM to copper clad laminate (CCL) manufacturing. CCL is a core element of a printed circuit board (PCB) which is used in smartphones, tablets, digital cameras, and laptop computers. The manufacturing of CCL consists of three processes: Treating, lay-up, and pressing. Treating, the most important process among the three, puts resin on glass cloth, heat up in a drying oven, then produces prepreg for lay-up process. In this process, three important quality factors are inspected: Treated weight (T/W), Minimum Viscosity (M/V), and Gel Time (G/T). They are manually inspected, incurring heavy cost in terms of time and money, which makes it a good candidate for VM application. We developed prediction models of the three quality factors T/W, M/V, and G/T, respectively, with process variables, raw material, and environment variables. The actual process data was obtained from a CCL manufacturer. A variety of variable selection methods and learning algorithms were employed to find the best prediction model. We obtained prediction models of M/V and G/T with a high enough accuracy. They also provided us with information on “important” predictor variables, some of which the process engineers had been already aware and the rest of which they had not. They were quite excited to find new insights that the model revealed and set out to do further analysis on them to gain process control implications. T/W did not turn out to be possible to predict with a reasonable accuracy with given factors. The very fact indicates that the factors currently monitored may not affect T/W, thus an effort has to be made to find other factors which are not currently monitored in order to understand the process better and improve the quality of it. In conclusion, VM application to CCL’s treating process was quite successful. The newly built quality prediction model allowed one to reduce the cost associated with actual metrology as well as reveal some insights on the factors affecting the important quality factors and on the level of our less than perfect understanding of the treating process.Keywords: copper clad laminate, predictive modeling, quality control, virtual metrology
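A minimal sketch of the virtual-metrology idea follows, assuming invented process variables and a synthetic relation for M/V; it is not the actual CCL data or the models built in the study.

```python
# Sketch: predict a treating-process quality factor (here a stand-in for M/V)
# from process, raw-material and environment variables, with simple feature
# selection to surface the "important" predictors.
import numpy as np
import pandas as pd
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(7)
data = pd.DataFrame({
    "oven_temp": rng.normal(170, 5, 500),
    "line_speed": rng.normal(12, 1, 500),
    "resin_viscosity": rng.normal(300, 20, 500),
    "humidity": rng.normal(45, 8, 500),
})
# hypothetical ground-truth relation for the minimum viscosity (M/V)
data["mv"] = 50 - 0.1 * data["oven_temp"] + 0.05 * data["resin_viscosity"] + rng.normal(0, 0.5, 500)

X, y = data.drop(columns="mv"), data["mv"]
selector = SelectKBest(f_regression, k=2).fit(X, y)
selected = X.columns[selector.get_support()].tolist()
score = cross_val_score(Ridge(), X[selected], y, cv=5, scoring="r2").mean()
print("selected predictors:", selected, "| CV R^2:", round(score, 3))
```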
Procedia PDF Downloads 350
538 Short Association Bundle Atlas for Lateralization Studies from dMRI Data
Authors: C. Román, M. Guevara, P. Salas, D. Duclap, J. Houenou, C. Poupon, J. F. Mangin, P. Guevara
Abstract:
Diffusion Magnetic Resonance Imaging (dMRI) allows the non-invasive study of human brain white matter. From diffusion data, it is possible to reconstruct fiber trajectories using tractography algorithms. Our previous work consists of an automatic method for the identification of short association bundles of the superficial white matter (SWM), based on a whole-brain inter-subject hierarchical clustering applied to a HARDI database. The method finds representative clusters of similar fibers, belonging to a group of subjects, according to a distance measure between fibers, using a non-linear registration (DTI-TK). The algorithm performs an automatic labeling based on the anatomy, defined by a cortex mesh parcellated with FreeSurfer software. The clustering was applied to two independent groups of 37 subjects. The clusters resulting from both groups were compared using a restrictive threshold of mean distance between each pair of bundles from different groups, in order to keep reproducible connections. In the left hemisphere, 48 reproducible bundles were found, while 43 bundles were found in the right hemisphere. An inter-hemispheric bundle correspondence was then applied. The symmetric horizontal reflection of the right bundles was calculated in order to obtain their position in the left hemisphere. Next, the intersection between similar bundles was calculated. The pairs of bundles with a fiber intersection percentage higher than 50% were considered similar. The similar bundles between both hemispheres were fused and symmetrized. We obtained 30 common bundles between hemispheres. An atlas was created with the resulting bundles and used to segment 78 new subjects from another HARDI database, using a distance threshold of 6-8 mm according to the bundle length. Finally, a laterality index was calculated based on the bundle volume. Seven bundles of the atlas presented right laterality (IP_SP_1i, LO_LO_1i, Op_Tr_0i, PoC_PoC_0i, PoC_PreC_2i, PreC_SM_0i, and RoMF_RoMF_0i) and one presented left laterality (IP_SP_2i); there was no tendency of lateralization according to brain region. Many factors can affect the results, such as tractography artifacts, subject registration, and bundle segmentation. Further studies are necessary in order to establish the influence of these factors and evaluate SWM laterality.
Keywords: dMRI, hierarchical clustering, lateralization index, tractography
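As a small illustration of the volume-based laterality index mentioned above, the sketch below uses the common (L - R) / (L + R) convention with made-up bundle volumes; the exact formula used in the study is not specified in the abstract.

```python
# Illustrative volume-based laterality index: positive = left-lateralised,
# negative = right-lateralised. Volumes are hypothetical.
def laterality_index(left_volume_mm3: float, right_volume_mm3: float) -> float:
    return (left_volume_mm3 - right_volume_mm3) / (left_volume_mm3 + right_volume_mm3)

bundle_volumes = {                     # hypothetical mm^3 values (left, right)
    "IP_SP_1i": (820.0, 1040.0),
    "IP_SP_2i": (990.0, 760.0),
}
for name, (left, right) in bundle_volumes.items():
    li = laterality_index(left, right)
    side = "left" if li > 0 else "right"
    print(f"{name}: LI = {li:+.2f} ({side}-lateralised)")
```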
Procedia PDF Downloads 331
537 The Incidence of Maxillary Canine Ankylosis: A Single-Centre Analysis of 206 Canines Following Surgical Exposure and Orthodontic Alignment
Authors: Sidra Suleman, Maliha Suleman, Jinesh Shah
Abstract:
Maxillary canines play a crucial role in occlusion and aesthetics. Successful management of impacted canines requires early identification and intervention to prevent complications such as resorption of adjacent teeth and cystic changes. Although removal of the deciduous canine can encourage normal eruption of its successor, this is not always successful. Some patients may require surgical exposure and bonding of a gold chain to mobilise and align the canine, which can take up to 3 years. As this procedure has various risks, patients need to be appropriately consented to. Failure of such treatment commonly occurs due to inadequate anchorage or failure of the gold chain attachment, but in some cases, this is due to ankylosis. Aim: The aim of this study was to determine the incidence of ankylosis of unerupted maxillary ectopic canines following surgical exposure and orthodontic alignment at the Maxillofacial and Orthodontic Department, Royal Stoke University Hospital (RSUH), United Kingdom. Methodology: Patients treated from January 1, 2017, to December 31, 2019, were retrospectively studied. Electronic records with post-treatment follow-up at 3-6 months and 12-15 months were extracted and analysed. Patients were excluded based on three criteria, non-compliance with orthodontic treatment post-surgery, presence of canine transposition, and external orthodontic treatment. Sample: Overall, 159 suitable patients were selected from the 171 patients identified. Surgical exposure and gold chain bonding was carried out for a total of 206 maxillary canines, with the pattern of impaction being 159 (77.2 %) palatal, 46 (22.3%) buccal, and 1 (0.49%) in line of the arch. The sample consisted of 57 (35.8%) males and 102 (64.2%) females between the age range of 10 to 32 years, with the mean age being 15 years. The procedures were carried out under general anaesthesia for all but three patients, with two cases being repeats. Closed exposure was carried out for 189 (91.7%) canines. Results: The incidence of ankylosis from this study was 0.97%. In total, two patients had upper left canine ankylosis, which was identified at their 12-15 months orthodontic follow-up. Both patients were males, one having closed exposure at age 15 and the other having open exposure at age 19. Conclusions: Although this data shows that there is a low risk of ankylosis (0.97%), it highlights the difficulty in predicting which patients may be affected, and thus, a thorough pre-treatment assessment and careful observation during treatment is necessary. Future studies involving larger cohorts are warranted to further analyse factors affecting outcomes.Keywords: ankylosis, ectopic, maxillary canines, orthodontics
Procedia PDF Downloads 209
536 Patients in Opioid Maintenance Programs: Psychological Features that Predict Abstinence
Authors: Janaina Pereira, Barbara Gonzalez, Valentina Chitas, Teresa Molina
Abstract:
Intro: The positive impact of opioid maintenance programs on the health of heroin addicts, and on public health in general, has been widely recognized, namely on the prevalence reduction of infectious diseases as HIV, and on the social reintegration of this population. Nevertheless, a part of patients in these programs cannot remain heroin abstinent, or has relapses, during the treatment. Method: Thus, this cross-sectional research aims at analyzing the relation between a set of psychological and psychosocial variables, which have been associated with the onset of heroin use, and assess if they are also associated with absence of abstinence in participants in an opioid maintenance program. A total of 62 patients, aged between 26 and 58 years old (M= 40.87, DP= 7.39) with a time in opioid maintenance program between 1 and 10 years (M= 5.42, DP= 3.05), 77.4% male and 22.6% female, participated in this research. To assess the criterion variable (heroin use) we used the mean value of positive results in urine tests during the participation in the program, weighted according to the number of months in program. The predictor variables were the coping strategies, the dispositional sensation seeking, and the existence of Posttraumatic stress disorder (PTSD). Results: The results showed that only 33.87% of the patients were totally abstinent of heroin use since the beginning of the program, and the absence of abstinence, as the number of positive heroin tests, was primarily predicted by less proactive coping, and secondarily by a higher level of sensation seeking. 16.13% of the sample fulfilled diagnosis criteria for PTSD, and 67.74 % had at least one traumatic experience throughout their lives. The total of PTSD symptoms had a positive correlation with the number of physical health problems, and with the lack of professional occupation. These results have several implications for the clinical practice in this field, and we suggest the promotion of proactive coping strategies should integrate these opioid maintenance programs, as they represent the tendency to face future events as challenges and opportunities, being positively related to positive results on several fields. The early identification of PTSD in the participants, before entering the opioid maintenance programs, would be important as it is related to negative features that hinder social reintegration, Finally, to identify individuals with a sensation seeking profile would be relevant, not only because they face a higher risk of relapse, but also because the therapeutical approaches should not ignore this dispositional feature in the alternatives they propose to the patients.Keywords: opioid maintenance programs, proactive coping, PTSD, sensation seeking
Procedia PDF Downloads 128
535 Developing a Framework for Assessing and Fostering the Sustainability of Manufacturing Companies
Authors: Ilaria Barletta, Mahesh Mani, Björn Johansson
Abstract:
The concept of sustainability encompasses economic, environmental, social and institutional considerations. Sustainable manufacturing (SM) is, therefore, a multi-faceted concept. It broadly implies the development and implementation of technologies, projects and initiatives that are concerned with the life cycle of products and services, and are able to bring positive impacts to the environment, company stakeholders and profitability. Because of this, achieving SM-related goals requires a holistic, life-cycle-thinking approach from manufacturing companies. Further, such an approach must rely on a logic of continuous improvement and ease of implementation in order to be effective. Currently, there exists in the academic literature no comprehensively structured frameworks that support manufacturing companies in the identification of the issues and the capabilities that can either hinder or foster sustainability. This scarcity of support extends to difficulties in obtaining quantifiable measurements in order to objectively evaluate solutions and programs and identify improvement areas within SM for standards conformance. To bridge this gap, this paper proposes the concept of a framework for assessing and continuously improving the sustainability of manufacturing companies. The framework addresses strategies and projects for SM and operates in three sequential phases: analysis of the issues, design of solutions and continuous improvement. A set of interviews, observations and questionnaires are the research methods to be used for the implementation of the framework. Different decision-support methods - either already-existing or novel ones - can be 'plugged into' each of the phases. These methods can assess anything from business capabilities to process maturity. In particular, the authors are working on the development of a sustainable manufacturing maturity model (SMMM) as decision support within the phase of 'continuous improvement'. The SMMM, inspired by previous maturity models, is made up of four maturity levels stemming from 'non-existing' to 'thriving'. Aggregate findings from the use of the framework should ultimately reveal to managers and CEOs the roadmap for achieving SM goals and identify the maturity of their companies’ processes and capabilities. Two cases from two manufacturing companies in Australia are currently being employed to develop and test the framework. The use of this framework will bring two main benefits: enable visual, intuitive internal sustainability benchmarking and raise awareness of improvement areas that lead companies towards an increasingly developed SM.Keywords: life cycle management, continuous improvement, maturity model, sustainable manufacturing
Procedia PDF Downloads 266
534 Selection of Suitable Reference Genes for Assessing Endurance Related Traits in a Native Pony Breed of Zanskar at High Altitude
Authors: Prince Vivek, Vijay K. Bharti, Manishi Mukesh, Ankita Sharma, Om Prakash Chaurasia, Bhuvnesh Kumar
Abstract:
High endurance performance in equids requires adaptive changes involving physio-biochemical and molecular responses in an attempt to regain homeostasis. We hypothesized that suitable reference genes could be identified for assessing endurance-related traits in ponies at high altitude and could help identify individuals with a potent endurance trait. A total of 12 mares of the Zanskar pony breed were divided into three groups: group A (without load), group B (60 kg backpack load) and group C (80 kg backpack load). They were subjected to a load-carrying protocol on a steep 4 km uphill climb over a gravel, uneven rocky track from an altitude of 3292 m to 3500 m (endpoint). Blood was collected in sodium heparin anticoagulant before and immediately after the load carry, and peripheral blood mononuclear cells were separated for total RNA isolation and subsequent cDNA synthesis. Real-time PCR reactions were carried out to evaluate the mRNA expression profiles of a panel of putative internal control genes (ICGs) related to different functional classes, namely glyceraldehyde 3-phosphate dehydrogenase (GAPDH), β₂ microglobulin (β₂M), β-actin (ACTB), ribosomal protein S18 (RS18), hypoxanthine-guanine phosphoribosyltransferase (HPRT), ubiquitin B (UBB), ribosomal protein L32 (RPL32), transferrin receptor protein (TFRC) and succinate dehydrogenase complex subunit A (SDHA), for normalizing the real-time quantitative polymerase chain reaction (qPCR) data of native ponies. Three different algorithms, geNorm, NormFinder and BestKeeper, were used to evaluate the stability of the reference genes. The results showed that GAPDH was the most stable gene, and the best combination of two genes was TFRC and β₂M. In conclusion, the geometric mean of GAPDH, TFRC and β₂M might be used for accurate normalization of transcriptional data for assessing endurance-related traits in Zanskar ponies during load carrying.
Keywords: endurance exercise, ubiquitin B (UBB), β₂ microglobulin (β₂M), high altitude, Zanskar ponies, reference gene
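To illustrate the concluding recommendation, the sketch below normalises a target gene against the geometric mean of GAPDH, TFRC and β₂M using standard 2^-Ct-style relative quantities; the Ct values and amplification efficiency are invented for illustration.

```python
# Illustrative qPCR normalisation against the geometric mean of reference genes.
import numpy as np

def relative_expression(ct_target: float, ct_refs: dict, efficiency: float = 2.0) -> float:
    """Target expression relative to the geometric mean of reference-gene quantities."""
    ref_quantities = [efficiency ** -ct for ct in ct_refs.values()]
    norm_factor = np.exp(np.mean(np.log(ref_quantities)))   # geometric mean
    return (efficiency ** -ct_target) / norm_factor

sample_refs = {"GAPDH": 18.2, "TFRC": 22.5, "B2M": 20.1}     # hypothetical Ct values
target_ct = 24.7                                             # e.g. an endurance-related transcript
print(f"normalised relative expression: {relative_expression(target_ct, sample_refs):.3e}")
```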
Procedia PDF Downloads 131
533 Tumour-Associated Tissue Eosinophilia as a Prognosticator in Oral Squamous Cell Carcinoma
Authors: Karen Boaz, C. R. Charan
Abstract:
Background: The infiltration of tumour stroma by eosinophils, Tumor-Associated Tissue Eosinophilia (TATE), is known to modulate the progression of Oral Squamous Cell Carcinoma (OSCC). Eosinophils have direct tumoricidal activity by release of cytotoxic proteins and indirectly they enhance permeability into tumor cells enabling penetration of tumoricidal cytokines. Also, eosinophils may promote tumor angiogenesis by production of several angiogenic factors. Identification of eosinophils in the inflammatory stroma has been proven to be an important prognosticator in cancers of mouth, oesophagus, larynx, pharynx, breast, lung, and intestine. Therefore, the study aimed to correlate TATE with clinical and histopathological variables, and blood eosinophil count to assess the role of TATE as a prognosticator in Oral Squamous Cell Carcinoma (OSCC). Methods: Seventy two biopsy-proven cases of OSCC formed the study cohort. Blood eosinophil counts and TNM stage were obtained from the medical records. Tissue sections (5µm thick) were stained with Haematoxylin and Eosin. The eosinophils were quantified at invasive tumour front (ITF) in 10HPF (40x magnification) with an ocular grid. Bryne’s grading of ITF was also performed. A subset of thirty cases was also assessed for association of TATE with recurrence, involvement of lymph nodes and surgical margins. Results: 1) No statistically significant correlation was found between TATE and TNM stage, blood eosinophil counts and most parameters of Bryne’s grading system. 2) Statistically significant relation of intense degree of TATE was associated with the absence of distant metastasis, increased lympho-plasmacytic response and increased survival (diseasefree and overall) of OSCC patients. 3) In the subset of 30 cases, tissue eosinophil counts were higher in cases with lymph node involvement, decreased survival, without margin involvement and in cases that did not recur. Conclusion: While the role of eosinophils in mediating immune responses seems ambiguous as eosinophils support cell-mediated tumour immunity in early stages while inhibiting the same in advanced stages, TATE may be used as a surrogate marker for determination of prognosis in oral squamous cell carcinoma.Keywords: tumour-associated tissue eosinophilia, oral squamous cell carcinoma, prognosticator, tumoral immunity
Procedia PDF Downloads 250
532 Optimizing Residential Housing Renovation Strategies at Territorial Scale: A Data Driven Approach and Insights from the French Context
Authors: Rit M., Girard R., Villot J., Thorel M.
Abstract:
In a scenario of extensive residential housing renovation, stakeholders need models that support decision-making through a deep understanding of the existing building stock and accurate energy demand simulations. To address this need, we have modified an optimization model using open data that enables the study of renovation strategies at both territorial and national scales. This approach provides (1) a definition of a strategy to simplify decision trees from theoretical combinations, (2) input to decision makers on real-world renovation constraints, (3) more reliable identification of energy-saving measures (changes in technology or behaviour), and (4) discrepancies between currently planned and actually achieved strategies. The main contribution of the studies described in this document is the geographic scale: all residential buildings in the areas of interest were modeled and simulated using national data (geometries and attributes). These buildings were then renovated, when necessary, in accordance with the environmental objectives, taking into account the constraints applicable to each territory (number of renovations per year) or at the national level (renovation of thermal deficiencies (Energy Performance Certificates F&G)). This differs from traditional approaches that focus only on a few buildings or archetypes. This model can also be used to analyze the evolution of a building stock as a whole, as it can take into account both the construction of new buildings and their demolition or sale. Using specific case studies of French territories, this paper highlights a significant discrepancy between the strategies currently advocated by decision-makers and those proposed by our optimization model. This discrepancy is particularly evident in critical metrics such as the relationship between the number of renovations per year and achievable climate targets or the financial support currently available to households and the remaining costs. In addition, users are free to seek optimizations for their building stock across a range of different metrics (e.g., financial, energy, environmental, or life cycle analysis). These results are a clear call to re-evaluate existing renovation strategies and take a more nuanced and customized approach. As the climate crisis moves inexorably forward, harnessing the potential of advanced technologies and data-driven methodologies is imperative.Keywords: residential housing renovation, MILP, energy demand simulations, data-driven methodology
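A toy version of the optimisation idea described above is sketched below, written with the PuLP MILP library (an assumed dependency) and invented dwelling data; it illustrates the renovation-selection problem, not the authors' formulation.

```python
# Toy MILP: choose which dwellings to renovate so a territory-level
# energy-saving target is met at minimum cost, under a yearly renovation cap.
import pulp

dwellings = {          # name: (renovation cost in kEUR, energy saved in MWh/yr)
    "A": (25, 6.0), "B": (40, 11.0), "C": (15, 3.5), "D": (60, 18.0), "E": (30, 8.0),
}
saving_target = 20.0   # MWh/yr to save in the territory
max_renovations = 3    # constraint on renovations per year

prob = pulp.LpProblem("renovation_plan", pulp.LpMinimize)
x = {d: pulp.LpVariable(f"renovate_{d}", cat="Binary") for d in dwellings}

prob += pulp.lpSum(dwellings[d][0] * x[d] for d in dwellings)                  # minimise cost
prob += pulp.lpSum(dwellings[d][1] * x[d] for d in dwellings) >= saving_target
prob += pulp.lpSum(x[d] for d in dwellings) <= max_renovations

prob.solve(pulp.PULP_CBC_CMD(msg=False))
chosen = [d for d in dwellings if x[d].value() == 1]
print("renovate:", chosen, "| cost:", pulp.value(prob.objective), "kEUR")
```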
Procedia PDF Downloads 68
531 On the Semantics and Pragmatics of 'Be Able To': Modality and Actualisation
Authors: Benoît Leclercq, Ilse Depraetere
Abstract:
The goal of this presentation is to shed new light on the semantics and pragmatics of be able to. It presents the results of a corpus analysis based on data from the BNC (British National Corpus), and discusses these results in light of a specific stance on the semantics-pragmatics interface that takes into account recent developments. Be able to is often discussed in relation to can and could, all of which can be used to express ability. Such an onomasiological approach often results in the identification of usage constraints for each expression. In the case of be able to, it is the formal properties of the modal expression (unlike can and could, be able to has non-finite forms) that are in the foreground, and the modal expression is described as the verb that conveys future ability. Be able to is also argued to express actualised ability in the past (I was able to/could open the door). This presentation aims to provide a more accurate pragmatic-semantic profile of be able to, based on extensive data analysis and embedded in a very explicit view of the semantics-pragmatics interface. A random sample of 3000 examples (1000 for each modal verb) extracted from the BNC was analysed to account for the following issues. First, the challenge is to identify the exact semantic range of be able to. The results show that, contrary to general assumption, be able to does not only express ability but shares most of the root meanings usually associated with the possibility modals can and could. The data reveal that what is called opportunity is, in fact, the most frequent meaning of be able to. Second, attention will be given to the notion of actualisation. It is commonly argued that be able to is the preferred form when the residue actualises: (1) The only reason he was able to do that was because of the restriction (BNC, spoken) (2) It is only through my imaginative shuffling of the aces that we are able to stay ahead of the pack. (BNC, written) Although this notion has been studied in detail within formal semantic approaches, empirical data is crucially lacking and it is unclear whether actualisation constitutes a conventional (and distinguishing) property of be able to. The empirical analysis provides solid evidence that actualisation is indeed a conventional feature of the modal. Furthermore, the dataset reveals that be able to expresses actualised 'opportunities' and not actualised 'abilities'. In the final part of this paper, attention will be given to the theoretical implications of the empirical findings, and in particular to the following paradox: how can the same expression encode both modal meaning (non-factual) and actualisation (factual)? It will be argued that this largely depends on one's conception of the semantics-pragmatics interface, and that this need not be an issue when actualisation (unlike modality) is analysed as a generalised conversational implicature and thus is considered part of the conventional pragmatic layer of be able to.
Keywords: actualisation, modality, pragmatics, semantics
Procedia PDF Downloads 131530 Cognitive Translation and Conceptual Wine Tasting Metaphors: A Corpus-Based Research
Authors: Christine Demaecker
Abstract:
Many researchers have underlined the importance of metaphors in specialised language. Their use of specific source domains helps us understand the conceptualisations used to communicate new ideas or difficult topics. Within the wide area of specialised discourse, wine tasting is a very specific example because it is almost exclusively metaphoric. Wine tasting metaphors express various conceptualisations. They are not linguistic but rather conceptual, as defined by Lakoff & Johnson. They correspond to the linguistic expression of a mental projection from a well-known or more concrete source domain onto the target domain, which is the taste of wine. But unlike most specialised terminologies, the vocabulary is never clearly defined. When metaphorical terms are listed in dictionaries, their definitions remain vague, unclear, and circular. They cannot be replaced by literal linguistic expressions. This makes it impossible to transfer them into another language with traditional linguistic translation methods. This qualitative research investigates whether wine tasting metaphors could instead be translated through the cognitive translation process described by Nili Mandelblit (1995). The research is based on a corpus compiled from two high-profile wine guides: Parker's Wine Buyer's Guide and its translation into French, and the Guide Hachette des Vins and its translation into English. In this small corpus, with a total of 68,826 words, 170 metaphoric expressions have been identified in the original English text and 180 in the original French text. They have been selected with the MIPVU Metaphor Identification Procedure developed at the Vrije Universiteit Amsterdam. The selection demonstrates that both languages use the same set of conceptualisations, which are often combined in wine tasting notes, creating conceptual integrations or blends. The comparison of expressions in the source and target texts also demonstrates the use of the cognitive translation approach. In accordance with the principle of relevance, the translation always uses target language conceptualisations, but compared to the original, the highlighting of the projection is often different. Also, when original metaphors are complex, with a combination of conceptualisations, at least one element of the original metaphor underlies the target expression. This approach integrates perfectly into Lederer's interpretative model of translation (2006). In this triangular model, the transfer of conceptualisation could be included at the level of 'deverbalisation/reverbalisation', the crucial stage of the model, where the extraction of meaning combines with the encyclopedic background to generate the target text.Keywords: cognitive translation, conceptual integration, conceptual metaphor, interpretative model of translation, wine tasting metaphor
Procedia PDF Downloads 131529 Virtual Screening and in Silico Toxicity Property Prediction of Compounds against Mycobacterium tuberculosis Lipoate Protein Ligase B (LipB)
Authors: Junie B. Billones, Maria Constancia O. Carrillo, Voltaire G. Organo, Stephani Joy Y. Macalino, Inno A. Emnacen, Jamie Bernadette A. Sy
Abstract:
The drug discovery and development process is generally known to be lengthy and labor-intensive. Therefore, in order to deliver prompt and effective responses to cure certain diseases, there is an urgent need to reduce the time and resources needed to design, develop, and optimize potential drugs. Computer-aided drug design (CADD) is able to alleviate this issue by applying computational power to streamline the whole drug discovery process, from target identification to lead optimization. This drug design approach can be predominantly applied to diseases that cause major public health concerns, such as tuberculosis. Hitherto, there has been no concrete cure for this disease, especially with the continuing emergence of drug-resistant strains. In this study, CADD is employed for tuberculosis by first identifying a key enzyme in the mycobacterium’s metabolic pathway that would make a good drug target. One such potential target is the lipoate protein ligase B enzyme (LipB), a key enzyme in the M. tuberculosis metabolic pathway involved in the biosynthesis of the lipoic acid cofactor. Its expression is considerably up-regulated in patients with multi-drug resistant tuberculosis (MDR-TB), and it has no known back-up mechanism that can take over its function when inhibited, making it an extremely attractive target. Using cutting-edge computational methods, compounds from the AnalytiCon Discovery Natural Derivatives database were screened and docked against the LipB enzyme in order to rank them based on their binding affinities. Compounds with better binding affinities than LipB’s known inhibitor, decanoic acid, were subjected to in silico toxicity evaluation using the ADMET and TOPKAT protocols. Out of the 31,692 compounds in the database, 112 showed better binding energies than decanoic acid. Furthermore, 12 of the 112 compounds showed highly promising ADMET and TOPKAT properties. Future studies involving in vitro or in vivo bioassays may be done to further confirm the therapeutic efficacy of these 12 compounds, which may eventually lead to a novel class of anti-tuberculosis drugs.Keywords: pharmacophore, molecular docking, lipoate protein ligase B (LipB), ADMET, TOPKAT
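To make the two-stage triage concrete (dock and rank all compounds, keep those that bind better than the decanoic acid reference, then filter on in silico toxicity), here is a minimal sketch of that filtering logic. All compound names, docking scores, and pass/fail flags are invented placeholders; in the real workflow these values would come from the docking and ADMET/TOPKAT software outputs, not from hand-typed entries.

```python
# Sketch of the post-docking triage described above: keep compounds whose
# docking score is better (more negative) than the reference inhibitor
# (decanoic acid), then retain only those passing ADMET/TOPKAT-style filters.
# Scores, thresholds, and identifiers are illustrative, not study data.
reference_score = -5.2  # hypothetical docking score for decanoic acid (kcal/mol)

hits = [
    # (compound_id, docking_score, passes_admet, passes_topkat)
    ("NAT-00017", -7.9, True,  True),
    ("NAT-00342", -6.4, True,  False),
    ("NAT-01288", -5.0, True,  True),
    ("NAT-02051", -8.3, False, True),
    ("NAT-02977", -6.9, True,  True),
]

# Stage 1: better binding affinity than the reference ligand
better_binders = [h for h in hits if h[1] < reference_score]

# Stage 2: acceptable in-silico toxicity / pharmacokinetic profile
candidates = [h for h in better_binders if h[2] and h[3]]

for cid, score, *_ in sorted(candidates, key=lambda h: h[1]):
    print(f"{cid}: docking score {score} kcal/mol")
```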
Procedia PDF Downloads 423528 Six Years Antimicrobial Resistance Trends among Bacterial Isolates in Amhara National Regional State, Ethiopia
Authors: Asrat Agalu Abejew
Abstract:
Background: Antimicrobial resistance (AMR) is a silent tsunami and one of the top global threats to health care and public health. It is a common agenda both globally and in Ethiopia. Emerging AMR will be a double burden for Ethiopia, which already faces a series of problems from infectious disease morbidity and mortality. In Ethiopia, although there are attempts to document AMR in healthcare institutions, a comprehensive and all-inclusive analysis is still lacking. Thus, this study aimed to determine trends in AMR from 2016 to 2021. Methods: A retrospective analysis of secondary data recorded at the Amhara Public Health Institute (APHI) from 2016 to 2021 G.C was conducted. Blood, urine, stool, swab, discharge, body effusion, and other microbiological specimens were collected from each study participant, and bacterial identification and resistance testing were done using standard microbiological procedures. Data were extracted from Excel in August 2022, trends in AMR were analyzed, and the results were described. In addition, the chi-square (X2) test and binary logistic regression were used, and a P value < 0.05 was used to determine a significant association. Results: During the 6-year period, there were 25143 culture and susceptibility tests. Overall, 265 (46.2%) bacteria were resistant to 2-4 antibiotics, 253 (44.2%) to 5-7 antibiotics, and 56 (9.7%) to >=8 antibiotics. Gram-negative bacteria were 166 (43.9%), 155 (41.5%), and 55 (14.6%) resistant to 2-4, 5-7, and ≥8 antibiotics, respectively, whereas 99 (50.8%), 96 (49.2%), and 1 (0.5%) of gram-positive bacteria were resistant to 2-4, 5-7, and ≥8 antibiotics, respectively. K. pneumoniae 3783 (15.67%) and E. coli 3199 (13.25%) were the most commonly isolated bacteria, and the overall prevalence of AMR was 2605 (59.9%), with K. pneumoniae 743 (80.24%), E. cloacae 196 (74.81%), and A. baumannii 213 (66.56%) being the bacteria most commonly resistant to the antibiotics tested. Except for a slight decline during 2020 (6469 (25.4%)), the overall trend of AMR is rising from year to year, with peaks in 2019 (8480 (33.7%)) and 2021 (7508 (29.9%)). The differences between years explain 78% of the variation in AMR over the study period (R2=0.7799). The common bacteria were resistant to ampicillin, Augmentin, ciprofloxacin, cotrimoxazole, tetracycline, and tobramycin in almost all cases tested. Conclusion: AMR has increased linearly over the last 6 years. If left as it is without appropriate intervention, after 15 years (2030 E.C) AMR will increase by 338.7%. The growing number of multi-drug resistant bacteria is an alarm to awaken policymakers and those concerned to intervene before it is too late. This calls for a periodic, integrated, and continuous system to determine the prevalence of AMR to commonly used antibiotics.Keywords: AMR, trend, pattern, MDR
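As a sketch of the trend calculation behind the reported R² of 0.7799, the snippet below fits a linear regression of yearly resistant-isolate counts on year and extrapolates to 2030. Only the 2019-2021 counts are quoted in the abstract; the earlier years are assumed placeholders, so the computed R² and projection will not reproduce the reported figures.

```python
# A sketch of the trend analysis behind the reported R^2: fit a linear trend of
# yearly resistant-isolate counts against year and extrapolate. Counts for
# 2016-2018 are assumed placeholders (only 2019-2021 figures appear in the
# abstract), so this will not reproduce R^2 = 0.7799 or the 338.7% projection.
import numpy as np

years = np.array([2016, 2017, 2018, 2019, 2020, 2021])
resistant_counts = np.array([800, 1000, 1200, 8480, 6469, 7508])  # first three assumed

slope, intercept = np.polyfit(years, resistant_counts, 1)
fitted = slope * years + intercept

ss_res = np.sum((resistant_counts - fitted) ** 2)
ss_tot = np.sum((resistant_counts - resistant_counts.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(f"slope = {slope:.1f} resistant isolates per year, R^2 = {r_squared:.4f}")

# Naive linear extrapolation to 2030 -- the kind of 'if left un-intervened'
# projection the abstract warns about.
print(f"Projected resistant isolates in 2030: {slope * 2030 + intercept:.0f}")
```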
Procedia PDF Downloads 76527 The Role of the Renal Specialist Podiatrist
Authors: Clara Luwe, Oliver Harness, Helena Meally, Kim Martin, Alexandra Harrington
Abstract:
Background: The role of ‘Renal Specialist Podiatrist’ originated in 2022 in response to prevailing evidence that patients with diabetes and end-stage renal disease (ESRD) on haemodialysis (HD) with active ulcerations were at higher risk of rapid deterioration, foot-related hospital admissions, and lower limb amputations. The role started in April 2022 with the aim of screening all patients on haemodialysis and instigating preventative measures to reduce serious foot-related complications. Methods: A comprehensive neurovascular foot assessment was completed for all patients on HD to establish baseline vascular status and identify those with peripheral arterial disease (PAD). Each individual's foot risk was stratified, and tailored advice and education were issued; all patients with diabetes on HD were identified as high-risk for diabetic foot complications. Major Findings: Screening revealed that over half of the caseload had diabetes, and more than half had a clinical presentation of PAD. All those presenting with ulcerations had a diagnosis of diabetes. The majority of presenting ulcers predated the renal specialist post and were classified as severe (SINBAD score >3). Since April 2022, complications have been identified more quickly, reducing their severity (SINBAD score <3) and improving healing times, in line with the national average. During the eight months the role has been in place, there has been a reduction in minor amputations and no major amputations. Conclusion: By screening all patients on haemodialysis and focusing on education, early recognition of complications, appropriate treatment, and timely onward referral, the risk of diabetic foot ulcerations and lower limb amputations can be reduced. Regular podiatry input to stratify and manage high-risk patients with active wounds across different services has helped to keep these patients stable, prevent amputations, and reduce foot-related hospital admissions and mortality from foot-related disease. By improving access to a specialist podiatrist, patients felt able to raise concerns sooner. This has helped to implement treatment at the earliest possible opportunity, enabling the identification and healing of ulcers at an earlier and less complex stage (SINBAD score <3), thus preventing potential limb-threatening complications.Keywords: renal, podiatry, haemodialysis, prevention, early detection
Procedia PDF Downloads 85526 In vitro Callus Production from Lantana Camara: A Step towards Biotransformation Studies
Authors: Maged El-Sayed Mohamed
Abstract:
Plant tissue culture practices are presented nowadays as the most promising substitute for the whole plant in terms of secondary metabolite production. They offer the advantages of high production and tunability, and they have less effect on plant ecosystems. Lantana camara is a weed, which is common all over the world as an ornamental plant. Weeds can adapt to any type of soil and climate due to their rich cellular machinery for secondary metabolite production. This trait is also found in Lantana camara, a plant with a very rich diversity of secondary metabolites and no dominant class of compounds. Aim: This trait has encouraged the author to develop tissue culture experiments for Lantana camara as a platform for the production and manipulation of secondary metabolites through biotransformation. Methodology: The plant was collected in its flowering stage in September 2014, and explants were prepared from the shoot tip, axillary bud, and leaf. Different types of culture media were tried, as well as four phytohormones and their combinations: NAA, 2,4-D, BAP, and kinetin. Explants were grown in the dark or in 12-hour dark and light cycles at 25°C. A metabolic profile of the produced callus was made and then compared to the whole-plant profile. The metabolic profile was made using GC-MS for volatile constituents (extracted by n-hexane) and HPLC-MS and capillary electrophoresis-mass spectrometry (CE-MS) for non-volatile constituents (extracted by ethanol and water). Results: The best conditions for callus induction were achieved using MS medium supplemented with 30 g/L sucrose and NAA/BAP (1:0.2 mg/L). Initiation of callus was favoured by incubation in the dark for 20 days. The callus produced under these conditions showed a yellow colour, which changed to brownish after 30 days. The rate of callus growth was high, expressed in the callus diameter, which reached 1.15±0.2 cm in 30 days; however, the induction of callus was delayed by 15 days. The metabolic profiles of both the volatile and non-volatile constituents of the callus showed a simpler metabolite background than the whole plant, with two new (unresolved) peaks in the chromatogram of the callus’ non-volatile constituents. Conclusion: Lantana camara callus production can itself be a source of new secondary metabolites and could be used for biotransformation studies due to its simple metabolic background, which allows easy identification of newly formed metabolites. The callus production combined this simple metabolic background with the rich cellular secondary metabolite machinery of the plant, which could be elicited to produce valuable medicinally active products.Keywords: capillary electrophoresis-mass spectrometry, gas chromatography, metabolic profile, plant tissue culture
Procedia PDF Downloads 385525 Feasibility Study and Experiment of On-Site Nuclear Material Identification in Fukushima Daiichi Fuel Debris by Compact Neutron Source
Authors: Yudhitya Kusumawati, Yuki Mitsuya, Tomooki Shiba, Mitsuru Uesaka
Abstract:
After the Fukushima Daiichi nuclear power reactor incident, a large amount of unaccounted-for nuclear fuel debris remains in the reactor core area, which is subject to safeguards and criticality safety. Before precise analysis is performed, preliminary on-site screening and mapping of nuclear debris activity need to be carried out to provide reliable data for nuclear debris mass-extraction planning. Through a collaboration project with the Japan Atomic Energy Agency, an on-site nuclear debris screening system using dual-energy X-ray inspection and neutron energy resonance analysis has been established. By using a compact and mobile pulsed neutron source constructed from a 3.95 MeV X-band electron linac, coupled with tungsten as an electron-to-photon converter and beryllium as a photon-to-neutron converter, short-distance neutron time-of-flight measurements can be performed. Experimental results show that this system can measure the neutron energy spectrum up to the 100 eV range with a time-of-flight path of only 2.5 meters, owing to the X-band accelerator's short pulse. With this, on-site neutron time-of-flight measurement can be used to identify the isotopic contents of nuclear debris through Neutron Resonance Transmission Analysis (NRTA). Some preliminary NRTA experiments have been performed with a tungsten sample as a dummy nuclear debris material; its isotope Tungsten-186 has a resonance absorption energy close to that of Uranium-238 (15 eV). The results obtained show that this system can detect resonance neutron absorption within the 1-100 eV range. It can also detect multiple elements in a material at once; an experiment using a combined sample of indium, tantalum, and silver showed that it is feasible to identify debris containing mixed materials. This compact neutron time-of-flight measurement system is a valuable complement to the dual-energy X-ray computed tomography (CT) method, which can identify atomic number quantitatively but with 1-mm spatial resolution and large error bars. The combination of these two measurement methods will enable on-site nuclear debris screening in the Fukushima Daiichi reactor core area, providing the data needed for nuclear debris activity mapping.Keywords: neutron source, neutron resonance, nuclear debris, time of flight
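For readers unfamiliar with time-of-flight spectrometry, the relation that makes the 2.5 m path work is the standard non-relativistic kinematics E = (1/2) m_n (L/t)²: each arrival time maps to one neutron energy. The sketch below evaluates this relation for the 1-100 eV window and the 15 eV resonance energy mentioned in the abstract; the physical constants are standard values, and the code is only an illustration of the relation, not the experiment's analysis software.

```python
# Sketch of the time-of-flight relation underlying the NRTA measurement:
# for a non-relativistic neutron, E = (1/2) * m_n * (L / t)^2, so a fixed
# 2.5 m flight path maps arrival time to neutron energy. Constants are
# standard values; 15 eV is the resonance energy quoted in the abstract.
M_NEUTRON = 1.674927e-27      # neutron mass, kg
EV_TO_J = 1.602176634e-19     # joules per electronvolt
FLIGHT_PATH = 2.5             # flight path length, m

def energy_from_tof(t_seconds: float) -> float:
    """Neutron kinetic energy (eV) for a given time of flight (s)."""
    v = FLIGHT_PATH / t_seconds
    return 0.5 * M_NEUTRON * v**2 / EV_TO_J

def tof_from_energy(e_ev: float) -> float:
    """Time of flight (s) at which a neutron of energy e_ev (eV) arrives."""
    v = (2 * e_ev * EV_TO_J / M_NEUTRON) ** 0.5
    return FLIGHT_PATH / v

for energy in (1.0, 15.0, 100.0):  # eV, spanning the 1-100 eV resonance region
    print(f"{energy:6.1f} eV -> time of flight {tof_from_energy(energy) * 1e6:8.2f} µs")
```

With these numbers, a 15 eV neutron arrives roughly 47 µs after the pulse, which is why a sub-microsecond accelerator pulse is enough to resolve the resonance region over such a short path.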
Procedia PDF Downloads 238524 Sedimentary, Diagenesis and Evaluation of High Quality Reservoir of Coarse Clastic Rocks in Nearshore Deep Waters in the Dongying Sag; Bohai Bay Basin
Authors: Kouassi Louis Kra
Abstract:
The nearshore deep-water gravity flow deposits in the northern steep slope of the Dongying depression, Bohai Bay basin, have been acknowledged as important reservoirs in the rift lacustrine basin. These deep strata, termed coarse clastic sediments and deposited at the root of the slope, record complex depositional processes and a wide range of diagenetic events, which make high-quality reservoir prediction difficult. Based on an integrated study of seismic interpretation, sedimentary analysis, petrography, core samples, wireline logging data, and 3D seismic and lithological data, the reservoir formation mechanism was deciphered. The Geoframe software was used to analyze 3-D seismic data to interpret the stratigraphy and build a sequence stratigraphic framework. Thin-section identification and point counting were performed to assess the reservoir characteristics. The software PetroMod 1D of Schlumberger was utilized for the simulation of burial history. CL and SEM analyses were performed to reveal the diagenetic sequence. Backscattered electron (BSE) images were recorded to define the textural relationships between diagenetic phases. The results showed that the nearshore steep slope deposits mainly consist of conglomerate, gravel sandstone, pebbly sandstone, and fine sandstone interbedded with mudstone. The reservoir is characterized by low porosity and ultra-low permeability. The diagenetic reactions include compaction; precipitation of calcite, dolomite, kaolinite, and quartz cement; and dissolution of feldspars and rock fragments. The main types of reservoir space are primary intergranular pores, residual intergranular pores, intergranular dissolved pores, and fractures. There are three obvious anomalous high-porosity zones in the reservoir. Overpressure and early hydrocarbon filling are the main reasons for abnormal secondary pore development. Sedimentary facies control the formation of high-quality reservoirs, and oil and gas filling preserves secondary pores from late carbonate cementation.Keywords: Bohai Bay, Dongying Sag, deep strata, formation mechanism, high-quality reservoir
Procedia PDF Downloads 135523 Haematological Correlates of Ischemic Stroke and Transient Ischemic Attack: Lessons Learned
Authors: Himali Gunasekara, Baddika Jayaratne
Abstract:
Haematological abnormalities are known to cause ischemic stroke or transient ischemic attack (TIA). The identification of haematological correlates plays an important role in management and secondary prevention. The objective of this study was to describe the haematological correlates of stroke and their association with the stroke profile. The haematological correlates screened were lupus anticoagulant, dysfibrinogenaemia, paroxysmal nocturnal haemoglobinuria (PNH), sickle cell disease, systemic lupus erythematosus (SLE), and myeloproliferative neoplasms (MPN). A cross-sectional descriptive study was conducted in a sample of 152 stroke patients referred to the haematology department of the National Hospital of Sri Lanka for thrombophilia screening. Different tests were performed to assess each haematological correlate. The dilute Russell's viper venom test and kaolin clotting time were done to assess lupus anticoagulant. Full blood count (FBC), blood picture, sickling test, and high-performance liquid chromatography were the tests used for detection of sickle cell disease. Paroxysmal nocturnal haemoglobinuria was assessed by FBC, blood picture, Ham test, and flow cytometry. FBC, blood picture, Janus Kinase 2 (V617F) mutation analysis, erythropoietin level, and bone marrow examination were done to look for myeloproliferative neoplasms. Dysfibrinogenaemia was assessed by TT, fibrinogen antigen test, clot observation, and the Clauss test. The antinuclear antibody test was done to look for systemic lupus erythematosus. Among the study sample, 134 patients had strokes and only 18 had TIA. Recurrence of stroke/TIA was observed in 13.2% of patients. The majority of patients (94.7%) had radiological evidence of a thrombotic event. One fourth of patients had past thrombotic events, while 12.5% had a family history of thrombosis. Of the haematological correlates screened, lupus anticoagulant was the commonest (n=16), and dysfibrinogenaemia (n=11) had the next highest prevalence. One patient was diagnosed with essential thrombocythaemia and one with SLE. None of the patients were positive on the screening tests for sickle cell disease and PNH. Haematological correlates were identified in 19% of the study sample. Among the stroke profile variables, only the presence of a past thrombotic history was statistically significantly associated with haematological disorders (P=0.04). Therefore, haematological disorders appear to be an important factor in the etiological work-up of stroke patients, particularly in patients with past thrombotic events.Keywords: stroke, transient ischemic attack, hematological correlates, hematological disorders
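For readers unfamiliar with how an association like the reported P = 0.04 is typically obtained, here is a small sketch of a 2x2 contingency-table test (past thrombotic history vs. presence of a haematological disorder) using SciPy. The cell counts are assumed for illustration only and are not the study's data, and the abstract does not state which exact test produced its P value.

```python
# Sketch of the kind of association test behind a reported P value of about
# 0.04: a 2x2 contingency table of past thrombotic history vs. presence of a
# haematological disorder. Cell counts are assumed placeholders chosen only
# to be consistent in size with a 152-patient sample, not the study data.
from scipy.stats import chi2_contingency

#        disorder present, disorder absent
table = [
    [12, 26],   # past thrombotic history (assumed counts)
    [17, 97],   # no past thrombotic history (assumed counts)
]
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")
```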
Procedia PDF Downloads 236522 Coping Strategies Used by Persons with Spinal Cord Injury: A Rehabilitation Hospital Based Qualitative Study
Authors: P. W. G. D. P. Samarasekara, S. M. K. S. Seneviratne, D. Munidasa, S. S. Williams
Abstract:
Sustaining a spinal cord injury (SCI) causes severe disruption of all aspects of a person’s life, resulting in a difficult process of coping with the distressing effects of paralysis, which affect the ability to lead a meaningful life. These persons are hospitalized in the acute stage of injury and subsequently for rehabilitation and the treatment of complications. The purpose of this study was to explore the coping strategies used by persons with SCI during their rehabilitation period. A qualitative study was conducted among persons with SCI undergoing rehabilitation at the Rheumatology and Rehabilitation Hospitals, Ragama and Digana, Sri Lanka. Twelve participants were selected purposively to represent both males and females, with cervical, thoracic, or lumbar levels of injury due to traumatic and non-traumatic causes, as well as from different socioeconomic backgrounds. Informed consent was obtained from the participants. In-depth interviews were conducted using an interview guide to collect data. Probes were used to get more information and to encourage participants. Interviews were audio-taped and transcribed verbatim. Qualitative content analysis was conducted. Ethical approval for this study was obtained from the Ethics Review Committee, Faculty of Medicine, University of Kelaniya. Five themes were identified in the content analysis: social support, religious beliefs, determination, acceptance, and making comparisons. Participants indicated that support from their family members had been an essential factor in coping after sustaining an SCI, and they stressed the importance of emotional support from family members during rehabilitation. Many participants held a strong belief in God, and the conviction that God had a personal interest in their lives played an important role in their ability to cope with the injury. They believed that what happens to them in this life results from their actions in previous lives. They expressed that determination was an essential factor in helping them cope with their injury. They focused on the positive aspects of life and accepted their disability. They compared themselves with others who were worse off to help lift themselves out of their unpleasant experience. Even some of the most severely injured and disabled participants presented evidence of using this coping strategy. Identification of the coping strategies used by persons with SCI will help nurses and other healthcare professionals reinforce the most effective coping strategies among persons with SCI. The findings suggest that engagement coping positively influences psychosocial adaptation.Keywords: content analysis, coping strategies, rehabilitation, spinal cord injury
Procedia PDF Downloads 184