Search results for: water-efficient techniques
5711 Evaluation of Newly Synthesized Steroid Derivatives Using In silico Molecular Descriptors and Chemometric Techniques
Authors: Milica Ž. Karadžić, Lidija R. Jevrić, Sanja Podunavac-Kuzmanović, Strahinja Z. Kovačević, Anamarija I. Mandić, Katarina Penov-Gaši, Andrea R. Nikolić, Aleksandar M. Oklješa
Abstract:
This study addressed the selection of in silico molecular descriptors and models for the description of newly synthesized steroid derivatives and their characterization using chemometric techniques. Multiple linear regression (MLR) models were established and identified the molecular descriptors best suited for quantitative structure-retention relationship (QSRR) modeling of the retention of the investigated molecules. According to the variance inflation factor (VIF) values, the MLR models were free of multicollinearity among the selected molecular descriptors. The selected molecular descriptors were ranked using the generalized pair correlation method (GPCM), which can reveal significant differences between independent variables even when their correlations with the dependent variable are almost equal. The generated MLR models were statistically validated and cross-validated, and the best models were retained. The models were then ranked using the sum of ranking differences (SRD) method, which identifies the most consistent QSRR model and reveals similarities or dissimilarities between the models. In this study, SRD was performed using the average values of the experimentally observed data as the gold standard. The chemometric analysis was conducted in order to characterize the newly synthesized steroid derivatives for further investigation of their potential biological activity and for further synthesis. This article is based upon work from COST Action (CM1105), supported by COST (European Cooperation in Science and Technology).
Keywords: generalized pair correlation method, molecular descriptors, regression analysis, steroids, sum of ranking differences
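A minimal sketch of the workflow described above (MLR fitting, a VIF multicollinearity check, and an SRD-style ranking score) is shown below. The descriptor names, the data, and the use of ordinary least squares from statsmodels are illustrative assumptions, not the authors' actual implementation.

```python
# Sketch only: MLR-based QSRR fitting, VIF check, and a simple SRD score.
# Descriptor names and data are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Hypothetical data: rows = steroid derivatives, columns = in silico descriptors.
X = pd.DataFrame(np.random.rand(20, 3),
                 columns=["logP", "polar_surface_area", "molar_refractivity"])
y = pd.Series(np.random.rand(20), name="retention")   # experimental retention data

# Fit the MLR model with an intercept.
X_c = sm.add_constant(X)
model = sm.OLS(y, X_c).fit()
print(model.summary())

# VIF check: descriptors with VIF well above ~10 are usually considered collinear.
for i, col in enumerate(X.columns, start=1):           # index 0 is the constant
    print(col, variance_inflation_factor(X_c.values, i))

# SRD-style score: sum of absolute rank differences between the model's
# predictions and a reference ranking (the averaged experimental values would
# serve as the 'gold standard'; here y itself is used as a placeholder).
pred_ranks = pd.Series(model.fittedvalues).rank()
ref_ranks = y.rank()
srd = np.abs(pred_ranks.values - ref_ranks.values).sum()
print("SRD score:", srd)
```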
Procedia PDF Downloads 348
5710 Comparative Evaluation of Vanishing Interfacial Tension Approach for Minimum Miscibility Pressure Determination
Authors: Waqar Ahmad Butt, Gholamreza Vakili Nezhaad, Ali Soud Al Bemani, Yahya Al Wahaibi
Abstract:
Minimum miscibility pressure (MMP) plays a major role in determining the displacement efficiency of different gas injection processes. Experimental techniques for MMP determination include the industrially recommended slim tube, vanishing interfacial tension (VIT), and rising bubble apparatus (RBA) methods. This paper presents an MMP measurement study using the slim tube and VIT experimental techniques for two different crude oil samples (M and N), both in live and stock tank oil forms. The VIT-measured MMP values for the 'M' and 'N' live crude oils were close to the slim tube-determined MMP values, with 6.4% and 5% deviation, respectively. For both oil samples in stock tank oil form, however, the VIT-measured MMP showed an unacceptably high deviation from the slim tube-determined MMP. This larger difference appears to be related to the high stabilized heavier fraction of the crude oil and the lack of multiple-contact miscibility. None of the nine deployed crude oil and CO2 MMP correlations yielded a reliable MMP close to the slim tube-determined value. Since the VIT-determined MMP values for both live crude oils closely match the slim tube-determined values, VIT is confirmed as a reliable, reproducible, rapid, and inexpensive alternative for live crude oil MMP determination. VIT MMP determination for the stock tank oil case, however, needs further investigation of the stabilization/destabilization mechanism of the oil's heavier ends and of multiple-contact miscibility development.
Keywords: minimum miscibility pressure, interfacial tension, multiple contacts miscibility, heavier ends
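The VIT approach infers the MMP by measuring interfacial tension (IFT) at several pressures and extrapolating to the pressure at which the IFT vanishes. A minimal sketch of that extrapolation is given below; the pressure and IFT values are invented for illustration and do not come from the study.

```python
# Illustrative VIT extrapolation: fit IFT vs. pressure and find the zero-IFT pressure.
import numpy as np

pressures_mpa = np.array([10.0, 15.0, 20.0, 25.0])   # hypothetical test pressures
ift_mn_per_m = np.array([6.1, 4.2, 2.4, 0.9])        # hypothetical measured IFT values

slope, intercept = np.polyfit(pressures_mpa, ift_mn_per_m, 1)
mmp_estimate = -intercept / slope    # pressure at which the fitted IFT reaches zero
print(f"Estimated MMP (VIT extrapolation): {mmp_estimate:.1f} MPa")
```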
Procedia PDF Downloads 269
5709 Examination of Public Hospital Unions Technical Efficiencies Using Data Envelopment Analysis and Machine Learning Techniques
Authors: Songul Cinaroglu
Abstract:
Regional planning in health has gained momentum in developing countries in recent years. In Turkey, 89 different Public Hospital Unions (PHUs) were established at the provincial level. In this study, the technical efficiencies of the 89 PHUs were examined using Data Envelopment Analysis (DEA) and machine learning techniques, after dividing them into two clusters based on the similarity of their input and output indicators. The numbers of beds, physicians, and nurses were chosen as input variables, and the numbers of outpatients, inpatients, and surgical operations as output indicators. Before performing DEA, the PHUs were grouped into two clusters. The first cluster represents PHUs that have higher population, demand, and service density than the others. The difference between the clusters was statistically significant for all study variables (p < 0.001). After clustering, DEA was performed overall and for the two clusters separately. Overall, 11% of PHUs were found to be efficient; in addition, 21% and 17% of them were efficient within the first and second clusters, respectively. PHUs that represent the urban parts of the country and have higher population and service density are therefore more efficient than the others. The random forest decision tree graph shows that the number of inpatients, a measure of service density, is a determinative factor of PHU efficiency. It is advisable for public health policy makers to use statistical learning methods in resource planning decisions to improve efficiency in health care.
Keywords: public hospital unions, efficiency, data envelopment analysis, random forest
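A compact sketch of the workflow described above, clustering the units and then solving an input-oriented CCR DEA linear program for each decision-making unit (DMU), is given below. The input/output data are hypothetical, and the multiplier-form LP shown is a standard CCR formulation rather than the study's exact setup.

```python
# Sketch: k-means clustering of hospital unions + input-oriented CCR DEA per DMU.
import numpy as np
from sklearn.cluster import KMeans
from scipy.optimize import linprog

# Hypothetical inputs: beds, physicians, nurses; outputs: outpatients, inpatients, operations.
X = np.array([[300, 80, 200], [150, 40, 90], [500, 120, 350], [100, 25, 60.0]])
Y = np.array([[90000, 12000, 3000], [40000, 5000, 900],
              [150000, 20000, 6000], [20000, 2500, 400.0]])

clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(np.hstack([X, Y]))

def ccr_efficiency(o, X, Y):
    """Multiplier-form CCR DEA: max u'y_o  s.t.  v'x_o = 1,  u'y_j - v'x_j <= 0,  u, v >= 0."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.concatenate([-Y[o], np.zeros(m)])                  # linprog minimizes -u'y_o
    A_ub = np.hstack([Y, -X])                                  # u'y_j - v'x_j <= 0 for all j
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)  # v'x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
    return -res.fun

for o in range(X.shape[0]):
    print(f"DMU {o}: cluster {clusters[o]}, CCR efficiency = {ccr_efficiency(o, X, Y):.3f}")
```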
Procedia PDF Downloads 127
5708 QF-PCR as a Rapid Technique for Routine Prenatal Diagnosis of Fetal Aneuploidies
Authors: S. H. Atef
Abstract:
Background: The most common chromosomal abnormalities identified at birth are aneuploidies of chromosomes 21, 18, 13, X, and Y. Prenatal diagnosis of fetal aneuploidies is routinely done by traditional cytogenetic culture; a major drawback of this technique is the long period of time required to reach a diagnosis. In this study, we evaluated QF-PCR as a rapid technique for prenatal diagnosis of the common aneuploidies. Method: This work was carried out on sixty amniotic fluid samples taken from patients with one or more of the following indications: advanced maternal age (3 cases), abnormal biochemical markers (6 cases), abnormal ultrasound (12 cases), or a previous history of an abnormal child (39 cases). Each sample was tested by QF-PCR and traditional cytogenetics. Aneuploidy screening was performed by amplifying four STRs on chromosomes 21, 18, and 13, two pseudoautosomal markers, one X-linked marker, as well as AMXY and SRY; the markers were distributed in two multiplex QF-PCR assays (S1 and S2) in order to reduce the risk of sample mishandling. Results: All the QF-PCR results were successful, while there were two culture failures, only one of which was repeated. No discrepancy was seen between the results of the two techniques. Fifty-six samples showed normal patterns, three samples showed trisomy 21, successfully detected by both techniques, and one sample showed a normal pattern by QF-PCR that could not be compared with cytogenetics due to culture failure; the pregnancy outcome of this case was a normal baby. Conclusion: Our study concluded that QF-PCR is a reliable technique for prenatal diagnosis of the common chromosomal aneuploidies. Compared with cytogenetic culture, it has the advantages of being faster (results within 24-48 hours), simpler, less dependent on highly qualified staff, less prone to failure, and more cost-effective.
Keywords: QF-PCR, traditional cytogenetics, fetal aneuploidies, trisomy 21, prenatal diagnosis
Procedia PDF Downloads 419
5707 Applications of Out-of-Sequence Thrust Movement for Earthquake Mitigation: A Review
Authors: Rajkumar Ghosh
Abstract:
The study presents an overview of the many uses of, and approaches to, estimating out-of-sequence thrust movement in earthquake mitigation. It investigates how knowing and forecasting thrust movement during seismic events might contribute to effective earthquake mitigation measures. The review begins by discussing out-of-sequence thrust movement and its importance in earthquake mitigation strategies. It explores how typical techniques for estimating thrust movement may not capture the full complexity of seismic events and emphasizes the benefits of including out-of-sequence data in the analysis. A thorough review of existing research and studies on out-of-sequence thrust movement estimation for earthquake mitigation is provided. The study demonstrates how to estimate out-of-sequence thrust movement using multiple data sources such as GPS measurements, satellite imagery, and seismic recordings. The study also examines the use of out-of-sequence thrust movement estimates in earthquake mitigation measures, investigating how precise calculation of thrust movement may help improve structural design, analyse infrastructure risk, and develop early warning systems, and it highlights the potential advantages of using out-of-sequence data in these applications to improve the effectiveness of earthquake mitigation techniques. The difficulties and limits of estimating out-of-sequence thrust movement for earthquake mitigation are also addressed, including data quality problems, modelling uncertainties, and computational complications. To address these obstacles and increase the accuracy and reliability of out-of-sequence thrust movement estimates, the authors recommend topics for additional study and improvement. The study is a helpful resource for seismic monitoring and earthquake risk assessment researchers, engineers, and policymakers, supporting innovations in earthquake mitigation measures based on a better understanding of thrust movement dynamics.
Keywords: earthquake mitigation, out-of-sequence thrust, satellite imagery, seismic recordings, GPS measurements
Procedia PDF Downloads 87
5706 Check Red Blood Cells Concentrations of a Blood Sample by Using Photoconductive Antenna
Authors: Ahmed Banda, Alaa Maghrabi, Aiman Fakieh
Abstract:
The terahertz (THz) range lies between 0.1 and 10 THz. THz radiation can be generated and detected through different techniques, one of the most familiar being the photoconductive antenna (PCA). Generating THz radiation with a PCA involves applying a femtosecond laser pump and a DC voltage difference. A photocurrent is generated at the PCA, whose value is affected by different parameters (e.g., dielectric properties, the DC voltage difference, and the incident power of the laser pump). THz radiation is used for biomedical applications; however, different biomedical fields need new technologies to meet patients' needs (e.g., blood-related conditions). In this work, a novel method to check the red blood cell (RBC) concentration of a blood sample using a PCA is presented. RBCs constitute 44% of total blood volume. RBCs contain hemoglobin, which transfers oxygen from the lungs to the body organs and then returns to the lungs carrying carbon dioxide, which the body gets rid of in the process of exhalation. The configuration has been simulated and optimized using COMSOL Multiphysics. Variation of the RBC concentration affects the dielectric properties of the sample (e.g., the relative permittivity of the RBCs in the blood sample). The effects of four blood samples with different RBC concentrations on the photocurrent value were tested. The photocurrent peak value and the RBC concentration are inversely proportional to each other due to the change in the dielectric properties of the RBCs. It was observed that the photocurrent peak value dropped from 162.99 nA to 108.66 nA when the RBC concentration rose from 0% to 100% of the blood sample. Optimization of this method could help launch new products for diagnosing blood-related conditions (e.g., anemia and leukemia). The resultant electric field from the DC components cannot be used to count the RBCs of the blood sample.
Keywords: biomedical applications, photoconductive antenna, photocurrent, red blood cells, THz radiation
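As an illustration of how the reported inverse relationship could be turned into a readout, the sketch below interpolates an RBC concentration from a measured photocurrent peak using the two end points quoted in the abstract. The assumption of a linear response between those end points is made here purely for illustration; a real device would need a full calibration curve.

```python
# Illustrative only: map a measured photocurrent peak to an approximate RBC %.
import numpy as np

# End points taken from the abstract: 162.99 nA at 0% RBC, 108.66 nA at 100% RBC.
photocurrent_na = np.array([108.66, 162.99])   # ascending order for np.interp
rbc_percent = np.array([100.0, 0.0])

def estimate_rbc(measured_na):
    """Linear interpolation between the two reported calibration points."""
    return float(np.interp(measured_na, photocurrent_na, rbc_percent))

print(estimate_rbc(135.0))   # hypothetical measured photocurrent -> approx. RBC %
```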
Procedia PDF Downloads 205
5705 Teacher Training Course: Conflict Resolution through Mediation
Authors: Csilla Marianna Szabó
Abstract:
In Hungary, society has changed a lot over the past 25 years, and these changes can be detected in educational situations as well. The number and intensity of conflicts have increased in most fields of life, as well as at schools. Teachers have difficulty handling school conflicts. What is more, the new net generation, Generation Z, has values and behavioural patterns different from those of the previous generation, which might generate more serious conflicts at school, especially with teachers who were mainly socialized in traditional teacher-student relationships. In Hungary, Act CCIV of 2011 declared the foundation of Institutes of Teacher Training in higher education institutions. One of the tasks of the Institutes is to survey the competences and needs of teachers working in public education and to provide further training and services for them according to their needs and requirements. This work is supported by the Social Renewal Operative Programme 4.1.2.B. The Institute of Teacher Training at the College of Dunaújváros, Hungary, carried out a questionnaire and surveyed the needs and requirements of teachers working in the Central Transdanubian region. Based on the results, the professors of the Institute of Teacher Training decided to meet the requirements of teachers and launch short courses in spring 2015. One of the courses is going to focus on school conflict management through mediation. The aim of the pilot course is to provide conflict management techniques for teachers by presenting different mediation techniques to them. The theoretical part of the course (5 hours) will enable participants to understand the main points and advantages of mediation, while the practical part (10 hours) will involve teachers in role plays to learn how to cope with conflict situations by applying mediation. We hope that if conflicts can be reduced, the school atmosphere will be influenced in a positive way and the teaching-learning process will become more successful and effective.
Keywords: conflict resolution, generation Z, mediation, teacher training
Procedia PDF Downloads 412
5704 Influence of the Cooking Technique on the Iodine Content of Frozen Hake
Authors: F. Deng, R. Sanchez, A. Beltran, S. Maestre
Abstract:
The high nutritional value associated with seafood is related to the presence of essential trace elements. Moreover, seafood is considered an important source of energy, proteins, and long-chain polyunsaturated fatty acids. Generally, seafood is consumed cooked; consequently, its nutritional value may be degraded. Seafood, such as fish, shellfish, and seaweed, can be considered one of the main iodine sources. Deficient or excessive consumption of iodine can cause dysfunction and pathologies related to the thyroid gland. The main objective of this work was to evaluate iodine stability in hake (Merluccius) subjected to different culinary techniques. The culinary processes considered were boiling, steaming, microwave cooking, baking, cooking en papillote (a twisted cover with the shape of a sweet wrapper), and coating with a flour batter and deep-frying. The determination of iodine was carried out by Inductively Coupled Plasma Mass Spectrometry (ICP-MS). Regarding sample handling strategies, liquid-liquid extraction has been demonstrated to be a powerful pre-concentration and clean-up approach for trace metal analysis by ICP techniques. Extraction with tetramethylammonium hydroxide (TMAH reagent) was used as the sample preparation method in this work. Based on the results, it can be concluded that the stability of iodine was degraded by the cooking processes. The greatest degradation was observed for the boiling and microwave cooking processes, with the iodine content in hake decreasing by up to 60% and 52%, respectively. However, if the boiling cooking liquid is preserved, the loss generated during cooking is reduced. The iodine content was preserved only when the fish was cooked following the en papillote process.
Keywords: cooking process, ICP-MS, iodine, hake
Procedia PDF Downloads 142
5703 Exploring the ‘Many Worlds’ Interpretation in Both a Philosophical and Creative Literary Framework
Authors: Jane Larkin
Abstract:
Combining elements of philosophy, science, and creative writing, this investigation explores how a philosophically structured science-fiction novel can challenge the theory of linearity and singularity of time through the 'many worlds' theory. This concept is addressed through the creation of a research exegesis and an accompanying creative artefact, designed to be read in conjunction with each other in an explorative, interwoven manner. Research undertaken into scientific concepts, such as the 'many worlds' interpretation of quantum mechanics, and into diverse philosophers and their ideologies on time, is embodied in an original science-fiction narrative titled It Goes On. The five frames that make up the creative artefact are enhanced not only by five leading philosophers and their philosophies on time but also by an appreciation of the research, which comes first in the paper. Traditional approaches to storytelling are creatively and innovatively inverted in several ways, thus challenging the singularity and linearity of time. Further nonconventional approaches to literary technique include an abstract narrator, embodied by time, a concept and a figure in the text, whose voice and vantage point in relation to death further the unreliability of the notion of time. These approaches challenge individuals' understanding of complex scientific and philosophical views in a variety of ways. The science-fiction genre is essential when considering the speculative nature of It Goes On, which deals with parallel realities and is a fantastical exploration of human ingenuity in plausible futures. Therefore, this paper documents the research-led methodology used to create It Goes On, the application of the 'many worlds' theory within a framed narrative, and the many innovative techniques used to contribute new knowledge in a variety of fields.
Keywords: time, many-worlds theory, Heideggerian philosophy, framed narrative
Procedia PDF Downloads 86
5702 Expert System for Road Bridge Constructions
Authors: Michael Dimmer, Holger Flederer
Abstract:
The basis for realizing a construction project is a technically flawless concept that satisfies environmental and cost conditions as well as structural requirements. The presented software system actively supports civil engineers during the development of optimal designs by giving advice regarding durability, life-cycle costs, sustainability, and much more. A major part of the boundary conditions of a design process is gathered and assimilated subconsciously by experienced engineers; it is a question of eligible building techniques and their practicability, considering the resulting costs. Planning engineers acquire much of this experience during their professional life and use it in their daily work. Occasionally, the planning engineer should step back from this experience in order to remain open to new and better solutions that also meet the functional demands. The developed expert system gives planning engineers recommendations for preferred design options for new constructions as well as for existing bridge constructions. It is possible to analyze construction elements and techniques with regard to sustainability and life-cycle costs; in this way, the software provides recommendations for future constructions. Furthermore, there is an option to assess existing road bridges especially for heavy-duty transport. This includes a route planning tool that provides quick and reliable information as to whether the bridge support structures along a transport route are dimensioned sufficiently for a certain heavy-duty transport. The use of this expert system in bridge planning companies and building authorities will save substantial costs for new and existing bridge constructions. This is achieved by consistently considering parameters such as life-cycle costs and sustainability in its planning recommendations.
Keywords: expert system, planning process, road bridges, software system
Procedia PDF Downloads 277
5701 The Role of Vibro-Stone Column for Enhancing the Soft Soil Properties
Authors: Mohsen Ramezan Shirazi, Orod Zarrin, Komeil Valipourian
Abstract:
This study investigated the behavior of soft soils improved through the vibro replacement technique, considering their settlements and consolidation rates, the applicability of this technique to various types of soil, and settlement and bearing capacity calculations.
Keywords: bearing capacity, expansive clay, stone columns, vibro techniques
Procedia PDF Downloads 586
5700 Automatic Detection and Filtering of Negative Emotion-Bearing Contents from Social Media in Amharic Using Sentiment Analysis and Deep Learning Methods
Authors: Derejaw Lake Melie, Alemu Kumlachew Tegegne
Abstract:
The increasing prevalence of social media in Ethiopia has exacerbated societal challenges by fostering the proliferation of negative emotional posts and comments. Illicit use of social media has further deepened divisions among the population. Addressing these issues through manual identification and aggregation of emotions from millions of users for swift decision-making poses significant challenges, particularly given the rapid growth of Amharic language usage on social platforms. Consequently, there is a critical need to develop an intelligent system capable of automatically detecting and categorizing negative emotional content into social, religious, and political categories while also filtering out toxic online content. This paper aims to leverage sentiment analysis techniques to achieve automatic detection and filtering of negative emotional content from Amharic social media texts, employing a comparative study of deep learning algorithms. The study utilized a dataset comprising 29,962 comments collected from social media platforms using comment exporter software. Data pre-processing techniques were applied to enhance data quality, followed by the implementation of deep learning methods for training, testing, and evaluation. The results showed that the CNN, GRU, LSTM, and Bi-LSTM classification models achieved accuracies of 83%, 50%, 84%, and 86%, respectively. Among these models, Bi-LSTM demonstrated the highest accuracy of 86% in the experiment.
Keywords: negative emotion, emotion detection, social media filtering, sentiment analysis, deep learning
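A sketch of a Bi-LSTM text classifier of the kind that achieved the best accuracy above is shown below. This is not the authors' implementation; the vocabulary size, sequence length, and the four class labels are assumptions made for illustration.

```python
# Sketch: Bi-LSTM classifier for tokenized Amharic comments (assumed setup).
import tensorflow as tf
from tensorflow.keras import layers, models

VOCAB_SIZE, MAX_LEN, NUM_CLASSES = 20000, 100, 4   # e.g. social/religious/political/other

model = models.Sequential([
    layers.Input(shape=(MAX_LEN,)),                  # padded sequences of token ids
    layers.Embedding(input_dim=VOCAB_SIZE, output_dim=128),
    layers.Bidirectional(layers.LSTM(64)),
    layers.Dropout(0.3),
    layers.Dense(64, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(X_train, y_train, validation_split=0.1, epochs=5)   # X_train: padded token ids
```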
Procedia PDF Downloads 33
5699 Synthesis of MIPs towards Precursors and Intermediates of Illicit Drugs and Their following Application in Sensing Unit
Authors: K. Graniczkowska, N. Beloglazova, S. De Saeger
Abstract:
The threat of synthetic drugs is one of the most significant current drug problems worldwide. The use of drugs of abuse has increased dramatically during the past three decades. Among others, amphetamine-type stimulants (ATS) are globally the second most widely used drugs after cannabis, exceeding the use of cocaine and heroin. ATS are potent central nervous system (CNS) stimulants, capable of inducing a euphoric state similar to that of cocaine. Recreational use of ATS is widespread, even though warnings of irreversible damage to the CNS have been reported. ATS pose a major problem, and their production contributes to the pollution of the environment by discharging large volumes of liquid waste into the sewage system. Therefore, there is a demand to develop robust and sensitive sensors that can detect ATS and their intermediates in environmental water samples; a rapid and simple test is required. Antibody-based tests cannot be applied to the analysis of environmental water samples, which can sometimes be a harsh matrix. Therefore, molecular imprinted polymers (MIPs), which are known as synthetic antibodies, were chosen for this approach. MIPs are characterized by high mechanical and thermal stability and show chemical resistance over a broad pH range and in various organic or aqueous solvents. These properties make them the preferred type of receptor for application in the harsh conditions imposed by environmental samples. To the best of our knowledge, there are no existing MIP-based sensors toward amphetamine and its intermediates, and not many commercial MIPs for this application are available. Therefore, the aim of this study was to compare different techniques for obtaining MIPs with high specificity towards ATS and to characterize them for subsequent use in a sensing unit. MIPs against amphetamine and its intermediates were synthesized using a few different techniques, such as electro-, thermo-, and UV-initiated polymerization. Different monomers, cross-linkers, and initiators, in various ratios, were tested to obtain the best sensitivity and polymer properties. Subsequently, their specificity and selectivity were compared with those of commercially available MIPs against amphetamine. Different linkers, such as lipoic acid, 3-mercaptopropionic acid, and tyramine, were examined, in combination with several immobilization techniques, to select the best procedure for attaching the particles to the sensor surface. The experiments performed allowed an optimal method to be chosen for the intended sensor application. The stability of the MIPs under extreme conditions, such as highly acidic or basic media, was determined. The results obtained support the applicability of MIP-based sensors for sewage system testing.
Keywords: amphetamine type stimulants, environment, molecular imprinted polymers, MIPs, sensor
Procedia PDF Downloads 251
5698 A Case Study Approach on Co-Constructing the Idea of 'Safety' with Children
Authors: Beng Zhen Yeow
Abstract:
In most work that involves children, the voice of the children is often not heard. This is ironic, since many of these discussions involve their welfare and safety. It might seem natural for professionals to hear from children about what they wish for instead of deciding what is best for them. Unfortunately, this is more the exception than the norm in most cases, and hence, in many instances, children are merely 'subjects' in conversations about safety instead of active participants in the construction or creation of safety in the family. There might be many reasons why this does not happen in our work. Firstly, professionals have learnt to 'socialise' into their professional roles and hence, in the process, become 'un-childlike'. Secondly, there is a lack of professional training with regard to how to talk with children. Finally, there might also be a lack of concrete tools and techniques developed to facilitate the process. In this paper, the case study method is used to show how the idea of safety can be concretised and discussed with children and their family members, thereby making them active participants and co-creators of their own safety. Specific skills and techniques are highlighted through the case study. In this case, there was improvement in outcomes, such as no repeated offence or abuse. In addition, the children were able to advocate for their own safety after six months of intervention, and the family members were able to say explicitly what they could do to improve safety. The professionals in the safety network reported significant improvements. On top of that, the abused child, who had been removed due to child protection concerns, verbalized observations of change in her mother's parenting abilities and requested that home leave begin, owing to her ownership of the safety planning and her confidence in co-creating safety for her siblings and herself together with the professionals in the safety network. Children becoming active participants in the co-creation of safety not only allows them to own a 'voice' but, at the same time, gives them greater confidence to protect themselves at home and in other contexts outside the home.
Keywords: partnering for safety, collaborative social work, family and systemic psychotherapy, child protection
Procedia PDF Downloads 120
5697 Hybrid Approach for Face Recognition Combining Gabor Wavelet and Linear Discriminant Analysis
Authors: A: Annis Fathima, V. Vaidehi, S. Ajitha
Abstract:
Face recognition systems find many applications in surveillance and human-computer interaction systems. As applications using face recognition are important and demand more accuracy, more robustness is expected of the face recognition system, together with less computation time. In this paper, a hybrid approach for face recognition combining Gabor Wavelets and Linear Discriminant Analysis (HGWLDA) is proposed. The normalized input grayscale image is approximated and reduced in dimension to lower the processing overhead of the Gabor filters. This image is convolved with a bank of Gabor filters with varying scales and orientations. LDA, a subspace analysis technique, is used to reduce the intra-class scatter and maximize the inter-class scatter. The techniques used are 2-dimensional Linear Discriminant Analysis (2D-LDA), 2-dimensional bidirectional LDA ((2D)2LDA), and Weighted 2-dimensional bidirectional Linear Discriminant Analysis (Wt (2D)2 LDA). LDA reduces the feature dimension by extracting the features with the greatest variance. A k-Nearest Neighbour (k-NN) classifier is used to classify and recognize the test image by comparing its features with each of the training set features. The HGWLDA approach is robust to illumination conditions, as the Gabor features are illumination invariant. This approach also aims at a better recognition rate using a smaller number of features for varying expressions. The performance of the proposed HGWLDA approaches is evaluated using the AT&T database, the MIT-India face database, and the faces94 database. It is found that the proposed HGWLDA approach provides better results than the existing Gabor approach.
Keywords: face recognition, Gabor wavelet, LDA, k-NN classifier
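A simplified sketch of the pipeline (Gabor filter bank, LDA-based dimensionality reduction, k-NN classification) is shown below. The original work uses 2D-LDA variants; scikit-learn's one-dimensional LinearDiscriminantAnalysis is substituted here purely for illustration, and the filter parameters and image sizes are assumptions.

```python
# Sketch: Gabor filter bank features + LDA projection + 1-NN face matching.
import cv2
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier

def gabor_features(img, scales=(7, 11), orientations=4):
    """Convolve a grayscale face image with a small Gabor bank and return a feature vector."""
    feats = []
    for ksize in scales:
        for i in range(orientations):
            theta = i * np.pi / orientations
            kern = cv2.getGaborKernel((ksize, ksize), sigma=3.0, theta=theta,
                                      lambd=8.0, gamma=0.5, psi=0)
            resp = cv2.filter2D(img, cv2.CV_32F, kern)
            feats.append(cv2.resize(resp, (16, 16)).ravel())   # downsample each response
    return np.concatenate(feats)

# train_imgs / test_imgs: lists of grayscale face images; train_labels / test_labels: subject ids.
# X_train = np.array([gabor_features(im) for im in train_imgs])
# X_test = np.array([gabor_features(im) for im in test_imgs])
# lda = LinearDiscriminantAnalysis().fit(X_train, train_labels)
# knn = KNeighborsClassifier(n_neighbors=1).fit(lda.transform(X_train), train_labels)
# print("accuracy:", knn.score(lda.transform(X_test), test_labels))
```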
Procedia PDF Downloads 467
5696 Particle Swarm Optimization Algorithm vs. Genetic Algorithm for Image Watermarking Based Discrete Wavelet Transform
Authors: Omaima N. Ahmad AL-Allaf
Abstract:
Over communication networks, images can easily be copied and distributed illegally, so copyright protection for authors and owners is necessary. Digital watermarking techniques therefore play an important role as a valid solution to such ownership problems. Digital image watermarking techniques are used to hide watermarks in images to achieve copyright protection and prevent illegal copying. Watermarks need to be robust to attacks and to maintain data quality. This paper discusses two approaches to image watermarking: the first is based on Particle Swarm Optimization (PSO) and the second on a Genetic Algorithm (GA). The discrete wavelet transform (DWT) is used with each approach separately to transform the cover image for the embedding process. Both PSO and GA use the correlation coefficient to detect the high-energy coefficients of the original image in which to hide the watermark bits. Many experiments were conducted for the two approaches with different values of the PSO and GA parameters. In the experiments, the PSO approach obtained better results, with a PSNR of 53 and an MSE of 0.0039, whereas the GA approach obtained a PSNR of 50.5 and an MSE of 0.0048 when using a population size of 100, 150 iterations, and 3×3 blocks. According to the results, a small block size can affect the quality of PSO/GA-based image watermarking because a small block size increases the search area of the watermarked image. Better PSO results were obtained when using a swarm size of 100.
Keywords: image watermarking, genetic algorithm, particle swarm optimization, discrete wavelet transform
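The sketch below illustrates the two building blocks that both optimizers act on: embedding watermark bits into DWT coefficients and measuring the PSNR of the marked image. It is not the paper's PSO/GA optimizer; the band choice, embedding strength, and fixed embedding rule are assumptions, whereas the paper's optimizers search for the best coefficients and blocks.

```python
# Illustrative DWT watermark embedding and PSNR measurement (no PSO/GA search here).
import numpy as np
import pywt

def embed_watermark(cover, wm_bits, alpha=8.0):
    """Additively embed binary bits into the horizontal-detail DWT coefficients."""
    cA, (cH, cV, cD) = pywt.dwt2(cover.astype(float), "haar")
    flat = cH.ravel().copy()
    flat[: wm_bits.size] += alpha * (2 * wm_bits - 1)   # +alpha for bit 1, -alpha for bit 0
    cH_marked = flat.reshape(cH.shape)
    return pywt.idwt2((cA, (cH_marked, cV, cD)), "haar")

def psnr(original, marked):
    mse = np.mean((original.astype(float) - marked.astype(float)) ** 2)
    return 10 * np.log10(255.0 ** 2 / mse)

cover = np.random.randint(0, 256, (256, 256))   # stand-in for an 8-bit cover image
bits = np.random.randint(0, 2, 64)               # hypothetical 64-bit watermark
marked = embed_watermark(cover, bits)
print("PSNR (dB):", psnr(cover, marked))
```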
Procedia PDF Downloads 228
5695 Pre-Implementation of Total Body Irradiation Using Volumetric Modulated Arc Therapy: Full Body Anthropomorphic Phantom Development
Authors: Susana Gonçalves, Joana Lencart, Anabela Gregório Dias
Abstract:
Introduction: In combination with chemotherapy, Total Body Irradiation (TBI) is most often used as part of the conditioning regimen prior to allogeneic hematopoietic stem cell transplantation. Conventional TBI techniques have long application times and non-conformal beam application, with an inability to individually spare organs at risk. Our institution's intention is to start using Volumetric Modulated Arc Therapy (VMAT) techniques to increase the homogeneity of the delivered radiation. As a first approach, a dosimetric plan was performed on a computed tomography (CT) scan of a Rando Alderson anthropomorphic phantom (head and torso), using a set of six arcs distributed along the phantom. However, a full-body anthropomorphic phantom is essential to carry out technique validation and implementation. Our aim is to define the physical and chemical characteristics and the ideal manufacturing procedure of upper and lower limbs for our anthropomorphic phantom, in order to later validate TBI using VMAT. Materials and Methods: To study the best fit between our phantom and the limbs, a CT scan of the Rando Alderson anthropomorphic phantom was acquired. The CT was performed on GE Healthcare equipment (model Optima CT580 W), with a slice thickness of 2.5 mm. This CT was also used to assess the electron density of soft tissue and bone through Hounsfield unit (HU) analysis. Results: The CT images were analyzed and measurements were made for the ideal upper and lower limbs. The upper limbs should be built with the following dimensions: 43 cm length and 7 cm diameter (next to the shoulder section). The lower limbs should be built with the following dimensions: 79 cm length and 16.5 cm diameter (next to the thigh section). As expected, soft tissue and bone have very different electron densities, so it is important to choose and analyze different materials to better represent soft tissue and bone characteristics. The approximate HU values for soft tissue and for bone should be 35 HU and 250 HU, respectively. Conclusion: At the moment, several compounds are being developed based on different types of resins and additives in order to control and mimic the various constituent densities of the tissues. Concurrently, several manufacturing techniques are being explored to make it possible to produce the upper and lower limbs in a simple and inexpensive way, in order to finally carry out a systematic and appropriate study of total body irradiation. This preliminary study was a good starting point to demonstrate the feasibility of TBI with VMAT.
Keywords: TBI, VMAT, anthropomorphic phantom, tissue equivalent materials
Procedia PDF Downloads 80
5694 The Application of Lesson Study Model in Writing Review Text in Junior High School
Authors: Sulastriningsih Djumingin
Abstract:
This study has three objectives. First, it aims at describing the ability of the second-grade students at SMPN 18 Makassar to write review text without applying the Lesson Study model. Second, it seeks to describe the ability of the second-grade students to write review text when the Lesson Study model is applied. Third, it aims at testing the effectiveness of the Lesson Study model for writing review text at SMPN 18 Makassar. This research used a true experimental, posttest-only group design involving two groups: one control class and one experimental class. The research population consisted of all 250 second-grade students at SMPN 18 Makassar, distributed across 8 classes. The sampling technique was purposive sampling. The control class was VIII2, consisting of 30 students, while the experimental class was VIII8, also consisting of 30 students. The research instruments were observation and tests. The collected data were analyzed using descriptive and inferential statistical techniques, with t-tests processed using SPSS 21 for Windows. The results show that: (1) of the 30 students in the control class, only 14 students (47%) obtained a score above 7.5, which is categorized as inadequate; (2) in the experimental class, 26 students (87%) obtained a score of 7.5, which is categorized as adequate; (3) the Lesson Study model is effective when applied to writing review text. The comparison of the abilities of the control class and the experimental class indicates that the calculated t-value is greater than the critical t-value (2.411 > 1.667). This means that the alternative hypothesis (H1) proposed by the researcher is accepted.
Keywords: application, lesson study, review text, writing
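For readers who want to reproduce this kind of comparison, the sketch below runs an independent-samples t-test of the sort reported above. The study used SPSS; scipy is shown here only as an illustration, and the score lists are placeholders rather than the study's data.

```python
# Illustrative two-sample t-test comparing experimental and control writing scores.
from scipy import stats

control_scores = [6.0, 7.0, 6.5, 7.5, 5.5]         # hypothetical review-text scores
experimental_scores = [8.0, 7.5, 8.5, 9.0, 7.5]    # hypothetical review-text scores

t_stat, p_value = stats.ttest_ind(experimental_scores, control_scores)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")   # compare t with the critical value (e.g. 1.667)
```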
Procedia PDF Downloads 203
5693 Empowering Transformers for Evidence-Based Medicine
Authors: Jinan Fiaidhi, Hashmath Shaik
Abstract:
Breaking the barrier to practicing evidence-based medicine relies on effective methods for rapidly identifying relevant evidence from the body of biomedical literature. An important challenge confronting medical practitioners is the long time needed to browse, filter, summarize, and compile information from different medical resources. Deep learning can help to solve this through automatic question answering (Q&A) and transformers. However, Q&A and transformer technologies are not trained to answer clinical queries that can be used for evidence-based practice, nor can they respond to structured clinical questioning protocols like PICO (Patient/Problem, Intervention, Comparison and Outcome). This article describes the use of deep learning techniques for Q&A, based on transformer models like BERT and GPT, to answer PICO clinical questions that can be used for evidence-based practice, with evidence extracted from sound medical research resources like PubMed. We report acceptable clinical answers that are supported by findings from PubMed. Our transformer methods reach an acceptable state-of-the-art performance based on a two-stage bootstrapping process involving filtering relevant articles followed by identifying articles that support the requested outcome expressed by the PICO question. Moreover, we also report experiments to empower our bootstrapping techniques with patched attention to the most important keywords in the clinical case and the PICO questions. Our bootstrapping patched with attention shows the relevance of the evidence collected, based on entropy metrics.
Keywords: automatic question answering, PICO questions, evidence-based medicine, generative models, LLM transformers
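A hedged sketch of extractive Q&A over a retrieved abstract, using the Hugging Face transformers pipeline, is given below. The model name and the example PICO-style question and context are illustrative assumptions and do not represent the article's actual system or its PubMed retrieval stage.

```python
# Illustration: answer a PICO-style question against one retrieved abstract snippet.
from transformers import pipeline

qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")

context = ("In a randomized trial of adults with type 2 diabetes, drug A reduced "
           "HbA1c by 1.2% compared with 0.4% for placebo over 24 weeks.")
question = "In adults with type 2 diabetes, what effect did drug A have on HbA1c?"

answer = qa(question=question, context=context)
print(answer["answer"], answer["score"])
```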
Procedia PDF Downloads 47
5692 Overview of Pre-Analytical Lab Errors in a Tertiary Care Hospital at Rawalpindi, Pakistan
Authors: S. Saeed, T. Butt, M. Rehan, S. Khaliq
Abstract:
Objective: To determine the frequency of pre-analytical errors in samples taken from patients for various lab tests at Fauji Foundation Hospital, Rawalpindi. Material and Methods: All lab specimens for diagnostic purposes received at the lab from indoor and outdoor patients of Fauji Foundation Hospital, Rawalpindi, were included. The total number of samples received in the lab is recorded in the computerized program made for the hospital. All errors observed in the pre-analytical process, including patient identification, sampling techniques, test collection procedures, specimen transport/processing, and storage, were recorded in the log book kept for the purpose. Results: A total of 476,616 specimens were received in the lab during the period of study, including 237,931 from outdoor and 238,685 from indoor patients. Forty-one percent of the samples (n = 197,976) revealed pre-analytical discrepancies. The discrepancies included hemolyzed samples (34.8%), clotted blood (27.8%), incorrect samples (17.4%), unlabeled samples (8.9%), insufficient specimens (3.9%), request forms without an authorized signature (2.9%), empty containers (3.9%), and tube breakage during centrifugation (0.8%). Most of these pre-analytical discrepancies were observed in samples received from the wards, revealing inappropriate sample collection by the medical staff of the wards, as most of the outdoor samples are collected by the lab staff, who are properly trained in sample collection. Conclusion: It is mandatory to educate phlebotomists and paramedical staff, particularly those performing duties in the wards, regarding the timing and techniques of sampling, the appropriate containers to use, and early delivery of the samples to the lab in order to reduce pre-analytical errors.
Keywords: pre analytical lab errors, tertiary care hospital, hemolyzed, paramedical staff
Procedia PDF Downloads 204
5691 Preparation of Conductive Composite Fiber by the Reduction of Silver Particles onto Hydrolyzed Polyacrylonitrile Fiber
Authors: Z. Okay, M. Kalkan Erdoğan, M. Şahin, M. Saçak
Abstract:
Polyacrylonitrile (PAN) is one of the most common and cheapest fiber-forming polymers because of its high strength and high abrasion resistance. Alkaline hydrolysis of PAN fiber can form products with conjugated sequences of –C=N–, acrylamide, sodium acrylate, and amidine groups. In this study, PAN fiber was hydrolyzed in a solution of sodium hydroxide, and this hydrolyzed PAN (HPAN) fiber was used to prepare a conductive composite fiber with silver particles. Electrically conductive PAN fiber has the potential to be used in a variety of materials such as antistatic materials, life jackets, and static-charge-reducing products. We monitored the change in the weight loss values of the PAN fiber with hydrolysis time. It was observed that a 60% weight loss was obtained after 7 h of hydrolysis under the investigated conditions, but the fiber lost its fibrous structure. A hydrolysis time of 5 h was found to be suitable in terms of preserving the fibrous structure. The change in the conductivity values of the composite with the preparation conditions, such as hydrolysis time and silver ion concentration, was studied. PAN fibers with different degrees of hydrolysis were treated with aqueous solutions containing different concentrations of silver ions under continuous stirring at 20 °C for 30 min, and a composite having a maximum conductivity of 2 S/cm could be prepared. The antibacterial property of the silver-containing conductive HPAN fibers was also investigated. While the hydrolysis of the PAN fiber was characterized by FTIR and SEM techniques, the silver reduction process of the HPAN fiber was investigated by SEM and TGA-DTA techniques. The SEM micrographs showed that the surface of the HPAN fiber was rougher and much more corroded than that of the PAN fiber.
Keywords: composite, conducting polymer, fiber, polyacrylonitrile
Procedia PDF Downloads 479
5690 Modern Information Security Management and Digital Technologies: A Comprehensive Approach to Data Protection
Authors: Mahshid Arabi
Abstract:
With the rapid expansion of digital technologies and the internet, information security has become a critical priority for organizations and individuals. The widespread use of digital tools such as smartphones and internet networks facilitates the storage of vast amounts of data, but at the same time, vulnerabilities and security threats have increased significantly. The aim of this study is to examine and analyze modern methods of information security management and to develop a comprehensive model to counteract threats and information misuse. This study employs a mixed-methods approach, including both qualitative and quantitative analyses. Initially, a systematic review of previous articles and research in the field of information security was conducted. Then, using the Delphi method, interviews with 30 information security experts were conducted to gather their insights on security challenges and solutions. Based on the results of these interviews, a comprehensive model for information security management was developed. The proposed model includes advanced encryption techniques, machine learning-based intrusion detection systems, and network security protocols. AES and RSA encryption algorithms were used for data protection, and machine learning models such as Random Forest and neural networks were utilized for intrusion detection. Statistical analyses were performed using SPSS software. To evaluate the effectiveness of the proposed model, t-test and ANOVA statistical tests were employed, and results were measured using the accuracy, sensitivity, and specificity of the models. Additionally, multiple regression analysis was conducted to examine the impact of various variables on information security. The findings of this study indicate that the proposed comprehensive model reduced cyber-attacks by an average of 85%. Statistical analysis showed that the combined use of encryption techniques and intrusion detection systems significantly improves information security. Based on the results obtained, it is recommended that organizations continuously update their information security systems and use a combination of multiple security methods to protect their data. Additionally, educating employees and raising public awareness about information security can serve as an effective tool for reducing security risks. This research demonstrates that effective and up-to-date information security management requires a comprehensive and coordinated approach, including the development and implementation of advanced techniques and the continuous training of human resources.
Keywords: data protection, digital technologies, information security, modern management
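A brief sketch of two of the building blocks named in the model, symmetric encryption for data protection and a Random Forest intrusion detector, is shown below. The feature columns and data are hypothetical, AES-GCM is chosen here as one concrete AES mode, and the code is an illustration rather than the study's implementation.

```python
# Illustration: AES-GCM encryption of a record + Random Forest intrusion detection.
import os
import numpy as np
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from sklearn.ensemble import RandomForestClassifier

# AES-GCM encryption of a sensitive record (authenticated encryption).
key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"customer record", None)
print(len(ciphertext), "bytes of ciphertext")

# Random Forest intrusion detection on hypothetical network-flow features.
X = np.random.rand(500, 6)                 # e.g., duration, bytes, packets, flags...
y = np.random.randint(0, 2, 500)           # 0 = normal traffic, 1 = attack
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[:400], y[:400])
print("held-out accuracy:", clf.score(X[400:], y[400:]))
```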
Procedia PDF Downloads 33
5689 Homogenization of a Non-Linear Problem with a Thermal Barrier
Authors: Hassan Samadi, Mustapha El Jarroudi
Abstract:
In this work, we consider the homogenization of a non-linear problem in a periodic medium composed of two periodic connected media exchanging a heat flux across their common interface. The interfacial exchange coefficient λ is assumed to tend to zero or to infinity at a rate λ = λ(ε) as the size ε of the basic cell tends to zero. Three homogenized problems are determined according to a critical value depending on λ and ε. Our method is based on Γ-convergence techniques.
Keywords: variational methods, epiconvergence, homogenization, convergence technique
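For orientation, a generic form of this kind of ε-dependent thermal-barrier transmission problem is sketched below. This is a standard illustrative formulation assumed here, not necessarily the authors' exact setting; the three homogenized regimes then arise from the limiting behaviour of λ(ε) relative to ε.

```latex
% Generic two-component transmission problem with interfacial exchange coefficient
% lambda(eps); illustrative form only, not the authors' exact model.
\begin{aligned}
  -\operatorname{div}\, a\!\left(\tfrac{x}{\varepsilon}, \nabla u^{i}_{\varepsilon}\right) &= f
      && \text{in } \Omega^{i}_{\varepsilon},\quad i = 1,2, \\
  a\!\left(\tfrac{x}{\varepsilon}, \nabla u^{1}_{\varepsilon}\right)\cdot n
      &= a\!\left(\tfrac{x}{\varepsilon}, \nabla u^{2}_{\varepsilon}\right)\cdot n
      && \text{on } \Gamma_{\varepsilon} \quad \text{(continuity of the flux)}, \\
  a\!\left(\tfrac{x}{\varepsilon}, \nabla u^{1}_{\varepsilon}\right)\cdot n
      &= \lambda(\varepsilon)\,\bigl(u^{2}_{\varepsilon} - u^{1}_{\varepsilon}\bigr)
      && \text{on } \Gamma_{\varepsilon} \quad \text{(interfacial exchange)}.
\end{aligned}
```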
Procedia PDF Downloads 525
5688 Contactless Electromagnetic Detection of Stress Fluctuations in Steel Elements
Authors: M. A. García, J. Vinolas, A. Hernando
Abstract:
Steel is nowadays one of the most important structural materials because of its outstanding mechanical properties. Therefore, in the search for a sustainable economic model and to optimize the use of extensive resources, new methods to monitor and prevent the failure of steel-based facilities are required. Classical mechanical tests, such as building testing, are invasive and destructive. Moreover, for facilities where the steel element is embedded (as in reinforced concrete), these techniques are simply not applicable. Hence, non-invasive monitoring techniques that help prevent failure without altering the structural properties of the elements are required. Among them, electromagnetic methods are particularly suitable for non-invasive inspection of the mechanical state of steel-based elements. Magnetoelastic coupling effects induce a modification of the electromagnetic properties of an element upon applied stress. Since most steels are ferromagnetic because of their large Fe content, it is possible to inspect their structure and state in a non-invasive way. We present here a distinct electromagnetic method for contactless evaluation of internal stress in steel-based elements. In particular, this method relies on measuring the magnetic induction between two coils with the steel specimen in between them. We found that the stress-induced alteration of the electromagnetic properties of the steel specimen produced changes in the induction that allowed us to detect stress well below half of the elastic limit of the material. Hence, it represents an outstanding non-invasive method to prevent failure in steel-based facilities. We describe the theoretical model, present experimental results to validate it, and finally show a practical application for the detection of stress and inhomogeneities in train railways.
Keywords: magnetoelastic, magnetic induction, mechanical stress, steel
Procedia PDF Downloads 51
5687 Design, Synthesis and Evaluation of 4-(Phenylsulfonamido)Benzamide Derivatives as Selective Butyrylcholinesterase Inhibitors
Authors: Sushil Kumar Singh, Ashok Kumar, Ankit Ganeshpurkar, Ravi Singh, Devendra Kumar
Abstract:
In the spectrum of neurodegenerative diseases, Alzheimer's disease (AD) is characterized by the presence of amyloid β plaques and neurofibrillary tangles in the brain. It results in cognitive and memory impairment due to the loss of cholinergic neurons, which is considered to be one of the contributing factors. Donepezil, an acetylcholinesterase (AChE) inhibitor that also inhibits butyrylcholinesterase (BuChE) and improves memory and the brain's cognitive functions, is the most successful and most prescribed drug for treating the symptoms of AD. The present work is based on the design of selective BuChE inhibitors using computational techniques. In this work, machine learning models were trained using classification algorithms, followed by screening of a diverse chemical library of compounds. Various molecular modelling and simulation techniques were used to obtain the virtual hits. The amide derivatives of 4-(phenylsulfonamido)benzoic acid were synthesized and characterized using 1H and 13C NMR, FTIR, and mass spectrometry. The enzyme inhibition assays were performed on equine plasma BuChE and electric eel AChE by the method developed by Ellman et al. Compounds 31, 34, 37, 42, 49, 52, and 54 were found to be active against equine BuChE. N-(2-chlorophenyl)-4-(phenylsulfonamido)benzamide and N-(2-bromophenyl)-4-(phenylsulfonamido)benzamide (compounds 34 and 37) displayed IC50 values of 61.32 ± 7.21 and 42.64 ± 2.17 nM, respectively, against equine plasma BuChE. Ortho-substituted derivatives were more active against BuChE. Further, the ortho-halogen and ortho-alkyl substituted derivatives were found to be the most active of all, with minimal AChE inhibition. The compounds were selective toward BuChE.
Keywords: Alzheimer disease, butyrylcholinesterase, machine learning, sulfonamides
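A hedged sketch of the virtual-screening step described above is given below: a classifier trained on molecular descriptors of labelled compounds is used to rank a chemical library by predicted probability of BuChE activity. The descriptors, the Random Forest choice, and the data are placeholders; the study's actual descriptors, algorithms, and cut-offs may differ.

```python
# Illustration: classification-based virtual screening for BuChE-active candidates.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X_train = rng.random((200, 10))           # descriptor matrix for labelled compounds
y_train = rng.integers(0, 2, 200)         # 1 = BuChE-active, 0 = inactive
library = rng.random((1000, 10))          # descriptors for the screening library

clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_train, y_train)
scores = clf.predict_proba(library)[:, 1]
top_hits = np.argsort(scores)[::-1][:20]  # top 20 virtual hits for synthesis and assay
print(top_hits)
```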
Procedia PDF Downloads 140
5686 Appropriation of Cryptocurrencies as a Payment Method by South African Retailers
Authors: Neliswa Dyosi
Abstract:
Purpose - Using an integrated Technology-Organization-Environment (TOE) framework and the Model of Technology Appropriation (MTA) as a theoretical lens, this interpretive qualitative study seeks to understand and explain the factors that influence the appropriation, non-appropriation, and disappropriation of bitcoin as a payment method by South African retailers. Design/methodology/approach - The study adopts the interpretivist philosophical paradigm. Multiple case studies will be adopted as the research strategy. For data collection, the study follows a qualitative approach. Qualitative data will be collected from six retailers in various industries. Semi-structured interviews and documents will be used as the data collection techniques. Purposive and snowball sampling techniques will be used to identify participants within the organizations. Data will be analyzed using thematic analysis. Originality/value - Using a deductive approach, the study seeks to provide a descriptive and explanatory contribution to theory. The study contributes to theory development by integrating the MTA and TOE frameworks as a means to understand the technology adoption behaviors of organizations, in this case, retailers. This is also the first study that looks at an integrated approach combining the Technology-Organization-Environment (TOE) framework and the MTA framework to understand the adoption and use of a payment method. South Africa is ranked among the top ten countries in the world for cryptocurrency adoption. There is, however, still a dearth of literature on the current state of adoption and usage of bitcoin as a payment method in South Africa. The study will contribute to the existing literature, as bitcoin is gaining popularity as an alternative payment method across the globe.
Keywords: cryptocurrency, bitcoin, payment methods, blockchain, appropriation, online retailers, TOE framework, disappropriation, non-appropriation
Procedia PDF Downloads 137
5685 Land Cover, Land Surface Temperature, and Urban Heat Island Effects in Tropical Sub Saharan City of Accra
Authors: Eric Mensah
Abstract:
The effects of the rapid urbanisation of tropical sub-Saharan developing cities on local and global climate are of great concern due to the negative impacts of Urban Heat Island (UHI) effects. The importance of urban parks, vegetative cover, and forest reserves in these tropical cities has been undervalued, with rapid degradation and loss of vegetative cover to urban development continuing to cause increases in daily mean temperatures and changes to local climatic conditions. Using Landsat data from the same months and at the same period intervals, the spatial variations of land cover change, temperature, and vegetation were examined to determine how vegetation moderates local temperature and what effects urbanisation has had on daily mean temperatures over the past 12 years. The remote sensing techniques of maximum likelihood supervised classification, land surface temperature retrieval, and the normalised difference vegetation index (NDVI) were used to analyse and create the land use/land cover (LULC), land surface temperature (LST), and vegetation and non-vegetation cover maps, respectively. Results from the study showed an increase in daily mean temperature of 0.80 °C as a result of a rapid increase in urban area of 46.13 sq. km and a loss of vegetative cover of 46.24 sq. km between 2005 and 2017. The LST map also shows the existence of UHIs within the urban areas of Accra and the potential mitigating effects offered by forest and vegetative cover, as demonstrated by the cool islands around the Achimota ecological forest and the University of Ghana botanical gardens.
Keywords: land surface temperature, climate, remote sensing, urbanisation
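A minimal sketch of the NDVI step used to separate vegetation from non-vegetation in Landsat imagery is shown below; NDVI = (NIR - Red) / (NIR + Red). The band arrays and the vegetation threshold are placeholders for calibrated reflectance rasters and a scene-specific choice.

```python
# Illustration: NDVI computation and a simple vegetation mask from two Landsat bands.
import numpy as np

red = np.random.rand(100, 100).astype(np.float32)   # stand-in for the red band (reflectance)
nir = np.random.rand(100, 100).astype(np.float32)   # stand-in for the near-infrared band

ndvi = (nir - red) / (nir + red + 1e-10)             # small term avoids division by zero
vegetation_mask = ndvi > 0.2                          # threshold is an assumption
print("vegetated fraction:", vegetation_mask.mean())
```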
Procedia PDF Downloads 321
5684 Predicting the Human Impact of Natural Onset Disasters Using Pattern Recognition Techniques and Rule Based Clustering
Authors: Sara Hasani
Abstract:
This research focuses on natural sudden onset disasters, characterised as 'occurring with little or no warning and often cause excessive injuries far surpassing the national response capacities'. Based on a panel analysis of the historic record of 4,252 natural onset disasters between 1980 and 2015, a predictive method was developed to predict the human impact of a disaster (fatalities, injured, homeless) with less than 3% error. The geographical dispersion of the disasters includes every country where the data were available, cross-examined from various humanitarian sources. The records were filtered to 4,252 disasters in which the five predictive variables (disaster type, HDI, DRI, population, and population density) were clearly stated. The procedure was designed based on a combination of pattern recognition techniques and rule-based clustering for prediction, with discrimination analysis used to validate the results further. The results indicate that there is a relationship between the human impact of a disaster and the five socio-economic characteristics of the affected country mentioned above. As a result, a framework was put forward that could predict a disaster's human impact based on its severity rank in the early hours after a disaster strikes. The predictions in this model were outlined in worst-case and best-case scenarios, which respectively inform the lower and higher ranges of the prediction. The necessity of developing such a predictive framework is highlighted by the fact that, despite the existing research in the literature, a framework for predicting the human impact and estimating needs at the time of a disaster is yet to be developed. This can further be used to allocate resources in the response phase of a disaster, when data are scarce.
Keywords: disaster management, natural disaster, pattern recognition, prediction
Procedia PDF Downloads 154
5683 Competing Risks Modeling Using within Node Homogeneity Classification Tree
Authors: Kazeem Adesina Dauda, Waheed Babatunde Yahya
Abstract:
To design a tree that maximizes within-node homogeneity, there is a need for a homogeneity measure that is appropriate for event history data with multiple risks. We consider the use of deviance and modified Cox-Snell residuals as measures of impurity in Classification and Regression Trees (CART) and compare our results with those of Fiona (2008), in which the homogeneity measures were based on the martingale residual. A data structure approach was used to validate the performance of our proposed techniques via simulation and real-life data. The results for univariate competing risks revealed that using deviance and Cox-Snell residuals as the response in a within-node homogeneity classification tree performs better than using other residuals, irrespective of the performance technique. Bone marrow transplant data and a double-blinded randomized clinical trial, conducted in order to compare two treatments for patients with prostate cancer, were used to demonstrate the efficiency of our proposed method vis-à-vis the existing ones. Results from empirical studies of the bone marrow transplant data showed that the proposed model with the Cox-Snell residual (deviance = 16.6498) performs better than both the martingale residual (deviance = 160.3592) and the deviance residual (deviance = 556.8822) for both the event of interest and the competing risks. Additionally, the results for prostate cancer also reveal the better performance of the proposed model over the existing one for both causes; interestingly, the Cox-Snell residual (MSE = 0.01783563) outperforms both the martingale residual (MSE = 0.1853148) and the deviance residual (MSE = 0.8043366). Moreover, these results validate those obtained from the Monte Carlo studies.
Keywords: within-node homogeneity, Martingale residual, modified Cox-Snell residual, classification and regression tree
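A sketch of the residual-as-response idea is given below under several assumptions: a Cox model is fitted with lifelines, Cox-Snell residuals are derived from the martingale residuals (cox_snell = event indicator minus martingale), and a regression tree is grown on them so that within-node homogeneity is measured on the residual scale. This is an assumed workflow for illustration, not the authors' code, and the covariates and data are invented.

```python
# Illustration: Cox-Snell residuals from a Cox fit, used as the response of a regression tree.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "time": rng.exponential(10, 300),
    "event": rng.integers(0, 2, 300),     # 1 = event of interest observed
    "age": rng.normal(50, 10, 300),
    "treatment": rng.integers(0, 2, 300),
})

cph = CoxPHFitter().fit(df, duration_col="time", event_col="event")
res = cph.compute_residuals(df, kind="martingale")
# Column name 'martingale' is assumed; fall back to the last column otherwise.
martingale = res["martingale"] if "martingale" in res.columns else res.iloc[:, -1]
cox_snell = (df["event"] - martingale).loc[df.index]   # align by index, keep row order

tree = DecisionTreeRegressor(max_depth=3).fit(df[["age", "treatment"]], cox_snell)
print(tree.tree_.node_count, "nodes in the residual-based tree")
```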
Procedia PDF Downloads 273
5682 Erector Spinae Plane Block versus Paravertebral Block in Breast Surgery
Authors: Widad Kouachi, Nacera Benmouhoub
Abstract:
Background: Erector spinae plane block (ESP) and thoracic paravertebral block (PVB) are two widely used regional anesthesia techniques in breast cancer surgery. Both techniques aim to improve postoperative pain management and reduce opioid consumption. However, comparative data on their efficacy in oncologic breast surgery remain limited. Objectives: This study aims to compare the efficacy of ESP and PVB in postoperative pain control, patient satisfaction, and opioid consumption in breast cancer surgery. Methods: A randomized, double-blind trial was conducted involving 100 patients undergoing oncologic breast surgery. Patients were randomly assigned to two groups: 50 received ESP, and 50 received PVB. Postoperative pain scores (at rest and during movement), opioid consumption, patient satisfaction, and hospital length of stay were recorded and analyzed. Results: Both ESP and PVB provided effective postoperative analgesia. No significant difference in pain scores was observed between the two groups within the first 24 hours. However, ESP showed a notable advantage in managing chronic postoperative pain at the 6-month follow-up. Opioid consumption was lower in both groups compared with patients without a block. No significant differences in complication rates or hospital stay were noted between the groups. Conclusion: ESP and PVB offer comparable efficacy for immediate postoperative pain control in breast cancer surgery. Nevertheless, ESP may have a superior role in managing long-term pain. Further research is needed to explore the mechanisms behind the observed differences in chronic pain outcomes.
Keywords: pain assessment, breast surgery, PVB block, ESP block
Procedia PDF Downloads 32