Search results for: specific emitter identification
7205 Systematic Mapping Study of Digitization and Analysis of Manufacturing Data
Authors: R. Clancy, M. Ahern, D. O’Sullivan, K. Bruton
Abstract:
The manufacturing industry is currently undergoing a digital transformation as part of the mega-trend Industry 4.0. As part of this phase of the industrial revolution, traditional manufacturing processes are being combined with digital technologies to achieve smarter and more efficient production. To successfully digitally transform a manufacturing facility, the processes must first be digitized. This is the conversion of information from an analogue format to a digital format. The objective of this study was to explore the research area of digitizing manufacturing data as part of the worldwide paradigm, Industry 4.0. The formal methodology of a systematic mapping study was utilized to capture a representative sample of the research area and assess its current state. Specific research questions were defined to assess the key benefits and limitations associated with the digitization of manufacturing data. Research papers were classified according to the type of research and type of contribution to the research area. Upon analyzing 54 papers identified in this area, it was noted that 23 of the papers originated in Germany. This is an unsurprising finding as Industry 4.0 is originally a German strategy with supporting strong policy instruments being utilized in Germany to support its implementation. It was also found that the Fraunhofer Institute for Mechatronic Systems Design, in collaboration with the University of Paderborn in Germany, was the most frequent contributing Institution of the research papers with three papers published. The literature suggested future research directions and highlighted one specific gap in the area. There exists an unresolved gap between the data science experts and the manufacturing process experts in the industry. The data analytics expertise is not useful unless the manufacturing process information is utilized. A legitimate understanding of the data is crucial to perform accurate analytics and gain true, valuable insights into the manufacturing process. There lies a gap between the manufacturing operations and the information technology/data analytics departments within enterprises, which was borne out by the results of many of the case studies reviewed as part of this work. To test the concept of this gap existing, the researcher initiated an industrial case study in which they embedded themselves between the subject matter expert of the manufacturing process and the data scientist. Of the papers resulting from the systematic mapping study, 12 of the papers contributed a framework, another 12 of the papers were based on a case study, and 11 of the papers focused on theory. However, there were only three papers that contributed a methodology. This provides further evidence for the need for an industry-focused methodology for digitizing and analyzing manufacturing data, which will be developed in future research.Keywords: analytics, digitization, industry 4.0, manufacturing
Procedia PDF Downloads 116
7204 Simultaneous Targeting of MYD88 and Nur77 as an Effective Approach for the Treatment of Inflammatory Diseases
Authors: Uzma Saqib, Mirza S. Baig
Abstract:
Myeloid differentiation primary response protein 88 (MYD88) has long been considered a central player in the inflammatory pathway. Recent studies clearly suggest that it is an important therapeutic target in inflammation. On the other hand, a recent study on the interaction between the orphan nuclear receptor (Nur77) and p38α, leading to increased lipopolysaccharide-induced hyperinflammatory response, suggests this binary complex as a therapeutic target. In this study, we have designed inhibitors that can inhibit both MYD88 and Nur77 at the same time. Since both MYD88 and Nur77 are an integral part of the pathways involving lipopolysaccharide-induced activation of NF-κB-mediated inflammation, we tried to target both proteins with the same library in order to retrieve compounds having dual inhibitory properties. To perform this, we developed a homodimeric model of MYD88 and, along with the crystal structure of Nur77, screened a virtual library of compounds from the traditional Chinese medicine database containing ~61,000 compounds. We analyzed the resulting hits for their efficacy for dual binding and probed them for developing a common pharmacophore model that could be used as a prototype to screen compound libraries as well as to guide combinatorial library design to search for ideal dual-target inhibitors. Thus, our study explores the identification of novel leads having dual inhibiting effects due to binding to both MYD88 and Nur77 targets.Keywords: drug design, Nur77, MYD88, inflammation
Procedia PDF Downloads 308
7203 Mining the Proteome of Fusobacterium nucleatum for Potential Therapeutics Discovery
Authors: Abdul Musaweer Habib, Habibul Hasan Mazumder, Saiful Islam, Sohel Sikder, Omar Faruk Sikder
Abstract:
The wealth of bacterial genome sequence information available in recent years has ushered in many novel strategies for antibacterial drug discovery and has helped medical science take up the challenge of the increasing resistance of pathogenic bacteria to current antibiotics. In this study, we adopted a subtractive genomics approach to analyze the whole genome sequence of Fusobacterium nucleatum, a human oral pathogen associated with colorectal cancer. Our study identified 1499 proteins of Fusobacterium nucleatum that have no homolog in the human genome. These proteins were further screened against the Database of Essential Genes (DEG), which resulted in the identification of 32 proteins that are vitally important for the bacterium. Subsequent analysis of these pivotal proteins using the KEGG Automated Annotation Server (KAAS) singled out 3 key enzymes of F. nucleatum that may be good candidates as potential drug targets, since they are unique to the bacterium and absent in humans. In addition, we have modeled the 3-D structures of these three proteins. Finally, the ligand binding sites of the key proteins were determined and functional inhibitors that best fitted these sites were screened for, in order to discover effective novel therapeutic compounds against Fusobacterium nucleatum.
Keywords: colorectal cancer, drug target, Fusobacterium nucleatum, homology modeling, ligands
Procedia PDF Downloads 393
7202 VIAN-DH: Computational Multimodal Conversation Analysis Software and Infrastructure
Authors: Teodora Vukovic, Christoph Hottiger, Noah Bubenhofer
Abstract:
The development of VIAN-DH aims at bridging two linguistic approaches: conversation analysis/interactional linguistics (IL), so far a predominantly qualitative field, and computational/corpus linguistics with its quantitative and automated methods. Contemporary IL investigates the systematic organization of conversations and interactions composed of speech, gaze, gestures, and body positioning, among others. This highly integrated multimodal behaviour is analysed based on video data, with the aim of uncovering so-called “multimodal gestalts”, patterns of linguistic and embodied conduct that recur in specific sequential positions and are employed for specific purposes. Multimodal analyses (and other disciplines using videos) have so far depended on time- and resource-intensive manual transcription of each component from the video materials. Automating these tasks requires advanced programming skills, which are often not within the scope of IL. Moreover, the use of different tools makes the integration and analysis of different formats challenging. Consequently, IL research often deals with relatively small samples of annotated data, which are suitable for qualitative analysis but not sufficient for making generalized empirical claims derived quantitatively. VIAN-DH aims to create a workspace where the many annotation layers required for the multimodal analysis of videos can be created, processed, and correlated in one platform. VIAN-DH will provide a graphical interface that operates state-of-the-art tools for automating parts of the data processing. The integration of tools that already exist in computational linguistics and computer vision facilitates data processing for researchers lacking programming skills, speeds up the overall research process, and enables the processing of large amounts of data. The main features to be introduced are automatic speech recognition for the transcription of language, automatic image recognition for the extraction of gestures and other visual cues, and grammatical annotation for adding morphological and syntactic information to the verbal content. In the ongoing instance of VIAN-DH, we focus on gesture extraction (pointing gestures, in particular), making use of existing models created for sign language and adapting them for this specific purpose. In order to view and search the data, VIAN-DH will provide a unified format and enable the import of the main existing formats of annotated video data and export to other formats used in the field, while integrating different data source formats so that they can be combined in research. VIAN-DH will adapt querying methods from corpus linguistics to enable the parallel search of many annotation levels, combining token-level and chronological search across various types of data. VIAN-DH strives to bring crucial and potentially revolutionary innovation to the field of IL (which can also extend to other fields using video materials). It will allow large amounts of data to be processed automatically and quantitative analyses to be implemented alongside the qualitative approach. It will facilitate the investigation of correlations between linguistic patterns (lexical or grammatical) and conversational aspects (turn-taking or gestures). Users will be able to automatically transcribe and annotate visual, spoken and grammatical information from videos, to correlate those different levels, and to perform queries and analyses.
Keywords: multimodal analysis, corpus linguistics, computational linguistics, image recognition, speech recognition
Procedia PDF Downloads 114
7201 Comparative Study of Various Treatment Positioning Technique: A Site Specific Study-CA. Breast
Authors: Kamal Kaushik, Dandpani Epili, Ajay G. V., Ashutosh, S. Pradhaan
Abstract:
Introduction: Radiation therapy has come a long way over the decades, from 2-dimensional radiotherapy to intensity-modulated radiation therapy (IMRT) or VMAT. Advanced radiation therapy needs better patient position reproducibility to deliver precise, high-quality treatment, which raises the need for better image guidance technologies for precise patient positioning. This study presents a two-tattoo simulation with roll correction technique that is comparable to other advanced patient positioning techniques. Objective: This site-specific study aims to compare various treatment positioning techniques used for patients with Ca-Breast undergoing radiotherapy. In this study, we compare 5 different positioning methods used for the treatment of Ca-Breast, namely i) Vacloc with 3 tattoos, ii) breast board with three tattoos, iii) thermoplastic cast with three fiducials, iv) breast board with a thermoplastic mask and 3 tattoos, v) breast board with 2 tattoos (a roll correction method). Methods and material: An All-in-One (AIO) solution was used for immobilization in all patient positioning techniques. The two-tattoo simulation process includes positioning the patient with the help of a thoracic-abdomen wedge, armrest and knee rest. After proper patient positioning, two tattoos are marked on the treatment side of the patient. Fiducials are then placed as per the clinical border markers: (1) sternal notch (lower border of the clavicle head), (2) 2 cm below the contralateral breast, (3) midline between markers 1 and 2, (4) mid-axillary, on the same axis as marker 3 (markers 3 and 4 should be on the same axis). During plan implementation, a roll depth correction is applied as per the anterior and lateral positioning tattoos, followed by the shifts required for the isocentre position. The shifts are then verified by SSD on the patient surface, followed by radiographic verification using Cone Beam Computed Tomography (CBCT). Results: All five positioning techniques were compared with respect to the shifts produced in the vertical, longitudinal and lateral directions. The observations clearly suggest that the longitudinal average shifts with the two-tattoo roll correction technique are smaller than with every other patient positioning technique. Vertical and lateral shifts are also comparable to other modern positioning techniques. Conclusion: The two-tattoo simulation with roll correction technique provides a better patient setup with a technique that can be implemented easily in most radiotherapy centres across developing nations, where 3D verification techniques are not available along with the delivery units, as the observed shifts are quite minimal and comparable to those obtained with Vacloc and modern amenities.
Keywords: Ca. breast, breast board, roll correction technique, CBCT
Procedia PDF Downloads 140
7200 Methodological Analysis and Exploration of Feminist Planning Research in the Field of Urban and Rural Planning
Authors: Xi Zuo
Abstract:
As a part of the urban population that cannot be ignored, women have long been less involved in urban planning due to socio-economic constraints. Urban planning and development have long been influenced by the mainstream "male standard," paying less attention to women's needs for space in the city. However, with the development of the economy and society and the improvement of women's social status, their participation in urban life is gradually increasing, and their needs for the city are diversifying. Therefore, different scholars, planning designers and governmental departments have explored this field to varying degrees and in different directions. This paper summarizes the research on urban planning from women's perspectives, discusses its strengths, weaknesses, and methodology with specific case studies, and then outlines directions for further research on this topic.
Keywords: urban planning, feminism, methodology, gender
Procedia PDF Downloads 87
7199 Extraction and Identification of Natural Antioxidants from Liquorices (Glycyrrhiza glabra) and Carob (Ceratonia siliqua) and Its Application in El-Mewled El-Nabawy Sweets (Sesames and Folia)
Authors: Mervet A. El-sherif, Ginat M El-sherif, Kadry H Tolba
Abstract:
The objective of this study was to determine and identify the natural antioxidants of licorice and carob and to investigate their effects. In addition, the effect of adding them, as powders and as antioxidant extracts, on the stability of refined sunflower oil was evaluated. Total polyphenol contents, expressed as total phenols, total carotenoids and total tannins, were 353.93 mg/100 g (gallic acid), 10.62 mg/100 g (carotenoids) and 83.33 mg/100 g (tannic acid), respectively, in licorice, while in carob they were 186.07, 18.66 and 106.67, respectively. The polyphenol compounds of the studied licorice and carob extracts were determined and identified by HPLC. The stability of refined sunflower oil (determined by peroxide value and Rancimat) increased with increasing levels of added polyphenol extracts. Our study also examined the effect of adding these polyphenol extracts to El-mewled El-nabawy sweets fortified with full-cream milk powder (sesames and folia). We found that licorice and carob, both as powders and as polyphenol extracts, significantly delayed the rancidity of sesame and peanut. This encourages the use of licorice and carob powders and polyphenol extracts as a good source of natural antioxidants in place of synthetic antioxidants.
Keywords: licorice, carob, natural antioxidants, antioxidant activity, applications
Procedia PDF Downloads 439
7198 LLM-Powered User-Centric Knowledge Graphs for Unified Enterprise Intelligence
Authors: Rajeev Kumar, Harishankar Kumar
Abstract:
Fragmented data silos within enterprises impede the extraction of meaningful insights and hinder efficiency in tasks such as product development, client understanding, and meeting preparation. To address this, we propose a system-agnostic framework that leverages large language models (LLMs) to unify diverse data sources into a cohesive, user-centered knowledge graph. By automating entity extraction, relationship inference, and semantic enrichment, the framework maps interactions, behaviors, and data around the user, enabling intelligent querying and reasoning across various data types, including emails, calendars, chats, documents, and logs. Its domain adaptability supports applications in contextual search, task prioritization, expertise identification, and personalized recommendations, all rooted in user-centric insights. Experimental results demonstrate its effectiveness in generating actionable insights, enhancing workflows such as trip planning, meeting preparation, and daily task management. This work advances the integration of knowledge graphs and LLMs, bridging the gap between fragmented data systems and intelligent, unified enterprise solutions focused on user interactions.Keywords: knowledge graph, entity extraction, relation extraction, LLM, activity graph, enterprise intelligence
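As a rough illustration of the graph-construction step described above, the sketch below builds a user-centred multigraph from (subject, relation, object) triples; `llm_extract_triples` is a hypothetical stand-in for the LLM extraction stage, and the node names, relations and example document are invented.

```python
# Sketch of building a user-centric graph from LLM-extracted triples.
# `llm_extract_triples` is a hypothetical stand-in for the LLM extraction step.
import networkx as nx

def llm_extract_triples(document: str):
    # A real framework would prompt an LLM to return (subject, relation, object)
    # triples; here we return a hard-coded example purely for illustration.
    return [
        ("alice@example.com", "sent_email", "bob@example.com"),
        ("alice@example.com", "attends", "Quarterly Planning Meeting"),
        ("Quarterly Planning Meeting", "references", "roadmap.docx"),
    ]

def build_user_graph(documents):
    graph = nx.MultiDiGraph()
    for doc in documents:
        for subject, relation, obj in llm_extract_triples(doc):
            graph.add_edge(subject, obj, relation=relation)
    return graph

graph = build_user_graph(["email body ..."])
# Query everything one hop away from a user node, e.g. for meeting preparation.
for _, target, data in graph.out_edges("alice@example.com", data=True):
    print(data["relation"], "->", target)
```

Querying the one-hop neighbourhood of a user node in this way is a natural building block for tasks such as meeting preparation or contextual search.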
Procedia PDF Downloads 14
7197 Knowledge Management: Why is So Difficult? From “A Good Idea” to Organizational Contribute
Authors: Lisandro Blas, Héctor Tamanini
Abstract:
From the early 1990s until now, few companies or organizations have been able to "really" implement a knowledge management (KM) system that works (not only as judged by a measurement model, but in terms of its continuity). What are the reasons for that? Some of the reasons may lie in how KM is demanded (usefulness, priority, experts, a definition of KM) versus the importance and resources that organizations actually afford it (budget, a person responsible for a specific KM area, intangibility). Many organizations "claim" that knowledge management is important, but these claims are not reflected in their subsequent actions. For other tools or management ideas, organizations put economic and human resources to work; why does this not happen with KM? This paper tries to explain some of these reasons and to address these situations through a survey conducted in 2011 for an IAPG (Argentinean Institute of Oil & Gas) congress.
Keywords: knowledge management into organizations, new perspectives, failure in implementation, claim
Procedia PDF Downloads 425
7196 Digital Image Correlation: Metrological Characterization in Mechanical Analysis
Authors: D. Signore, M. Ferraiuolo, P. Caramuta, O. Petrella, C. Toscano
Abstract:
Digital Image Correlation (DIC) is a relatively recently developed optical technique that is spreading across all engineering sectors because it allows the non-destructive estimation of the deformation of an entire surface without any contact with the component under analysis. These characteristics make DIC very appealing in all cases where the global deformation state must be known without using strain gauges, which are the most widely used measuring devices. DIC is applicable to any material subjected to distortion caused by either thermal or mechanical load, and yields high-definition maps of displacements and deformations. That is why, in the civil and transportation industries, DIC is very useful for studying the behavior of metallic as well as composite materials. DIC is also used in the medical field for the characterization of the local strain field of vascular tissue surfaces subjected to uniaxial tensile loading. DIC can be carried out in two-dimensional mode (2D DIC) if a single camera is used or in three-dimensional mode (3D DIC) if two cameras are involved. Each point of the test surface framed by the cameras can be associated with a specific pixel of the image, and the coordinates of each point are calculated knowing the relative distance between the two cameras together with their orientation. In both arrangements, when a component is subjected to a load, several images corresponding to different deformation states can be acquired through the cameras. Dedicated software analyzes the images via the mutual correlation between the reference image (obtained without any applied load) and those acquired during the deformation, giving the relative displacements. In this paper, a metrological characterization of digital image correlation is performed on aluminum and composite targets under both static and dynamic loading conditions by comparing DIC and strain gauge measurements. In the static test, interesting results were obtained thanks to an excellent agreement between the two measuring techniques. In addition, the deformation detected by DIC is compliant with the result of a FEM simulation. In the dynamic test, DIC was able to follow the periodic deformation of the specimen with good accuracy, giving results coherent with those of the FEM simulation. In both situations, it was seen that the DIC measurement accuracy depends on several parameters, such as the optical focusing, the parameters chosen to perform the mutual correlation between the images and, finally, the reference points on the image to be analyzed. In the future, the influence of these parameters will be studied, and a method to increase the accuracy of the measurements will be developed in accordance with the requirements of industry, especially the aerospace industry.
Keywords: accuracy, deformation, image correlation, mechanical analysis
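A minimal sketch of the subset-matching operation at the heart of 2D DIC, assuming a simple integer-pixel search that maximizes the zero-normalized cross-correlation (ZNCC); the subset size, search range and synthetic speckle image are illustrative choices, and real DIC software adds sub-pixel interpolation and shape functions.

```python
# Minimal 2D DIC-style subset matching: locate a reference subset in a deformed
# image by maximizing the zero-normalized cross-correlation (ZNCC).
import numpy as np

def zncc(a, b):
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def match_subset(reference, deformed, top_left, size=21, search=10):
    r, c = top_left
    subset = reference[r:r + size, c:c + size]
    best_score, best_shift = -np.inf, (0, 0)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            rr, cc = r + dr, c + dc
            if rr < 0 or cc < 0 or rr + size > deformed.shape[0] or cc + size > deformed.shape[1]:
                continue
            score = zncc(subset, deformed[rr:rr + size, cc:cc + size])
            if score > best_score:
                best_score, best_shift = score, (dr, dc)
    return best_shift, best_score  # integer-pixel displacement of the subset

# Toy usage: a synthetic speckle image shifted by (3, 5) pixels.
rng = np.random.default_rng(0)
ref = rng.random((200, 200))
defo = np.roll(ref, shift=(3, 5), axis=(0, 1))
print(match_subset(ref, defo, top_left=(80, 80)))  # expected shift (3, 5), score ~1.0
```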
Procedia PDF Downloads 314
7195 Dexamethasone Treatment Deregulates Proteoglycans Expression in Normal Brain Tissue
Authors: A. Y. Tsidulko, T. M. Pankova, E. V. Grigorieva
Abstract:
High-grade gliomas are the most frequent and most aggressive brain tumors which are characterized by active invasion of tumor cells into the surrounding brain tissue, where the extracellular matrix (ECM) plays a crucial role. Disruption of ECM can be involved in anticancer drugs effectiveness, side-effects and also in tumor relapses. The anti-inflammatory agent dexamethasone is a common drug used during high-grade glioma treatment for alleviating cerebral edema. Although dexamethasone is widely used in the clinic, its effects on normal brain tissue ECM remain poorly investigated. It is known that proteoglycans (PGs) are a major component of the extracellular matrix in the central nervous system. In our work, we studied the effects of dexamethasone on the ECM proteoglycans (syndecan-1, glypican-1, perlecan, versican, brevican, NG2, decorin, biglican, lumican) using RT-PCR in the experimental animal model. It was shown that proteoglycans in rat brain have age-specific expression patterns. In early post-natal rat brain (8 days old rat pups) overall PGs expression was quite high and mainly expressed PGs were biglycan, decorin, and syndecan-1. The overall transcriptional activity of PGs in adult rat brain is 1.5-fold decreased compared to post-natal brain. The expression pattern was changed as well with biglycan, decorin, syndecan-1, glypican-1 and brevican becoming almost equally expressed. PGs expression patterns create a specific tissue microenvironment that differs in developing and adult brain. Dexamethasone regimen close to the one used in the clinic during high-grade glioma treatment significantly affects proteoglycans expression. It was shown that overall PGs transcription activity is 1.5-2-folds increased after dexamethasone treatment. The most up-regulated PGs were biglycan, decorin, and lumican. The PGs expression pattern in adult brain changed after treatment becoming quite close to the expression pattern in developing brain. It is known that microenvironment in developing tissues promotes cells proliferation while in adult tissues proliferation is usually suppressed. The changes occurring in the adult brain after dexamethasone treatment may lead to re-activation of cell proliferation due to signals from changed microenvironment. Taken together obtained data show that dexamethasone treatment significantly affects the normal brain ECM, creating the appropriate microenvironment for tumor cells proliferation and thus can reduce the effectiveness of anticancer treatment and promote tumor relapses. This work has been supported by a Russian Science Foundation (RSF Grant 16-15-10243)Keywords: dexamthasone, extracellular matrix, glioma, proteoglycan
Procedia PDF Downloads 201
7194 A Kernel-Based Method for MicroRNA Precursor Identification
Authors: Bin Liu
Abstract:
MicroRNAs (miRNAs) are small non-coding RNA molecules that function in the transcriptional and post-transcriptional regulation of gene expression. Discriminating real pre-miRNAs from false ones (such as hairpin sequences with similar stem-loops) is necessary for understanding the role of miRNAs in the control of cell life and death. Because of their small size and sequence specificity, this discrimination cannot be based on sequence information alone; it requires structure information about the miRNA precursor to reach satisfactory performance. K-mers are convenient and widely used features for modeling the properties of miRNAs and other biological sequences. However, k-mers suffer from an inherent limitation: if the parameter K is increased to incorporate long-range effects, certain k-mers will appear rarely or not at all, with the consequence that most k-mers are absent and a few are present only once. Thus, statistical learning approaches using k-mers as features become susceptible to noisy data once K becomes large. In this study, we propose a gapped k-mer approach to overcome the disadvantages of k-mers and apply this method to the field of miRNA prediction. Combined with the structure status composition, a classifier called imiRNA-GSSC is proposed. We show that it compares favourably with the original imiRNA-kmer and alternative approaches: trained on human miRNA precursors, this predictor can achieve an accuracy of 82.34 when predicting 4022 pre-miRNA precursors from eleven species.
Keywords: gapped k-mer, imiRNA-GSSC, microRNA precursor, support vector machine
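A minimal sketch of gapped k-mer feature counting on a single RNA string; the window length l, the number of informative positions k, and the example sequence are illustrative choices, and the structure-status composition used by imiRNA-GSSC is not shown.

```python
# Count gapped k-mer features: length-l windows in which only k positions are
# informative and the remaining l-k positions are treated as wildcards ('*').
from collections import Counter
from itertools import combinations

def gapped_kmer_counts(sequence, l=5, k=3):
    counts = Counter()
    for start in range(len(sequence) - l + 1):
        window = sequence[start:start + l]
        for kept in combinations(range(l), k):   # choose the informative positions
            feature = "".join(window[i] if i in kept else "*" for i in range(l))
            counts[feature] += 1
    return counts

features = gapped_kmer_counts("UGAGGUAGUAGGUUGUAUAGUU")  # example RNA string
print(features.most_common(5))
```

Because each wildcard pattern matches many exact l-mers, these features stay populated even when l is large, which is the sparsity problem the abstract attributes to plain k-mers.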
Procedia PDF Downloads 165
7193 Profiling Risky Code Using Machine Learning
Authors: Zunaira Zaman, David Bohannon
Abstract:
This study explores the application of machine learning (ML) for detecting security vulnerabilities in source code. The research aims to assist organizations with large application portfolios and limited security testing capabilities in prioritizing security activities. ML-based approaches offer benefits such as increased confidence scores, false positives and negatives tuning, and automated feedback. The initial approach using natural language processing techniques to extract features achieved 86% accuracy during the training phase but suffered from overfitting and performed poorly on unseen datasets during testing. To address these issues, the study proposes using the abstract syntax tree (AST) for Java and C++ codebases to capture code semantics and structure and generate path-context representations for each function. The Code2Vec model architecture is used to learn distributed representations of source code snippets for training a machine-learning classifier for vulnerability prediction. The study evaluates the performance of the proposed methodology using two datasets and compares the results with existing approaches. The Devign dataset yielded 60% accuracy in predicting vulnerable code snippets and helped resist overfitting, while the Juliet Test Suite predicted specific vulnerabilities such as OS-Command Injection, Cryptographic, and Cross-Site Scripting vulnerabilities. The Code2Vec model achieved 75% accuracy and a 98% recall rate in predicting OS-Command Injection vulnerabilities. The study concludes that even partial AST representations of source code can be useful for vulnerability prediction. The approach has the potential for automated intelligent analysis of source code, including vulnerability prediction on unseen source code. State-of-the-art models using natural language processing techniques and CNN models with ensemble modelling techniques did not generalize well on unseen data and faced overfitting issues. However, predicting vulnerabilities in source code using machine learning poses challenges such as high dimensionality and complexity of source code, imbalanced datasets, and identifying specific types of vulnerabilities. Future work will address these challenges and expand the scope of the research.Keywords: code embeddings, neural networks, natural language processing, OS command injection, software security, code properties
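As a rough illustration of moving from raw tokens to structural features, the sketch below extracts root-to-leaf node-type paths from a Python AST using the standard `ast` module; Code2Vec proper works on leaf-to-leaf path contexts over Java/C++ ASTs, so this is a deliberately simplified analogue, and the example snippet is invented.

```python
# Illustrative AST feature extraction: root-to-leaf paths of node-type names.
import ast

def root_to_leaf_paths(source: str):
    tree = ast.parse(source)
    paths = []

    def walk(node, prefix):
        prefix = prefix + [type(node).__name__]
        children = list(ast.iter_child_nodes(node))
        if not children:                 # leaf node: record the full node-type path
            paths.append(" -> ".join(prefix))
        for child in children:
            walk(child, prefix)

    walk(tree, [])
    return paths

# os.system(cmd) is the kind of call site an OS-command-injection model would flag.
snippet = "def run(cmd):\n    import os\n    os.system(cmd)\n"
for path in root_to_leaf_paths(snippet)[:6]:
    print(path)
```

Each path string can then be hashed or embedded and fed to a downstream classifier in place of, or alongside, token-level features.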
Procedia PDF Downloads 112
7192 Harmony Search-Based K-Coverage Enhancement in Wireless Sensor Networks
Authors: Shaimaa M. Mohamed, Haitham S. Hamza, Imane A. Saroit
Abstract:
Many wireless sensor network applications require K-coverage of the monitored area. In this paper, we propose a harmony search based algorithm that is scalable in terms of execution time, the K-Coverage Enhancement Algorithm (KCEA), which attempts to enhance the initial coverage and achieve the required K-coverage degree for a specific application efficiently. Simulation results show that the proposed algorithm achieves a coverage improvement of 5.34% compared to K-Coverage Rate Deployment (K-CRD), which achieves 1.31% when deploying one additional sensor. Moreover, the proposed algorithm is more time efficient.
Keywords: Wireless Sensor Networks (WSN), harmony search algorithms, K-Coverage, Mobile WSN
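A toy sketch of harmony search applied to K-coverage, to illustrate the mechanics (harmony memory, memory consideration, pitch adjustment, worst-member replacement); the grid, sensing radius and parameter values are arbitrary assumptions and not those of KCEA.

```python
# Toy harmony search for K-coverage: place N sensors in a unit square so that as
# many grid points as possible are covered by at least K sensors.
import random

K, N_SENSORS, RADIUS = 2, 12, 0.25
GRID = [(x / 10, y / 10) for x in range(11) for y in range(11)]
HMS, HMCR, PAR, BW, ITERS = 20, 0.9, 0.3, 0.05, 2000

def k_coverage(solution):
    covered = 0
    for gx, gy in GRID:
        hits = sum((gx - sx) ** 2 + (gy - sy) ** 2 <= RADIUS ** 2 for sx, sy in solution)
        covered += hits >= K
    return covered / len(GRID)

def random_solution():
    return [(random.random(), random.random()) for _ in range(N_SENSORS)]

memory = sorted((random_solution() for _ in range(HMS)), key=k_coverage, reverse=True)
for _ in range(ITERS):
    new = []
    for i in range(N_SENSORS):
        if random.random() < HMCR:                  # memory consideration
            x, y = random.choice(memory)[i]
            if random.random() < PAR:               # pitch adjustment
                x = min(1.0, max(0.0, x + random.uniform(-BW, BW)))
                y = min(1.0, max(0.0, y + random.uniform(-BW, BW)))
            new.append((x, y))
        else:                                       # random consideration
            new.append((random.random(), random.random()))
    if k_coverage(new) > k_coverage(memory[-1]):    # replace the worst harmony
        memory[-1] = new
        memory.sort(key=k_coverage, reverse=True)

print("best K-coverage ratio:", round(k_coverage(memory[0]), 3))
```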
Procedia PDF Downloads 530
7191 Changes in Pain Intensity of Musculoskeletal Disorders in Flight Attendants after Stretching Exercise Program
Authors: Maria Melania Muda, Retno Wibawanti, Retno Asti Werdhani
Abstract:
Background: Flight attendants (FAs) are often exposed to ergonomic stressors in their work and are therefore very susceptible to symptoms of musculoskeletal disorders (MSDs). One way to address musculoskeletal complaints is stretching. This study aimed to examine the prevalence of MSDs and the effect of a 2-week stretching exercise program, using the Indonesian Ministry of Health's stretching video, on changes in musculoskeletal pain intensity in FAs on commercial aircraft in Indonesia. Methods: A pre-post study was conducted using the Nordic Musculoskeletal Questionnaire (NMQ) for the identification of MSDs and the Visual Analog Scale (VAS) as the pain intensity measurement. Data were collected and then analyzed in SPSS with the Wilcoxon test. The change in pain intensity was considered significant if the p-value was less than 0.05. Results: The results showed that 92% of the FAs (n=75) had MSDs in at least 1 area of the body in the last 12 months. Thirty-four respondents participated as subjects. Before the intervention, the complaint level score across 28 body areas had a median of 34 (29-84) and the pain intensity a median of 6 (2-9); after the intervention, these became a median of 32 (28-67) and a median of 3 (0-9), respectively, with p-value <0.001. Conclusion: The stretching exercise program showed significant changes in the complaint level scores across 28 body areas (p < 0.001) and in pain intensity before and after the stretching exercise intervention (p < 0.001).
Keywords: flight attendant, MSDs, Nordic Musculoskeletal Questionnaire, stretching exercise program, visual analog scale
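The paired pre/post comparison described above maps directly onto a Wilcoxon signed-rank test; the sketch below uses invented placeholder VAS scores, not the study's data.

```python
# Paired Wilcoxon signed-rank test on pre/post VAS pain scores
# (placeholder data, not the study's measurements).
from scipy.stats import wilcoxon

vas_before = [6, 7, 5, 8, 6, 4, 7, 6, 5, 6]
vas_after  = [3, 4, 3, 5, 2, 3, 4, 3, 2, 4]

statistic, p_value = wilcoxon(vas_before, vas_after)
print(f"W = {statistic}, p = {p_value:.4f}")  # p < 0.05 -> significant change in pain intensity
```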
Procedia PDF Downloads 86
7190 Investigating the Invalidity of the Law of Energy Conservation Based on Waves Interference Phenomenon Inside a Ringed Waveguide
Authors: M. Yusefzad
Abstract:
The law of energy conservation is one of the fundamental laws of physics. Energy is conserved, and the total amount of energy is constant; it can be transferred from one object to another and changed from one state to another. However, in the case of wave interference, this law faces important contradictions. Based on the mathematical relationship presented in this paper, it seems that the validity of this law depends on the path along which the energy wave, such as light, travels. In this paper, by using some fundamental concepts of physics, such as the constancy of the electromagnetic wave speed in a specific medium and the wave theory of light, it will be shown that the law of energy conservation is not valid under every condition, and that in some circumstances it is possible to increase the energy of a system containing a determined amount of energy without any input.
Keywords: power, law of energy conservation, electromagnetic wave, interference, Maxwell’s equations
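For reference, the standard two-wave superposition relation that such arguments revolve around can be written as follows; this is the textbook expression, not the specific relationship derived in the paper, and the symbols (field amplitudes E1, E2, intensities I1, I2, phase difference Δφ) are generic.

```latex
% Superposition of two monochromatic waves with phase difference \Delta\varphi
E = E_1\cos(\omega t) + E_2\cos(\omega t + \Delta\varphi),
\qquad
I \;\propto\; \langle E^2 \rangle \;=\; I_1 + I_2 + 2\sqrt{I_1 I_2}\,\cos\Delta\varphi .
```

The cross term 2√(I1 I2) cos Δφ is the interference contribution whose bookkeeping inside a ringed waveguide the paper examines.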
Procedia PDF Downloads 272
7189 Realization of a (GIS) for Drilling (DWS) through the Adrar Region
Authors: Djelloul Benatiallah, Ali Benatiallah, Abdelkader Harouz
Abstract:
Geographic Information Systems (GIS) bring together various methods and computer techniques to digitally model, capture, store, manage, view and analyse geographic data. They appeal to many scientific and technical fields and draw on many methods. In this article we present a complete and operational geographic information system that follows the theoretical principles of data management and adapts them to spatial data, in particular data concerning the monitoring of drinking water supply (DWS) wells in the Adrar region. The expected results of this system are, on the one hand, standard functions for consulting, updating and editing the beneficiaries and the geographical data and, on the other hand, specific functionality for contractors: data entry, parameterised calculations and statistics.
Keywords: GIS, DWS, drilling, Adrar
Procedia PDF Downloads 312
7188 A Survey of Semantic Integration Approaches in Bioinformatics
Authors: Chaimaa Messaoudi, Rachida Fissoune, Hassan Badir
Abstract:
Technological advances of computer science and data analysis are helping to provide continuously huge volumes of biological data, which are available on the web. Such advances involve and require powerful techniques for data integration to extract pertinent knowledge and information for a specific question. Biomedical exploration of these big data often requires the use of complex queries across multiple autonomous, heterogeneous and distributed data sources. Semantic integration is an active area of research in several disciplines, such as databases, information-integration, and ontology. We provide a survey of some approaches and techniques for integrating biological data, we focus on those developed in the ontology community.Keywords: biological ontology, linked data, semantic data integration, semantic web
Procedia PDF Downloads 453
7187 A Framework for Event-Based Monitoring of Business Processes in the Supply Chain Management of Industry 4.0
Authors: Johannes Atug, Andreas Radke, Mitchell Tseng, Gunther Reinhart
Abstract:
In modern supply chains, large numbers of SKUs (Stock-Keeping Units) need to be managed in a timely manner, and any delay in noticing disruptions to items often limits the ability to defer their impact on customer order fulfillment. However, in the supply chains of IoT-connected enterprises, the ERP (Enterprise Resource Planning), MES (Manufacturing Execution System) and SCADA (Supervisory Control and Data Acquisition) systems generate large amounts of data, which generally give much earlier notice of deviations in the business process steps. That is, analyzing these data streams with process mining techniques allows the supply chain business processes to be monitored and thus items that deviate from the standard order fulfillment process to be identified. In this paper, a framework to enable event-based SCM (Supply Chain Management) processes, including an overview of the core enabling technologies, is presented, based on the RAMI (Reference Architecture Model for Industrie 4.0) architecture. The application of this framework in industry is presented, and implications for SCM in Industry 4.0 and for further research are outlined.
Keywords: cyber-physical production systems, event-based monitoring, supply chain management, RAMI (Reference-Architecture-Model for Industrie 4.0)
Procedia PDF Downloads 241
7186 Research Trends in High Voltage Power Transmission
Authors: Tlotlollo Sidwell Hlalele, Shengzhi Du
Abstract:
High voltage transmission is the most pivotal process in the electrical power industry. It requires a robust infrastructure that can last for decades without causing impairment to human life. Due to global warming, the power transmission system has started to experience challenges that could presumably escalate further in the future. These challenges are earthquake resistance, transmission power losses, and high electromagnetic fields. In this paper, research efforts aimed at addressing these challenges are discussed. We focus in particular on research into regenerative electric energy, such as wind, hydropower, biomass and sea waves, with respect to the possibilities for energy storage and transmission. We conclude by drawing attention to specific areas that we believe need more research.
Keywords: power transmission, regenerative energy, power quality, energy storage
Procedia PDF Downloads 355
7185 Preparation and Evaluation of Poly(Ethylene Glycol)-B-Poly(Caprolactone) Diblock Copolymers with Zwitterionic End Group for Thermo-Responsive Properties
Authors: Bo Keun Lee, Doo Yeon Kwon, Ji Hoon Park, Gun Hee Lee, Ji Hye Baek, Heung Jae Chun, Young Joo Koh, Moon Suk Kim
Abstract:
Thermo-responsive materials are viscoelastic materials that undergo a sol-to-gel phase transition at a specific temperature, and many such materials have been developed. MPEG-b-PCL (MPC), as a thermo-responsive material, contains hydrophilic and hydrophobic segments and forms an ordered crystalline structure of the hydrophobic PCL segments in aqueous solutions. The ordered crystalline structure packs tightly or aggregates and finally induces an aggregated gel through intra- and inter-molecular interactions as a function of temperature. We therefore introduced anionic and cationic groups at the end positions of the PCL chain to alter the hydrophobicity of the PCL segment. Introducing anionic and cationic groups at the PCL end position altered the solubility of the copolymers by changing the crystallinity and hydrophobicity of the PCL block domains. These results indicated that the properties of the end group of the hydrophobic PCL block, and the balance between hydrophobicity and hydrophilicity, affect the thermo-responsive behavior of the copolymers in aqueous solutions. Thus, we concluded that the determinant of the temperature-dependent thermo-responsive behavior of MPC is the ionic end group of the PCL block, and we introduced zwitterionic end groups to investigate the thermo-responsive behavior of MPC. Methoxypoly(ethylene oxide) and ε-caprolactone (CL) were copolymerized to introduce varying hydrophobic PCL lengths, and an MPC featuring a zwitterionic sulfobetaine (MPC-ZW) at the chain end of the PCL segment was prepared. The MPC and MPC-ZW copolymers were obtained in the sol state at room temperature when prepared as 20-wt% aqueous solutions. The solubility of MPC decreased when the molecular weight of the PCL block was increased. The solubilization time of MPC-2.4k was around 20 min, while those of MPC-2.8k and MPC-3.0k increased to 30 min and 1 h, respectively. MPC-3.6k was not solubilized. In the case of MPC-ZW 3.6k, however, the zwitterion-modified MPC copolymers were solubilized in 3-5 min. This result indicates that the zwitterionic end group of the MPC-ZW diblock copolymer increased the aqueous solubility of the diblock copolymer even when the length of the hydrophobic PCL segment was increased. MPC and MPC-ZW diblock copolymers featuring zwitterionic end groups were synthesized successfully. The sol-to-gel phase transition occurred at a specific temperature depending on the length of the hydrophobic PCL segments introduced and on the zwitterion groups attached to the MPC chain end. This indicated that the zwitterionic end groups reduced the hydrophobicity of the PCL block and changed the solubilization behavior. The MPC-ZW diblock copolymer can be utilized as a potential injectable drug and cell carrier.
Keywords: thermo-responsive material, zwitterionic, hydrophobic, crystallization, phase transition
Procedia PDF Downloads 514
7184 The Psychological Significance of Cultural and Religious Values Among the Arab Population
Authors: Michel Mikhail
Abstract:
Introduction: Values, which are the guiding principles and beliefs of our lives, have an influence on one’s psychological health. This study aims to investigate how Schwartz’s four higher-order values (conservation, openness to change, self-transcendence, and self-enhancement) and religious values influence psychological health among the Arab population. Methods: A total of 1,023 respondents from nine Arab countries aged 18 to 71 filled out an online survey with measures of the following constructs: Schwartz’s four higher-order values (Portrait Value Questionnaire-21), religious values (Sahin’s Index of Islamic Moral Values), and general psychological health (General Health Questionnaire-28). Results: Two models of multiple regression were conducted to investigate the relationships between values and psychological health. Higher conservation, self-enhancement, and religious values were significantly associated with better psychological health, with conservation losing significance after adding religious values to the model. All of Schwartz’s four values were found to have a significant relationship with religious values. More self-enhancement and conservation values were associated with higher identification of religious values, and the opposite was true for the other two values. Conclusion: The findings challenged existing assumptions that conservation values relate negatively to psychological health. This finding could be explained by the congruence of conservation values and the Arab culture. The most powerful relationships were those of self-enhancement and religious values, both of which were positively associated with psychological health. As such, therapists should be aware to reconsider biases against religious or conservation values and rather pay attention to their potential positive influence over one’s psychological health.Keywords: counseling psychology, counseling and cultural values, counseling and religious values, psychotherapy and Arab values
Procedia PDF Downloads 59
7183 Natural Language Processing for the Classification of Social Media Posts in Post-Disaster Management
Authors: Ezgi Şendil
Abstract:
Information extracted from social media has received great attention because it has become an effective alternative for collecting people's opinions and emotions about specific experiences in a faster and easier way. The paper aims to organize such data in a meaningful way in order to analyze users' posts and characterize the experiences and opinions of users during and after natural disasters. The posts collected from Reddit are classified into nine different categories, including injured/dead people, infrastructure and utility damage, missing/found people, donation needs/offers, caution/advice, and emotional support, identified by using labelled Twitter data and four different machine learning (ML) classifiers.
Keywords: disaster, NLP, postdisaster management, sentiment analysis
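A minimal sketch of the kind of supervised text-classification pipeline described above (TF-IDF features with a linear classifier); the example posts, labels and choice of classifier are placeholders, and only a subset of the nine categories is shown.

```python
# Minimal text-classification pipeline for disaster-related posts
# (placeholder training examples; the real study uses labelled Twitter data).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

posts = [
    "Bridge collapsed on the main road, power lines are down",
    "My cousin is still missing near the river, please share",
    "We are collecting blankets and food for the shelter",
    "Stay away from the coast, aftershocks are expected",
]
labels = [
    "infrastructure_damage",
    "missing_found_people",
    "donation_needs_offers",
    "caution_advice",
]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
model.fit(posts, labels)
print(model.predict(["Roads flooded and the bridge is unusable"]))
```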
Procedia PDF Downloads 78
7182 Robust Control of a Dynamic Model of an F-16 Aircraft with Improved Damping through Linear Matrix Inequalities
Authors: J. P. P. Andrade, V. A. F. Campos
Abstract:
This work presents an application of Linear Matrix Inequalities (LMIs) to the robust control of an F-16 aircraft through an algorithm that ensures a specified damping factor for the closed-loop system. The results show that the zero and gain settings are sufficient to ensure robust performance and stability with respect to various operating points. The technique used is pole placement, which aims to place the closed-loop poles of the system in a specific region of the complex plane. Test results using a dynamic model of the F-16 aircraft are presented and discussed.
Keywords: F-16 aircraft, linear matrix inequalities, pole placement, robust control
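As a point of reference, one standard way to encode a minimum damping factor as an LMI constraint is the conic-sector (D-stability) condition below; this is the generic formulation, assumed here only to illustrate the idea, and not necessarily the exact set of LMIs used in the F-16 design.

```latex
% All eigenvalues of A lie in the sector of half-angle \theta about the negative
% real axis (i.e. damping ratio \zeta \ge \cos\theta) iff there exists X \succ 0 with
\begin{bmatrix}
\sin\theta\,\bigl(AX + XA^{\mathsf T}\bigr) & \cos\theta\,\bigl(AX - XA^{\mathsf T}\bigr)\\[2pt]
\cos\theta\,\bigl(XA^{\mathsf T} - AX\bigr) & \sin\theta\,\bigl(AX + XA^{\mathsf T}\bigr)
\end{bmatrix} \prec 0 .
```

For a state-feedback design u = Kx, A is replaced by A + BK and the bilinear product is removed with the usual change of variables Y = KX, which keeps the condition linear in the decision variables X and Y.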
Procedia PDF Downloads 311
7181 A Study of Traffic Assignment Algorithms
Authors: Abdelfetah Laouzai, Rachid Ouafi
Abstract:
In a traffic network, users usually choose their routes so as to reduce their travel time between origin-destination pairs. This behavior might seem selfish, as it produces congestion in different parts of the network. The traffic assignment problem (TAP) models the interactions between congestion and users' travel decisions to obtain vehicle flows over each axis of the traffic network. Resolution methods for the TAP serve as tools for predicting the distribution of users, identifying congestion points and influencing travelers' route choices in the network in response to dynamic data. In this article, we present a review of specific resolution approaches to the TAP. A comparative analysis of those approaches is carried out, highlighting the characteristics, advantages and disadvantages of each.
Keywords: network traffic, travel decisions, approaches, traffic assignment, flows
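A compact sketch of one classical family of TAP resolution methods, the method of successive averages (MSA) with BPR link travel times; the three-node network, demand and BPR coefficients are illustrative assumptions, not data from the article.

```python
# Method of successive averages (MSA) on a toy network with BPR link costs:
# iteratively assign all demand to current shortest paths, then average flows.
import networkx as nx

def bpr_time(t0, flow, capacity, alpha=0.15, beta=4):
    return t0 * (1 + alpha * (flow / capacity) ** beta)

links = {  # (from, to): (free-flow time, capacity)
    ("A", "B"): (10.0, 200.0),
    ("A", "C"): (15.0, 500.0),
    ("C", "B"): (3.0, 500.0),
}
demand = {("A", "B"): 400.0}
flows = {edge: 0.0 for edge in links}

for n in range(1, 101):
    graph = nx.DiGraph()
    for (u, v), (t0, cap) in links.items():
        graph.add_edge(u, v, time=bpr_time(t0, flows[(u, v)], cap))
    aon = {edge: 0.0 for edge in links}          # all-or-nothing assignment
    for (o, d), q in demand.items():
        path = nx.shortest_path(graph, o, d, weight="time")
        for edge in zip(path[:-1], path[1:]):
            aon[edge] += q
    step = 1.0 / n                               # MSA step size
    flows = {e: (1 - step) * flows[e] + step * aon[e] for e in links}

for edge, flow in flows.items():
    print(edge, round(flow, 1))
```

At convergence the flows approximate a user equilibrium, in which no traveler can reduce their travel time by unilaterally switching routes.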
Procedia PDF Downloads 476
7180 Bridging the Gap between Obstetric and Colorectal Services after Obstetric Anal Sphincter Injuries
Authors: Shachi Joshi
Abstract:
Purpose: The primary aim of this study was to determine the prevalence of pelvic dysfunction symptoms following OASI. The secondary aim was to assess the scope of a dedicated perineal trauma clinic in identifying and investigating women that have experienced faecal incontinence after OASI and if a transitional clinic arrangement to colorectal surgeons would be useful. Methods: The clinical database was used to identify and obtain information about 118 women who sustained an OASI (3rd/ 4th degree tear) between August 2016 and July 2017. A questionnaire was designed to assess symptoms of pelvic dysfunction; this was sent via the post in November 2018. Results: The questionnaire was completed by 45 women (38%). Faecal incontinence was experienced by 42% (N=19), flatus incontinence by 47% (N=21), urinary incontinence by 76% (N=34), dyspareunia by 49% (N=22) and pelvic pain by 33% (N=15). Of the questionnaire respondents, only 62% (N=28) had attended a perineal trauma clinic appointment. 46% (N=13) of these women reported having experienced difficulty controlling flatus or faeces in the questionnaire, however, only 23% (N=3) of these reported ongoing symptoms at the time of clinic attendance and underwent an endoanal ultrasound scan. Conclusion: Pelvic dysfunction symptoms are highly prevalent following an OASI. Perineal trauma clinic attendance alone is not sufficient for identification and follow up of symptoms. Transitional care is needed between obstetric and colorectal teams, to recognize and treat women with ongoing faecal incontinence.Keywords: incontinence, obstetric anal sphincter, injury, repair
Procedia PDF Downloads 114
7179 The Effective Use of the Network in the Distributed Storage
Authors: Mamouni Mohammed Dhiya Eddine
Abstract:
This work aims at studying the exploitation of high-speed networks of clusters for distributed storage. Parallel applications running on clusters require both high-performance communications between nodes and efficient access to the storage system. Many studies on network technologies led to the design of dedicated architectures for clusters with very fast communications between computing nodes. Efficient distributed storage in clusters has been essentially developed by adding parallelization mechanisms so that the server(s) may sustain an increased workload. In this work, we propose to improve the performance of distributed storage systems in clusters by efficiently using the underlying high-performance network to access distant storage systems. The main question we are addressing is: do high-speed networks of clusters fit the requirements of a transparent, efficient and high-performance access to remote storage? We show that storage requirements are very different from those of parallel computation. High-speed networks of clusters were designed to optimize communications between different nodes of a parallel application. We study their utilization in a very different context, storage in clusters, where client-server models are generally used to access remote storage (for instance NFS, PVFS or LUSTRE). Our experimental study based on the usage of the GM programming interface of MYRINET high-speed networks for distributed storage raised several interesting problems. Firstly, the specific memory utilization in the storage access system layers does not easily fit the traditional memory model of high-speed networks. Secondly, client-server models that are used for distributed storage have specific requirements on message control and event processing, which are not handled by existing interfaces. We propose different solutions to solve communication control problems at the filesystem level. We show that a modification of the network programming interface is required. Data transfer issues need an adaptation of the operating system. We detail several propositions for network programming interfaces which make their utilization easier in the context of distributed storage. The integration of a flexible processing of data transfer in the new programming interface MYRINET/MX is finally presented. Performance evaluations show that its usage in the context of both storage and other types of applications is easy and efficient.Keywords: distributed storage, remote file access, cluster, high-speed network, MYRINET, zero-copy, memory registration, communication control, event notification, application programming interface
Procedia PDF Downloads 224
7178 Intra-miR-ExploreR, a Novel Bioinformatics Platform for Integrated Discovery of MiRNA:mRNA Gene Regulatory Networks
Authors: Surajit Bhattacharya, Daniel Veltri, Atit A. Patel, Daniel N. Cox
Abstract:
miRNAs have emerged as key post-transcriptional regulators of gene expression; however, the identification of biologically relevant target genes for this epigenetic regulatory mechanism remains a significant challenge. To address this knowledge gap, we have developed a novel tool in R, Intra-miR-ExploreR, that facilitates the integrated discovery of miRNA targets by incorporating target databases and novel target prediction algorithms, using statistical methods including Pearson and distance correlation on microarray data, to arrive at high-confidence intragenic miRNA target predictions. We have explored the efficacy of this tool using Drosophila melanogaster as a model organism for bioinformatics analyses and functional validation. A number of putative targets were obtained and validated using qRT-PCR analysis. Additional features of the tool include downloadable text files containing GO analysis from DAVID and PubMed links to literature related to gene sets. Moreover, we are constructing interaction maps of intragenic miRNAs, using both microarray and RNA-seq data, focusing on neural tissues, to uncover the regulatory codes via which these molecules regulate gene expression to direct cellular development.
Keywords: miRNA, miRNA:mRNA target prediction, statistical methods, miRNA:mRNA interaction network
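The tool itself is written in R, but the core correlation-filtering idea (rank candidate targets by how strongly their expression anti-correlates with the miRNA across samples) can be sketched in Python as follows; the expression vectors and cut-offs are invented placeholders.

```python
# Rank candidate miRNA targets by Pearson anti-correlation of expression profiles
# across samples (placeholder values; the actual tool is implemented in R and also
# supports distance correlation and target-database integration).
from scipy.stats import pearsonr

mirna_expression = [8.1, 7.4, 6.9, 5.2, 4.8, 3.9]          # one miRNA across six samples
candidate_targets = {
    "geneA": [2.0, 2.4, 2.9, 4.1, 4.6, 5.3],                # anti-correlated -> plausible target
    "geneB": [5.0, 4.8, 4.4, 3.1, 2.9, 2.2],                # positively correlated
    "geneC": [3.2, 3.1, 3.3, 3.0, 3.2, 3.1],                # flat, uncorrelated
}

hits = []
for gene, profile in candidate_targets.items():
    r, p = pearsonr(mirna_expression, profile)
    if r < -0.7 and p < 0.05:                               # arbitrary cut-offs for the sketch
        hits.append((gene, round(r, 2), round(p, 3)))

print("high-confidence anti-correlated candidates:", hits)
```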
Procedia PDF Downloads 515
7177 Qualitative Profiling in Practice: The Italian Public Employment Services Experience
Authors: L. Agneni, F. Carta, C. Micheletta, V. Tersigni
Abstract:
The development of a qualitative method to profile jobseekers is needed to improve the quality of the Public Employment Services (PES) in Italy. This is why the National Agency for Active Labour Market Policies (ANPAL) decided to introduce a Qualitative Profiling Service in the context of the activities carried out by local employment offices’ operators. The qualitative profiling service provides information and data regarding the jobseeker’s personal transition status, through a semi-structured questionnaire administered to PES clients during the guidance interview. The questionnaire responses allow PES staff to identify, for each client, proper activities and policy measures to support jobseekers in their reintegration into the labour market. Data and information gathered by the qualitative profiling tool are the following: frequency, modalities and motivations for clients to apply to local employment offices; clients’ expectations and skills; difficulties that they have faced during the previous working experiences; strategies, actions undertaken and activated channels for job search. These data are used to assess jobseekers’ personal and career characteristics and to measure their employability level (qualitative profiling index), in order to develop and deliver tailor-made action programmes for each client. This paper illustrates the use of the above-mentioned qualitative profiling service on the national territory and provides an overview of the main findings of the survey: concerning the difficulties that unemployed people face in finding a job and their perception of different aspects related to the transition in the labour market. The survey involved over 10.000 jobseekers registered with the PES. Most of them are beneficiaries of the “citizens' income”, a specific active labour policy and social inclusion measure. Furthermore, data analysis allows classifying jobseekers into a specific group of clients with similar features and behaviours, on the basis of socio-demographic variables, customers' expectations, needs and required skills for the profession for which they seek employment. Finally, the survey collects PES staff opinions and comments concerning clients’ difficulties in finding a new job and also their strengths. This is a starting point for PESs’ operators to define adequate strategies to facilitate jobseekers’ access or reintegration into the labour market.Keywords: labour market transition, public employment services, qualitative profiling, vocational guidance
Procedia PDF Downloads 145
7176 Green Ports: Innovation Adopters or Innovation Developers
Authors: Marco Ferretti, Marcello Risitano, Maria Cristina Pietronudo, Lina Ozturk
Abstract:
A green port is the result of a sustainable long-term strategy adopted by an entire port infrastructure, and therefore by the set of actors involved in port activities. The strategy aims to develop sustainable port infrastructure focused on the reduction of negative environmental impacts without jeopardising economic growth. Green technology represents the core tool for implementing sustainable solutions; however, it is not a magic bullet. Ports have always been integrated into the local territory, affecting the environment in which they operate; therefore, a sustainable strategy should fit the entire local system. Adopting a sustainable strategy thus means knowing how to involve and engage a wide stakeholder network (industries, production, markets, citizens, and public authorities). Existing research on the topic has not integrated this perspective well with that of sustainability. Research on green ports has mixed sustainability aspects with those of the maritime industry, neglecting the dynamics that lead to the development of the green port phenomenon. We propose an analysis of green ports through the lens of ecosystem studies in the field of management. The ecosystem approach provides a way to model the relations that enable green solutions and green practices in a port ecosystem. However, due to the local dimension of a port and the port trend towards innovation, i.e., sustainable innovation, we draw on a specific concept of ecosystem, that of local innovation systems. More precisely, we explore whether a green port is a local innovation system engaged in developing sustainable innovation with a large impact on the territory or merely an innovation adopter. To address this issue, we adopt a comparative case study, selecting two innovative ports in Europe: Rotterdam and Genoa. The case study is a research method focused on understanding the dynamics of a specific situation and can be used to provide a description of real circumstances. Preliminary results show two different approaches to supporting sustainable innovation: one represented by Rotterdam, a pioneer in competitiveness and sustainability, and the other represented by Genoa, an example of a technology adopter. The paper intends to provide a better understanding of how sustainable innovations are developed and of the manner in which a network of port and local stakeholders supports this process. Furthermore, it proposes a taxonomy of green ports as developers and adopters of sustainable innovation, also suggesting best practices to model the relationships that enable the port ecosystem to apply a sustainable strategy.
Keywords: green port, innovation, sustainability, local innovation systems
Procedia PDF Downloads 125