Search results for: enriched semantic event chain
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3967

2317 Quantitative Polymerase Chain Reaction Analysis of Phytoplankton Composition and Abundance to Assess Eutrophication: A Multi-Year Study in Twelve Large Rivers across the United States

Authors: Chiqian Zhang, Kyle D. McIntosh, Nathan Sienkiewicz, Ian Struewing, Erin A. Stelzer, Jennifer L. Graham, Jingrang Lu

Abstract:

Phytoplankton plays an essential role in freshwater aquatic ecosystems and is the primary group synthesizing organic carbon and providing food sources or energy to ecosystems. Therefore, the identification and quantification of phytoplankton are important for estimating and assessing ecosystem productivity (carbon fixation), water quality, and eutrophication. Microscopy is the current gold standard for identifying and quantifying phytoplankton composition and abundance. However, microscopic analysis of phytoplankton is time-consuming, has a low sample throughput, and requires deep knowledge and rich experience in microbial morphology to implement. To improve this situation, quantitative polymerase chain reaction (qPCR) was considered for phytoplankton identification and quantification. Using qPCR to assess phytoplankton composition and abundance, however, has not been comprehensively evaluated. This study focused on: 1) conducting a comprehensive performance comparison of qPCR and microscopy techniques in identifying and quantifying phytoplankton and 2) examining the use of qPCR as a tool for assessing eutrophication. Twelve large rivers located throughout the United States were evaluated using data collected from 2017 to 2019 to understand the relation between qPCR-based phytoplankton abundance and eutrophication. This study revealed that temporal variation of phytoplankton abundance in the twelve rivers was limited within years (from late spring to late fall) and among different years (2017, 2018, and 2019). Midcontinent rivers had moderately greater phytoplankton abundance than eastern and western rivers, presumably because midcontinent rivers were more eutrophic. The study also showed that qPCR- and microscope-determined phytoplankton abundance had a significant positive linear correlation (adjusted R² 0.772, p-value < 0.001). In addition, phytoplankton abundance assessed via qPCR showed promise as an indicator of the eutrophication status of those rivers, with oligotrophic rivers having low phytoplankton abundance and eutrophic rivers having (relatively) high phytoplankton abundance. This study demonstrated that qPCR could serve as an alternative tool to traditional microscopy for phytoplankton quantification and eutrophication assessment in freshwater rivers.
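As an illustration of the kind of qPCR-versus-microscopy comparison reported above, the sketch below fits a simple linear regression to paired abundance estimates; the numbers and variable names are invented placeholders, not data from the study.

```python
# Illustrative sketch (not the authors' code): comparing qPCR- and
# microscope-derived phytoplankton abundance with a simple linear fit.
# The paired values below are placeholders, not data from the study.
import numpy as np
from scipy import stats

qpcr_abundance = np.array([2.1e5, 4.3e5, 8.9e5, 1.6e6, 3.2e6, 7.5e6])   # gene copies/mL (hypothetical)
micro_abundance = np.array([1.8e5, 5.0e5, 7.2e5, 1.9e6, 2.8e6, 8.1e6])  # cells/mL (hypothetical)

# Log-transform to stabilise variance before fitting, then regress.
res = stats.linregress(np.log10(qpcr_abundance), np.log10(micro_abundance))
print(f"slope={res.slope:.2f}, R^2={res.rvalue**2:.3f}, p-value={res.pvalue:.4f}")
```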

Keywords: phytoplankton, eutrophication, river, qPCR, microscopy, spatiotemporal variation

Procedia PDF Downloads 103
2316 Predicting Survival in Cancer: How Does the Cox Regression Model Compare to Artificial Neural Networks?

Authors: Dalia Rimawi, Walid Salameh, Amal Al-Omari, Hadeel AbdelKhaleq

Abstract:

Prediction of the survival time of patients with cancer is a core factor that influences oncologists' decisions in several respects, such as the treatment plans offered, patients' quality of life, and medication development. For a long time, proportional hazards Cox regression (PH Cox) was, and still is, the most widely used statistical method to predict survival outcomes. However, with the rise of data science, new prediction models have been employed and have proved to be more flexible and more accurate in this type of study. The artificial neural network is one of the models suited to time-to-event prediction. In this study, we aim to compare PH Cox regression with the artificial neural network method in terms of data handling and the accuracy of each model.
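For readers unfamiliar with the two approaches being compared, the following minimal sketch shows a proportional hazards Cox fit on a toy time-to-event table using the lifelines package; the data frame, column names, and values are hypothetical, and a neural-network survival model would be benchmarked against the same split.

```python
# Illustrative sketch only: fitting a proportional hazards Cox model with
# lifelines. The data frame and column names are hypothetical placeholders.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "survival_months": [12, 34, 7, 56, 23, 41, 18, 60],   # time to event or censoring
    "event_observed":  [1, 0, 1, 1, 0, 1, 1, 0],          # 1 = death observed, 0 = censored
    "age":             [62, 55, 70, 48, 66, 59, 72, 50],
    "tumor_stage":     [3, 2, 4, 2, 3, 1, 4, 1],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="survival_months", event_col="event_observed")
cph.print_summary()  # hazard ratios, confidence intervals, p-values
```

A neural-network alternative would then be compared on the same held-out patients, for example via the concordance index.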

Keywords: Cox regression, neural networks, survival, cancer

Procedia PDF Downloads 203
2315 Argument Representation of Non-Spatial Motion in Bahasa Melayu Based on Conceptual Structure Theory

Authors: Nurul Jamilah Binti Rosly

Abstract:

The typology of motion is usually understood as a change from one location to another. From a conceptual point of view, however, motion can also occur in non-spatial contexts associated with human and social factors. Conceptually, non-spatial motion thus involves the movement of time, ownership, identity, state, and existence. Accordingly, this study focuses on the lexical items share, accept, be, store, and exist as the study material. The data were extracted from the Language and Literature Corpus Database, Malaysia, and analyzed at the semantic and syntactic levels using Conceptual Structure Theory (Ray Jackendoff, 2002). Semantic representations are expressed as conceptual structures over argument functions, including [events], [situations], [objects], [paths], and [places]. The findings show that the mapping of these arguments comprises three main stages, namely mapping the argument structure, mapping the tree, and mapping the thematic roles of items. Accordingly, this study presents the argument representation of non-spatial motion in Malay.

Keywords: arguments, concepts, constituencies, events, situations, thematics

Procedia PDF Downloads 132
2314 Getting It Right Before Implementation: Using Simulation to Optimize Recommendations and Interventions After Adverse Event Review

Authors: Melissa Langevin, Natalie Ward, Colleen Fitzgibbons, Christa Ramsey, Melanie Hogue, Anna Theresa Lobos

Abstract:

Description: Root Cause Analysis (RCA) is used by health care teams to examine adverse events (AEs) to identify causes, which then leads to recommendations for prevention. Despite widespread use, RCA has limitations. Best practices have not been established for implementing recommendations or tracking the impact of interventions after AEs. During phase 1 of this study, we used simulation to analyze two fictionalized AEs that occurred in hospitalized paediatric patients to identify and understand how the errors occurred and to generate recommendations to mitigate and prevent recurrences. Scenario A involved an error of commission (inpatient drug error), and Scenario B involved detecting an error that had already occurred (critical care drug infusion error). The recommendations generated were improved drug labeling, specialized drug kits, alert signs, and clinical checklists. Aim: Use simulation to optimize interventions recommended after critical event analysis prior to implementation in the clinical environment. Methods: Suggested interventions from Phase 1 were designed and tested through scenario simulation in the clinical environment (medicine ward or pediatric intensive care unit). Each scenario was simulated 8 times. Recommendations were tested using different, voluntary teams, and each scenario was debriefed to understand why the error was repeated despite interventions and how interventions could be improved. Interventions were modified in subsequent simulations until recommendations were felt to have an optimal effect and data saturation was achieved. Along with concrete suggestions for design and process change, qualitative data pertaining to employee communication and hospital standard work were collected and analyzed. Results: Each scenario had a total of three interventions to test. In scenario 1, the error was reproduced in the initial two iterations and mitigated following key intervention changes. In scenario 2, the error was identified immediately in all cases where the intervention checklist was utilized properly. Independently of the intervention changes and improvements, the simulation was beneficial in identifying which of these should be prioritized for implementation and highlighted that even the potential solutions most frequently suggested by participants did not always translate into error prevention in the clinical environment. Conclusion: We conclude that interventions that help to change a process (epinephrine kit or mandatory checklist) were more successful at preventing errors than passive interventions (signage, changes to memory aids). Given that even the most successful interventions needed modification and subsequent re-testing, simulation is key to optimizing suggested changes. Simulation is a safe, practice-changing modality for institutions to use prior to implementing recommendations from RCA following AE reviews.

Keywords: adverse events, patient safety, pediatrics, root cause analysis, simulation

Procedia PDF Downloads 155
2313 EFL Vocabulary Learning Strategies among Students in Greece, Their Preferences and Internet Technology

Authors: Theodorou Kyriaki, Ypsilantis George

Abstract:

Vocabulary learning has attracted a lot of attention in recent years, in contrast to its neglect in the past. Along with the interest in finding successful vocabulary teaching strategies, many scholars have focused on identifying the learning strategies used by language learners. As a result, more and more studies in the area of language pedagogy have investigated the use of strategies in vocabulary learning by different types of learners. A common instrument in this field is the questionnaire, a tool that was enriched here with questions involving current technology and administered to a sample of 300 Greek students aged 9 to 17 years. The strategies identified were grouped into the three categories of memory, cognitive, and compensatory strategies, and associations between these dependent variables were investigated. In addition, relations between the dependent and independent variables (such as age, sex, type of school, cultural background, and grade in English) were examined to investigate their impact on strategy selection. Finally, the results were compared with findings of other studies in the same field to contribute to a hypothesis of ethnic differences in strategy selection. The results initially discuss the preferred strategies of all participants and further indicate that: a) technology affects strategy selection, while b) differences between ethnic groups are not statistically significant. A number of successful strategies are presented, resulting from correlations between strategy selection and final school grade in English.

Keywords: acquisition of English, internet technology, research among Greek students, vocabulary learning strategies

Procedia PDF Downloads 512
2312 Comparison Between Bispectral Index-Guided Anesthesia and Standard Anesthesia Care in Middle-Aged Adult Patients Undergoing Modified Radical Mastectomy

Authors: Itee Chowdhury, Shikha Modi

Abstract:

Introduction: Cancer is beginning to outpace cardiovascular disease as a cause of death; it affects every major organ system and has profound implications for perioperative management. Breast cancer is the most common cancer in women in India, accounting for 27% of all cancers. Small changes in the analgesic management of cancer patients can greatly improve prognosis and reduce the risk of postsurgical cancer recurrence, as opioid-based analgesia has a deleterious effect on cancer outcomes. Shortened postsurgical recovery time facilitates an earlier return to intended oncological therapy, maximising the chance of successful treatment. The literature reveals that, since FDA approval, the role of BIS has been assessed in various types of surgery, but clinical data on its use in oncosurgical patients are scarce. Our study focuses on the role of BIS-guided anaesthesia for breast cancer surgery patients. Methods: A prospective randomized controlled study in patients aged 36-55 years scheduled for modified radical mastectomy was conducted with 51 patients in each group who met the inclusion and exclusion criteria, and randomization was done by the sealed envelope technique. In the BIS-guided anaesthesia group (group B), sevoflurane was titrated to keep the BIS value at 45-60, and thereafter, if the patient showed hypertension/tachycardia, an opioid was given. In the standard anaesthesia care group (group C), sevoflurane was titrated to keep the MAC in the range of 0.8-1, and fentanyl was given if the patient showed hypertension/tachycardia. Intraoperative opioid consumption was calculated. Post-surgery recovery characteristics, including the Aldrete score, were assessed. Patients were questioned about pain, PONV, and recall of intraoperative events. Comparisons of age, BMI, ASA status, recovery characteristics, opioid consumption, and VAS score were made using the non-parametric Mann-Whitney U test. Categorical data such as intraoperative awareness of surgery and PONV were analyzed using the Chi-square test. Comparisons of heart rate and MAP were made with an independent-sample t-test. The ggplot2 package was used to show the trend of the BIS index across all intraoperative time points for each patient. For statistical tests of significance, the cut-off p-value was set at <0.05. Conclusions: BIS monitoring led to reduced opioid consumption and early recovery from anaesthesia in breast cancer patients undergoing MRM, resulting in less postoperative nausea and vomiting and lower pain intensity in the immediate postoperative period, without any recall of intraoperative events. Thus, the use of a Bispectral index monitor allows anaesthesia administration to be tailored with a good outcome.
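As a rough illustration of the tests named in the Methods (Mann-Whitney U for continuous outcomes, Chi-square for categorical ones), the sketch below uses SciPy on invented numbers; it is not the trial's analysis code.

```python
# Illustrative sketch of the statistical tests named above; the numbers are
# placeholders, not the trial's data.
from scipy import stats

# Intraoperative opioid consumption (µg) in the two hypothetical groups.
group_b = [80, 100, 60, 90, 70, 80]        # BIS-guided anaesthesia
group_c = [120, 150, 100, 130, 140, 110]   # standard anaesthesia care
u_stat, p_u = stats.mannwhitneyu(group_b, group_c, alternative="two-sided")

# PONV counts (yes/no) per group as a 2x2 contingency table.
table = [[8, 43],   # group B: PONV yes, PONV no
         [19, 32]]  # group C: PONV yes, PONV no
chi2, p_chi, dof, _ = stats.chi2_contingency(table)

print(f"Mann-Whitney U p={p_u:.3f}, chi-square p={p_chi:.3f}")
```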

Keywords: bispectral index, depth of anaesthesia, recovery, opioid consumption

Procedia PDF Downloads 130
2311 A Machine Learning Approach for Classification of Directional Valve Leakage in the Hydraulic Final Test

Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter

Abstract:

Due to increasing cost pressure in global markets, artificial intelligence is becoming a technology that is decisive for competition. Predictive quality enables machinery and plant manufacturers to ensure product quality by using data-driven forecasts via machine learning models as a decision-making basis for test results. The use of cross-process Bosch production data along the value chain of hydraulic valves is a promising approach to classifying the quality characteristics of workpieces.
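A minimal sketch of the supervised-classification idea described above is given below, assuming synthetic final-test features and labels; it is not the Bosch production pipeline.

```python
# Illustrative sketch (not the Bosch pipeline): training a supervised
# classifier on hydraulic final-test features; features and labels are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                     # e.g. pressure drop, flow, temperature, cycle time
y = (X[:, 0] + 0.5 * X[:, 1] > 0.8).astype(int)   # 1 = leaking valve (synthetic label)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```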

Keywords: predictive quality, hydraulics, machine learning, classification, supervised learning

Procedia PDF Downloads 234
2310 The Concentration of Formaldehyde in Rainwater and Typhoon Rainwater at Sakai City, Japan

Authors: Chinh Nguyen Nhu Bao, Hien To Thi, Norimichi Takenaka

Abstract:

Formaldehyde (HCHO) concentrations in rainwater, including rain from tropical storms, in Sakai City, Osaka, Japan, were measured continuously during rain events by a purpose-developed chemiluminescence method. Formaldehyde levels ranged from 15 µg/L to 500 µg/L. High HCHO concentrations in rainwater were associated with wind directions from the south and west sides of Sakai City, where chemical, oil-refining, and steel manufacturers are located. In-situ irradiation experiments on rainwater samples were conducted to demonstrate both the aqueous-phase photo-production and the degradation of HCHO. In the daytime, aqueous-phase photolysis is the source of HCHO in rainwater (4.52 ± 5.74 µg/L/h under an in-situ UV light source, 2.84-8.96 µg/L/h under sunlight). At night, however, degradation is driven by microorganisms.

Keywords: chemiluminescence, formaldehyde, rainwater, typhoon

Procedia PDF Downloads 167
2309 Development of Risk Assessment and Occupational Safety Management Model for Building Construction Projects

Authors: Preeda Sansakorn, Min An

Abstract:

In order to deal with the uncertainties, subjectivities, and vagueness arising in building construction projects, the application of a fuzzy reasoning technique based on fuzzy set theory is proposed. This study contributes significantly to the development of a fuzzy reasoning safety risk assessment model for building construction projects that can be employed to assess the risk magnitude of each hazardous event identified during construction; a third parameter, the probability of consequence, is incorporated in the model. By using the proposed safety risk analysis methodology, more reliable and less ambiguous results can be provided to the safety risk management project team for decision-making purposes.
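To make the fuzzy reasoning idea concrete, the following sketch scores one hazardous event with triangular membership functions over likelihood, severity, and probability of consequence; the scales, rule, and numbers are simplified inventions, not the proposed model.

```python
# Illustrative sketch of a fuzzy risk-magnitude calculation using triangular
# membership functions; scales, rule, and values are invented simplifications.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership with support [a, c] and peak at b."""
    x = np.asarray(x, dtype=float)
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

# Crisp expert ratings (0-10 scale) for one hazardous event -- invented values.
likelihood, severity, prob_conseq = 6.5, 8.0, 5.5

# Membership of each rating in its "high" fuzzy set.
mu_l = tri(likelihood, 4, 7, 10)
mu_s = tri(severity, 4, 7, 10)
mu_p = tri(prob_conseq, 4, 7, 10)

# One Mamdani-style rule: IF likelihood, severity and probability of
# consequence are all high THEN risk is high (min as the AND operator).
firing = min(mu_l, mu_s, mu_p)

# Clip the "high risk" output set at the firing strength and defuzzify.
x = np.linspace(0, 10, 101)
risk_set = np.minimum(firing, tri(x, 5, 7.5, 10))
risk_magnitude = (x * risk_set).sum() / risk_set.sum()  # centroid defuzzification
print(f"defuzzified risk magnitude ~ {risk_magnitude:.1f} / 10")
```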

Keywords: safety risk assessment, building construction safety, fuzzy reasoning, construction risk assessment model, building construction projects

Procedia PDF Downloads 494
2308 The Effect of Social Media Influencers on Boycott Participation through Attitude toward the Offending Country in a Situational Animosity Context

Authors: Hsing-Hua Stella Chang, Mong-Ching Lin, Cher-Min Fong

Abstract:

The use of surrogate boycotts as a coercive tactic to force an offending party to change its practices has grown increasingly significant over the last several decades and is expected to increase in the future. Research shows that surrogate boycotts are often triggered by controversial international events, with particular foreign countries serving as the offending party in the international marketplace. In other words, multinational corporations are likely to become surrogate boycott targets in overseas markets because of the animosity between their home and host countries. Focusing on a surrogate boycott triggered by severe situational animosity, this research examines how social media influencers (SMIs), serving as electronic key opinion leaders (EKOLs) in an international crisis, facilitate and organize a boycott and persuade consumers to participate in it. This research suggests that SMIs can be a particularly important information source in a surrogate boycott sparked by situational animosity: under such a context, SMIs become a critical information source for individuals to enhance and update their understanding of the event because, unlike traditional media, social media serve as a platform for instant, round-the-clock information access and dissemination. The Xinjiang cotton event, viewed as an ongoing inter-country conflict reflecting a crisis that provokes animosity against the West, was adopted as the research context. Through online panel services, both studies recruited Mainland Chinese nationals as survey respondents. The findings show that: 1. Social media influencer messages are positively related to a negative attitude toward the offending country. 2. Attitude toward the offending country is positively related to boycott participation. To address the unexplored question of how social media influencers affect consumer participation in boycotts, this research presents a finer-grained examination of boycott motivation, with a special focus on a situational animosity context. The research is split into two interrelated parts. The first part shows that attitudes toward the offending country can be socially constructed by the influence of social media influencers in a situational animosity context; the results show that consumers perceive different strengths of social pressure related to various levels of influencer messages and thus exhibit different levels of attitude toward the offending country. The second part further investigates the effect of attitude toward the offending country on boycott participation, and the findings show that this attitude exacerbated the effect of social media influencer messages on boycott participation in a situation of animosity.

Keywords: animosity, social media marketing, boycott, attitude toward the offending country

Procedia PDF Downloads 114
2307 Polyethylene Terephthalate Plastic Degradation by Fungus Rasamsonia Emersonii

Authors: Naveen Kumar

Abstract:

Microplastics, tiny plastic particles less than 5 mm in size formed by the disposal and breakdown of industrial and consumer products, have become a primary environmental concern due to their ubiquitous presence in the environment and their potential to harm ecosystems, wildlife, and human health. In this study, we examine the ability of the fungus Rasamsonia emersonii IMI 393752 to degrade rigid microplastics from Coke bottles. Microplastics were extracted from Coke bottles and incubated with Rasamsonia emersonii in Sabouraud dextrose agar media; the microplastics were pre-sterilized without altering their chemistry. Preliminary analysis was performed by comparing radial growth on microplastic-containing media enriched with the fungus against a control. The assay confirmed that introducing microplastics did not impede or change the fungus's growth pattern or rate. The degradation of the microplastics was monitored over time using microscopy and FTIR, and biodegradation/deterioration of the plastic surface was observed. Furthermore, a liquid assay was performed. HPLC and GC-MS will be conducted to confirm the biodegradation and the presence of enzymes released by the fungus to counteract the microplastics. These findings have important implications for managing plastic waste, as they suggest that fungi such as Rasamsonia emersonii can potentially degrade microplastics safely and effectively. However, further research to optimise the conditions for microplastic degradation by Rasamsonia emersonii and to develop strategies for scaling up the process for industrial applications will be beneficial.

Keywords: bioremediation, mycoremediation, plastic degradation, polyethylene terephthalate

Procedia PDF Downloads 99
2306 The Language of Fliptop among Filipino Youth: A Discourse Analysis

Authors: Bong Borero Lumabao

Abstract:

This qualitative research is a study of the lines of Fliptop talks performed by Fliptop rappers, employing Finnegan's (2008) discourse analysis. The paper aimed to analyze the phonological, morphological, and semantic features of Fliptop talk, to explore the structures in the lines of Fliptop among Filipino youth, and to uncover the various insights that can be gained from it. The corpora of the study included all 20 Fliptop videos downloaded from the Fliptop YouTube channel. Results revealed that Fliptop contains phonological features such as assonance, consonance, deletion, lengthening, and rhyming. Morphological features include acronyms, affixation, blending, borrowing, code-mixing and switching, compounding, conversion or functional shifts, and dysphemism. The semantic analysis presented the lexical categories, meanings, and words used in the Fliptop talks. The structure of Fliptop revolves around personal attacks (physical attributes), attacks on the bars (rapping skills), extension to family members and friends, antithesis, profane words, figurative language, sexual undertones, anime characters, homosexuality, and the involvement of famous celebrities.

Keywords: discourse analysis, fliptop talks, filipino youth, fliptop videos, Philippines

Procedia PDF Downloads 249
2305 Conversion of Glycerol to 3-Hydroxypropanoic Acid by Genetically Engineered Bacillus subtilis

Authors: Aida Kalantari, Boyang Ji, Tao Chen, Ivan Mijakovic

Abstract:

3-hydroxypropanoic acid (3-HP) is one of the most important biomass-derivable platform chemicals that can be converted into a number of industrially important compounds. There have been several attempts at production of 3-HP from renewable sources in cell factories, focusing mainly on Escherichia coli, Klebsiella pneumoniae, and Saccharomyces cerevisiae. Despite the significant progress made in this field, commercially exploitable large-scale production of 3-HP in microbial strains has still not been achieved. In this study, we investigated the potential of Bacillus subtilis to be used as a microbial platform for bioconversion of glycerol into 3-HP. Our recombinant B. subtilis strains overexpress the two-step heterologous pathway containing glycerol dehydratase and aldehyde dehydrogenase from various backgrounds. The recombinant strains harboring the codon-optimized synthetic pathway from K. pneumoniae produced low levels of 3-HP. Since the enzymes in the heterologous pathway are sensitive to oxygen, we had to perform our experiments in micro-aerobic conditions. Under these conditions, the cell produces lactate in order to regenerate NAD+, and we found the lactate production to be in competition with the production of 3-HP. Therefore, based on the in silico predictions, we knocked out the glycerol kinase (glpk), which in combination with growth on glucose, resulted in improving the 3-HP titer to 1 g/L and the removal of lactate. Cultivation of the same strain in an enriched medium improved the 3-HP titer up to 7.6 g/L. Our findings provide the first report of successful introduction of the biosynthetic pathway for conversion of glycerol into 3-HP in B. subtilis.

Keywords: bacillus subtilis, glycerol, 3-hydroxypropanoic acid, metabolic engineering

Procedia PDF Downloads 248
2304 Artificial Intelligence-Based Thermal Management of Battery System for Electric Vehicles

Authors: Raghunandan Gurumurthy, Aricson Pereira, Sandeep Patil

Abstract:

The escalating adoption of electric vehicles (EVs) across the globe has underscored the critical importance of advancing battery system technologies. This has catalyzed a shift towards the design and development of battery systems that not only exhibit higher energy efficiency but also boast enhanced thermal performance and sophisticated multi-material enclosures. A significant leap in this domain has been the incorporation of simulation-based design optimization for battery packs and Battery Management Systems (BMS), a move further enriched by integrating artificial intelligence/machine learning (AI/ML) approaches. These strategies are pivotal in refining the design, manufacturing, and operational processes for electric vehicles and energy storage systems. By leveraging AI/ML, stakeholders can now predict battery performance metrics—such as State of Health, State of Charge, and State of Power—with unprecedented accuracy. Furthermore, as Li-ion batteries (LIBs) become more prevalent in urban settings, the imperative for bolstering thermal and fire resilience has intensified. This has propelled Battery Thermal Management Systems (BTMs) to the forefront of energy storage research, highlighting the role of machine learning and AI not just as tools for enhanced safety management through accurate temperature forecasts and diagnostics but also as indispensable allies in the early detection and warning of potential battery fires.
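As a hedged illustration of the State-of-Health prediction mentioned above, the sketch below trains a regression model on synthetic cycling features; the feature set and degradation law are invented for demonstration only.

```python
# Illustrative sketch only: predicting battery State of Health (SoH) from
# cycling features with a regression model; the data are synthetic.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(1)
n = 400
cycles   = rng.uniform(0, 1500, n)    # charge/discharge cycles completed
avg_temp = rng.uniform(15, 45, n)     # average cell temperature (deg C)
avg_dod  = rng.uniform(0.2, 1.0, n)   # average depth of discharge
# Invented degradation law used only to generate labels for the sketch.
soh = 100 - 0.02 * cycles - 0.3 * (avg_temp - 25) - 5 * avg_dod + rng.normal(0, 1, n)

X = np.column_stack([cycles, avg_temp, avg_dod])
X_tr, X_te, y_tr, y_te = train_test_split(X, soh, test_size=0.25, random_state=1)
model = GradientBoostingRegressor(random_state=1).fit(X_tr, y_tr)
print(f"MAE on held-out cells: {mean_absolute_error(y_te, model.predict(X_te)):.2f} % SoH")
```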

Keywords: electric vehicles, battery thermal management, industrial engineering, machine learning, artificial intelligence, manufacturing

Procedia PDF Downloads 99
2303 Introduction of the Harmfulness of the Seismic Signal in the Assessment of the Performance of Reinforced Concrete Frame Structures

Authors: Kahil Amar, Boukais Said, Kezmane Ali, Hannachi Naceur Eddine, Hamizi Mohand

Abstract:

The principle of seismic performance evaluation methods is to provide a measure of how susceptible a building or set of buildings is to damage by an earthquake. The common objective of many of these methods is to supply classification criteria. The purpose of this study is to present a method for assessing the seismic performance of structures based on the pushover method. We are particularly interested in reinforced concrete frame structures, which represent a significant percentage of damaged structures after a seismic event. The work is based on the characterization of the seismic motion of the various earthquake zones in terms of PGA and PGD, obtained by means of the SIMQK_GR and PRISM software, and a correlation between the performance points and the scalars characterizing the earthquakes is developed.

Keywords: seismic performance, pushover method, characterization of seismic motion, harmfulness of the seismic signal

Procedia PDF Downloads 384
2302 Heritage Tree Expert Assessment and Classification: Malaysian Perspective

Authors: B.-Y.-S. Lau, Y.-C.-T. Jonathan, M.-S. Alias

Abstract:

Heritage trees are large, naturally occurring individual trees of exceptional value due to their association with age, events, or distinguished people. In Malaysia, there is an abundance of tropical heritage trees throughout the country. It is essential to set up a repository of heritage trees to prevent valuable trees from being cut down. In this cross-domain study, a web-based online expert system, namely the Heritage Tree Expert Assessment and Classification (HTEAC) system, is developed and deployed for the public to nominate potential heritage trees. Based on each nomination, tree care experts or arborists evaluate and verify the nominated trees as heritage trees. The expert system automatically rates the approved heritage trees according to pre-defined grades via the Delphi technique. Features and a usability test of the expert system are presented. Preliminary results are promising for the system to be used as a full-scale public system.

Keywords: arboriculture, Delphi, expert system, heritage tree, urban forestry

Procedia PDF Downloads 316
2301 Incorporation of Noncanonical Amino Acids into Hard-to-Express Antibody Fragments: Expression and Characterization

Authors: Hana Hanaee-Ahvaz, Monika Cserjan-Puschmann, Christopher Tauer, Gerald Striedner

Abstract:

Incorporation of noncanonical amino acids (ncAA) into proteins has become an interesting topic as proteins featured with ncAAs offer a wide range of different applications. Nowadays, technologies and systems exist that allow for the site-specific introduction of ncAAs in vivo, but the efficient production of proteins modified this way is still a big challenge. This is especially true for 'hard-to-express' proteins where low yields are encountered even with the native sequence. In this study, site-specific incorporation of azido-ethoxy-carbonyl-Lysin (azk) into an anti-tumor-necrosis-factor-α-Fab (FTN2) was investigated. According to well-established parameters, possible site positions for ncAA incorporation were determined, and corresponding FTN2 genes were constructed. Each of the modified FTN2 variants has one amber codon for azk incorporated either in its heavy or light chain. The expression level for all variants produced was determined by ELISA, and all azk variants could be produced with a satisfactory yield in the range of 50-70% of the original FTN2 variant. In terms of expression yield, neither the azk incorporation position nor the subunit modified (heavy or light chain) had a significant effect. We confirmed correct protein processing and azk incorporation by mass spectrometry analysis, and antigen-antibody interaction was determined by surface plasmon resonance analysis. The next step is to characterize the effect of azk incorporation on protein stability and aggregation tendency via differential scanning calorimetry and light scattering, respectively. In summary, the incorporation of ncAA into our Fab candidate FTN2 worked better than expected. The quantities produced allowed a detailed characterization of the variants in terms of their properties, and we can now turn our attention to potential applications. By using click chemistry, we can equip the Fabs with additional functionalities and make them suitable for a wide range of applications. We will now use this option in a first approach and develop an assay that will allow us to follow the degradation of the recombinant target protein in vivo. Special focus will be laid on the proteolytic activity in the periplasm and how it is influenced by cultivation/induction conditions.

Keywords: degradation, FTN2, hard-to-express protein, non-canonical amino acids

Procedia PDF Downloads 236
2300 A Decision-Support Tool for Humanitarian Distribution Planners in the Face of Congestion at Security Checkpoints: A Real-World Case Study

Authors: Mohanad Rezeq, Tarik Aouam, Frederik Gailly

Abstract:

In times of armed conflicts, various security checkpoints are placed by authorities to control the flow of merchandise into and within areas of conflict. The flow of humanitarian trucks that is added to the regular flow of commercial trucks, together with the complex security procedures, creates congestion and long waiting times at the security checkpoints. This causes distribution costs to increase and shortages of relief aid to the affected people to occur. Our research proposes a decision-support tool to assist planners and policymakers in building efficient plans for the distribution of relief aid, taking into account congestion at security checkpoints. The proposed tool is built around a multi-item humanitarian distribution planning model based on multi-phase design science methodology that has as its objective to minimize distribution and back ordering costs subject to capacity constraints that reflect congestion effects using nonlinear clearing functions. Using the 2014 Gaza War as a case study, we illustrate the application of the proposed tool, model the underlying relief-aid humanitarian supply chain, estimate clearing functions at different security checkpoints, and conduct computational experiments. The decision support tool generated a shipment plan that was compared to two benchmarks in terms of total distribution cost, average lead time and work in progress (WIP) at security checkpoints, and average inventory and backorders at distribution centers. The first benchmark is the shipment plan generated by the fixed capacity model, and the second is the actual shipment plan implemented by the planners during the armed conflict. According to our findings, modeling and optimizing supply chain flows reduce total distribution costs, average truck wait times at security checkpoints, and average backorders when compared to the executed plan and the fixed-capacity model. Finally, scenario analysis concludes that increasing capacity at security checkpoints can lower total operations costs by reducing the average lead time.
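The following toy sketch illustrates the role of a saturating (nonlinear) clearing function at a single checkpoint and the cost trade-off it creates; the functional form, parameters, and costs are illustrative assumptions, not the paper's planning model.

```python
# Illustrative sketch of a nonlinear clearing function at a single checkpoint
# and a toy daily release trade-off; all parameters are invented and this is
# not the paper's multi-item planning model.
import numpy as np
from scipy.optimize import minimize_scalar

C_MAX, K = 120.0, 40.0   # max daily clearing capacity (trucks), congestion parameter

def cleared(wip):
    """Saturating clearing function: throughput grows with WIP but flattens."""
    return C_MAX * wip / (K + wip)

def daily_cost(released, demand=100.0, h=1.0, b=5.0):
    """Waiting cost on trucks stuck at the checkpoint plus backorder cost."""
    through = cleared(released)
    waiting = released - through            # trucks still queued at the checkpoint
    shortage = max(demand - through, 0.0)   # unmet relief-aid demand
    return h * waiting + b * shortage

res = minimize_scalar(daily_cost, bounds=(0.0, 400.0), method="bounded")
print(f"best daily release ~ {res.x:.0f} trucks, cost {res.fun:.1f}")
```

The key point the sketch captures is that releasing more trucks than the checkpoint can clear only inflates waiting time, while releasing too few creates backorders; the planning model balances the two.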

Keywords: humanitarian distribution planning, relief-aid distribution, congestion, clearing functions

Procedia PDF Downloads 83
2299 Improved Performance in Content-Based Image Retrieval Using Machine Learning Approach

Authors: B. Ramesh Naik, T. Venugopal

Abstract:

This paper presents a novel approach that improves the high-level semantics of images based on a machine learning approach. Contemporary approaches for image retrieval and object recognition include Fourier transforms, wavelets, SIFT, and HoG. Though these descriptors are helpful in a wide range of applications, they exploit zero-order statistics and therefore lack high descriptiveness of image features. These descriptors usually exploit primitive visual features such as shape, color, texture, and spatial location to describe images, and such features are not adequate to describe the high-level semantics of images. This leads to a semantic gap that causes unacceptable performance in image retrieval systems. A novel method referred to as discriminative learning, derived from a machine learning approach, is proposed to efficiently discriminate image features. The analysis and results of the proposed approach were validated thoroughly on the WANG and Caltech-101 databases. The results show that this approach is very competitive in content-based image retrieval.
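For context, the sketch below shows the kind of conventional descriptor-plus-classifier pipeline (HoG features fed to a max-margin classifier) that the proposed discriminative learning method is positioned against; the images and labels are random placeholders, not WANG or Caltech-101 data.

```python
# Illustrative sketch only: a conventional HoG descriptor fed to a
# discriminative (max-margin) classifier; images are random arrays standing
# in for semantic categories, so the accuracy is meaningless by design.
import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
images = rng.random((60, 64, 64))    # 60 grey-level "images"
labels = rng.integers(0, 2, 60)      # 2 fake semantic categories

features = np.array([
    hog(img, orientations=8, pixels_per_cell=(16, 16), cells_per_block=(1, 1))
    for img in images
])

clf = LinearSVC()  # discriminative decision boundary over descriptor space
print("cross-validated accuracy:", cross_val_score(clf, features, labels, cv=3).mean())
```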

Keywords: CBIR, discriminative learning, region weight learning, scale invariant feature transforms

Procedia PDF Downloads 184
2298 Explaining Irregularity in Music by Entropy and Information Content

Authors: Lorena Mihelac, Janez Povh

Abstract:

In 2017, we conducted a research study using data consisting of 160 musical excerpts from different musical styles to analyze the impact of the entropy of the harmony on the acceptability of music. In measuring the entropy of harmony, we were interested in unigrams (individual chords in the harmonic progression) and bigrams (the connection of two adjacent chords). In that study, 53 of the 160 musical excerpts were evaluated by participants as very complex, although the entropy of the harmonic progression (unigrams and bigrams) was calculated as low. We explained this by particularities of the chord progression, which affect the listener's feeling of complexity and acceptability. We evaluated the same data twice more, with new participants in 2018 and with the same participants for the third time in 2019. These three evaluations have shown that the same 53 musical excerpts, found to be difficult and complex in the 2017 study, again produce a strong feeling of complexity. It was proposed that the content of these musical excerpts, defined as "irregular," does not meet the listener's expectancy or the basic perceptual principles, creating a stronger feeling of difficulty and complexity. As the "irregularities" in these 53 musical excerpts seem to be perceived by the participants without their being aware of it, affecting pleasantness and the feeling of complexity, they have been defined as "subliminal irregularities" and the 53 musical excerpts as "irregular." In our recent study (2019) of the same data (used in the previous research), we proposed a new measure of the complexity of harmony, "regularity," based on the irregularities in the harmonic progression and other plausible particularities in the musical structure found in previous studies. In that study, we also proposed a list of 10 different particularities that we assumed affect the participants' perception of complexity in harmony. These ten particularities are tested in this paper by extending the analysis of our 53 irregular musical excerpts from harmony to melody. In examining the melody, we used the computational model "Information Dynamics of Music" (IDyOM) and two information-theoretic measures: entropy - the uncertainty of the prediction before the next event is heard - and information content - the unexpectedness of an event in a sequence. In order to describe the features of the melody in these musical examples, we used four different viewpoints: pitch, interval, duration, and scale degree. The results show that the texture of the melody (e.g., multiple voices, homorhythmic structure) and the structure of the melody (e.g., huge interval leaps, syncopated rhythm, implied harmony in compound melodies) in these musical excerpts affect the participants' perception of complexity. High information content values were found in compound melodies, in which implied harmonies seem to have suggested additional harmonies, affecting the participants' perception of the chord progression in the harmony by creating a sense of an ambiguous musical structure.
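A minimal sketch of the unigram and bigram entropy measures described above, applied to a toy chord progression, is given below; the progression and the information-content example are invented and do not reproduce the IDyOM model.

```python
# Illustrative sketch of the unigram/bigram entropy measures described above,
# applied to a toy chord progression (Roman-numeral labels are invented).
import math
from collections import Counter

progression = ["I", "IV", "V", "I", "vi", "IV", "V", "I"]

def entropy(events):
    """Shannon entropy (bits) of the empirical distribution over events."""
    counts = Counter(events)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

unigrams = progression
bigrams = list(zip(progression, progression[1:]))

print(f"unigram entropy: {entropy(unigrams):.2f} bits")
print(f"bigram entropy:  {entropy(bigrams):.2f} bits")

# Information content (unexpectedness) of one event under unigram frequencies.
counts = Counter(progression)
p_vi = counts["vi"] / len(progression)
print(f"information content of 'vi': {-math.log2(p_vi):.2f} bits")
```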

Keywords: entropy and information content, harmony, subliminal (ir)regularity, IDyOM

Procedia PDF Downloads 133
2297 A Two-Step, Temperature-Staged, Direct Coal Liquefaction Process

Authors: Reyna Singh, David Lokhat, Milan Carsky

Abstract:

The world crude oil demand is projected to rise to 108.5 million bbl/d by the year 2035. With reserves estimated at 869 billion tonnes worldwide, coal is an abundant resource. This work was aimed at producing a high-value hydrocarbon liquid product from the Direct Coal Liquefaction (DCL) process at comparatively mild operating conditions. Via hydrogenation, a temperature-staged approach was investigated. In a two-reactor lab-scale pilot plant facility, the objectives included maximising thermal dissolution of the coal in the presence of a hydrogen donor solvent in the first stage, and subsequently promoting hydrogen saturation and hydrodesulphurization (HDS) performance in the second. The feed slurry consisted of high-grade, pulverized bituminous coal on a moisture-free basis with a size fraction of < 100 μm, and Tetralin mixed in at 2:1 and 3:1 solvent/coal ratios. Magnetite (Fe3O4) at 0.25 wt% of the dry coal feed was added for the catalysed runs. For both stages, hydrogen gas was used to maintain a system pressure of 100 barg. In the first stage, temperatures of 250°C and 300°C and reaction times of 30 and 60 minutes were investigated in an agitated batch reactor. The first-stage liquid product was pumped into the second-stage vertical reactor, which was designed to counter-currently contact the hydrogen-rich gas stream and the incoming liquid flow in the fixed catalyst bed. Two commercial hydrotreating catalysts, Cobalt-Molybdenum (CoMo) and Nickel-Molybdenum (NiMo), were compared in terms of their conversion, selectivity, and HDS performance at temperatures 50°C higher than the respective first-stage tests. The catalysts were activated at 300°C with a hydrogen flowrate of approximately 10 ml/min prior to the testing. A gas-liquid separator at the outlet of the reactor ensured that the gas was exhausted to the online VARIOplus gas analyser. The liquid was collected and sampled for analysis using Gas Chromatography-Mass Spectrometry (GC-MS). Internal standard quantification of the sulphur content, the BTX (benzene, toluene, and xylene) and alkene quality, and the alkane and polycyclic aromatic hydrocarbon (PAH) compounds in the liquid products was guided by ASTM standards of practice for hydrocarbon analysis. In the first stage, using a 2:1 solvent/coal ratio, increased coal-to-liquid conversion was favoured by the lower operating temperature of 250°C, a 60-minute reaction time, and a system catalysed by magnetite. Tetralin functioned effectively as the hydrogen donor solvent. A 3:1 ratio favoured increased concentrations of the long-chain alkanes undecane and dodecane, the unsaturated alkenes octene and nonene, and PAH compounds such as indene. The second-stage product distribution showed an increase in the BTX quality of the liquid product and in branched-chain alkanes, and a reduction in the sulphur concentration. In terms of HDS performance and selectivity towards the production of long and branched-chain alkanes, NiMo performed better than CoMo, whereas CoMo was more selective towards cyclohexane. Over 16 days on stream each, NiMo had a higher activity than CoMo. The potential of the process to help cover the demand for low-sulphur crude diesel and solvents through the production of a high-value hydrocarbon liquid is thus demonstrated.

Keywords: catalyst, coal, liquefaction, temperature-staged

Procedia PDF Downloads 649
2296 Production of Buttermilk as a Bio-Active Functional Food by Utilizing Dairy Waste

Authors: Hafsa Tahir, Sanaullah Iqbal

Abstract:

Galactooligosaccharide (GOS) is a type of prebiotic which is mainly found in human milk. GOS stimulates the growth of beneficial bacteria in the human intestine. The aim of the present study was to develop a value-added product by producing the prebiotic GOS in buttermilk through transgalactosylation. Buttermilk is considered an industrial waste which is discarded after the production of butter and cream. It contains protein, minerals, vitamins, and a smaller amount of fat. Raw milk was pasteurized at 100 °C for butter production, and the transgalactosylation process was then induced in the buttermilk thus obtained to produce prebiotic GOS. Results showed that the enzyme (obtained from an Escherichia coli strain carrying a gene from Lactobacillus reuteri L103) at concentrations between 400-600 µl/5 ml can produce GOS in 30 minutes. Chemical analysis and sensory evaluation of plain and GOS-containing buttermilk showed no remarkable difference in their composition. Furthermore, the shelf-life study showed that there was no significant (P>0.05) difference between glass and pouch packaging of buttermilk. Buttermilk in pouch packaging maintained its stability for 6 days without the addition of preservatives. Therefore, it is recommended that buttermilk, which is generally considered a processing waste in dairy manufacturing, be turned into a cost-effective, nutritional, GOS-enriched functional food product. This will not only enhance the production efficiency of butter processing but will also create a new market opportunity for dairy manufacturers all over the world.

Keywords: buttermilk, galactooligosaccharide, shelf Life, transgalactosylation

Procedia PDF Downloads 293
2295 Daily Variations of Particulate Matter (PM10) in Industrial Sites in a Suburban Area of Sour El Ghozlane, Algeria

Authors: Sidali Khedidji, Riad Ladji, Noureddine Yassaa

Abstract:

In this study, particulate matter (PM10), which is hazardous for the environment and human health, was investigated in the suburban atmosphere of Sour El Ghozlane at a sampling point from March 2013 to April 2013. Ambient concentration measurements of polycyclic aromatic hydrocarbons were carried out as part of a regional study of the cement industry in Sour El Ghozlane. During sampling, airborne particulate matter was collected onto PTFE filters using two medium-volume samplers, with and without a size-selective inlet for PM10 and TSP, and each sampling period lasted approximately 24 h. The organic compounds were characterized using gas chromatography coupled with mass spectrometric detection (GC-MSD). Total PAH concentrations recorded in suburban Sour El Ghozlane ranged from 101 to 204 ng m-3. The gravimetric method was applied to the black smoke concentration data for the spring season. The 24 h average concentrations of PM10 and TSP in the suburban atmosphere of Sour El Ghozlane were found to be in the ranges 4.76–165.76 μg/m3 and 28.63–800.14 μg/m3, respectively, during the sampling period. Meteorological factors, such as relative humidity and temperature, were typically found to affect PM levels, especially PM10. Air temperature, however, did not seem to significantly affect TSP and PM10 mass concentrations. The guide value fixed by the European Community (40 μg/m3, not to be exceeded on more than 35 days) was exceeded in some samples. Moreover, it should be noted that the limit value fixed by the Algerian regulations (80 μg/m3) was exceeded in 3 samples during the study period.

Keywords: PAHs, PM10, TSP, particulate matter, cement industry

Procedia PDF Downloads 378
2294 A Pipeline for Detecting Copy Number Variation from Whole Exome Sequencing Using Comprehensive Tools

Authors: Cheng-Yang Lee, Petrus Tang, Tzu-Hao Chang

Abstract:

Copy number variations (CNVs) play an important role in many human diseases, such as autism, schizophrenia, and a number of cancers. Many disease-related variants are found in genome coding regions, and whole exome sequencing (WES) is a cost-effective and powerful technology for detecting variants that are enriched in exons, with potential applications in the clinical setting. Although several algorithms have been developed to detect CNVs using WES, and have been compared against other algorithms on their authors' own samples to find the most suitable methods, there has been no consistent dataset across most algorithms for evaluating CNV detection ability. On the other hand, most of the algorithms use a command-line interface, which may greatly limit the analysis capability of many laboratories. We create a series of simulated WES datasets from UCSC hg19 chromosome 22 and then evaluate the CNV detection ability of 19 algorithms from the OMICtools database using our simulated WES datasets. We compute the sensitivity, specificity, and accuracy of each algorithm to validate the exome-derived CNVs. After comparing the 19 algorithms from the OMICtools database, we construct a platform that installs all of the algorithms in a virtual machine, such as VirtualBox, which can be set up conveniently on local computers, and then create a simple script that is easy to use for detecting CNVs with the algorithms selected by users. We also build a table that details, for all of the algorithms, aspects such as input requirements and CNV detection ability, providing users with a specification for choosing the optimum algorithms.
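The per-algorithm validation metrics mentioned above reduce to simple confusion-matrix arithmetic, sketched below with placeholder counts rather than results from the 19 evaluated tools.

```python
# Illustrative sketch of the per-algorithm validation metrics; the counts
# below are placeholders, not results from the 19 evaluated tools.
def validation_metrics(tp, fp, tn, fn):
    sensitivity = tp / (tp + fn)            # recall of true simulated CNVs
    specificity = tn / (tn + fp)            # correctly rejected non-CNV regions
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

# Hypothetical confusion counts for one caller on the simulated chr22 dataset.
sens, spec, acc = validation_metrics(tp=42, fp=6, tn=130, fn=8)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}, accuracy={acc:.2f}")
```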

Keywords: whole exome sequencing, copy number variations, omictools, pipeline

Procedia PDF Downloads 321
2293 Progressive Multimedia Collection Structuring via Scene Linking

Authors: Aman Berhe, Camille Guinaudeau, Claude Barras

Abstract:

In order to facilitate information seeking in large collections of multimedia documents with long and progressive content (such as broadcast news or TV series), one can extract the semantic links that exist between semantically coherent parts of documents, i.e., scenes. The links can then create a coherent collection of scenes from which it is easier to perform content analysis, topic extraction, or information retrieval. In this paper, we focus on TV series structuring and propose two approaches for scene linking at different levels of granularity (episode and season): a fuzzy online clustering technique and a graph-based community detection algorithm. When evaluated on the first two seasons of the TV series Game of Thrones, we found that the fuzzy online clustering approach performed better than graph-based community detection at the episode level, while graph-based approaches showed better performance at the season level.
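A minimal sketch of the graph-based community detection approach is shown below, with scenes as nodes and invented similarity scores as edge weights; it uses NetworkX's greedy modularity method rather than the exact algorithm evaluated in the paper.

```python
# Illustrative sketch only: scenes as nodes, semantic similarity as weighted
# edges, and graph-based community detection to group linked scenes.
# The scene identifiers and similarity scores are invented placeholders.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

similar_scenes = [
    ("s01e01_sc03", "s01e02_sc01", 0.82),
    ("s01e01_sc03", "s01e05_sc07", 0.64),
    ("s01e02_sc01", "s01e05_sc07", 0.71),
    ("s01e03_sc02", "s01e04_sc06", 0.77),
]

G = nx.Graph()
G.add_weighted_edges_from(similar_scenes)

communities = greedy_modularity_communities(G, weight="weight")
for i, scenes in enumerate(communities):
    print(f"story line {i}: {sorted(scenes)}")
```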

Keywords: multimedia collection structuring, progressive content, scene linking, fuzzy clustering, community detection

Procedia PDF Downloads 104
2292 A Review on Artificial Neural Networks in Image Processing

Authors: B. Afsharipoor, E. Nazemi

Abstract:

Artificial neural networks (ANNs) are a powerful prediction tool that can be trained on a set of examples and are thus useful for nonlinear image processing. The present paper reviews several papers regarding applications of ANNs in image processing to shed light on the advantages and disadvantages of ANNs in this field. Different steps in the image processing chain, including pre-processing, enhancement, segmentation, object recognition, image understanding, and optimization using ANNs, are summarized. Furthermore, results on using multiple artificial neural networks (MANNs) are presented.

Keywords: neural networks, image processing, segmentation, object recognition, image understanding, optimization, MANN

Procedia PDF Downloads 413
2291 Exploring Managerial Approaches towards Green Manufacturing: A Thematic Analysis

Authors: Hakimeh Masoudigavgani

Abstract:

Since manufacturing firms deplete non-renewable resources and pollute air, soil, and water in a greatly unsustainable manner, industrial activities and the production of products are considered key contributors to adverse environmental impacts. Hence, management strategies and approaches that involve an effective supply chain decision process in the manufacturing sector can be extremely significant to the application of environmental initiatives. Green manufacturing (GM) is one of these strategies; it minimises negative effects on the environment by reducing greenhouse gas emissions, waste, and the consumption of energy and natural resources. This paper aims to explore which greening methods and mechanisms can be applied in the manufacturing supply chain and what the outcomes of adopting these methods are in terms of abating environmental burdens. The study is interpretive research with an exploratory approach, using thematic analysis by coding text and breaking down and grouping the content of the collected literature into various themes and categories. It is found that a green supply chain can be attained through the execution of pre-production strategies, including green building, eco-design, and green procurement, as well as a number of in-production and post-production strategies involving green manufacturing and green logistics. To achieve effective GM, the pre-production strategies are suggested to be employed. This paper defines GM as (1) the analysis of the ecological impacts generated by practices, products, production processes, and operational functions, and (2) the implementation of greening methods to reduce their damaging influences on the natural environment. Analysis means assessing, monitoring, and auditing practices in order to measure and pinpoint their harmful impacts. Moreover, the greening methods involved within GM (arranged from the lowest to the highest level of environmental compliance and technique) consist of:
• product stewardship (e.g., less use of toxic, non-renewable, and hazardous materials in the manufacture of the product; and stewardship of the environmental problems related to the product in all production, use, and end-of-life stages);
• process stewardship (e.g., controlling carbon emissions, energy and resource usage, transportation methods, and disposal; re-engineering polluting processes; recycling waste materials generated in production);
• lean and clean production practices (e.g., elimination of waste, materials replacement, materials reduction, resource-efficient consumption, energy-efficient usage, emission reduction, managerial assessment, waste re-use);
• use of eco-industrial parks (e.g., a shared warehouse, a shared logistics management system, an energy co-generation plant, effluent treatment).
However, the focus of this paper is only on methods related to the in-production phase, and further research is needed on both pre-production and post-production environmental innovations. The methods outlined in this investigation may be taken into account by policy and decision makers. Additionally, the proposed future research directions and identified gaps can be addressed by scholars and researchers. The paper compares and contrasts a variety of viewpoints and enhances the body of knowledge by building a definition of GM through synthesising the literature and categorising the strategic concept of greening methods, drivers, barriers, and successful implementation tactics.

Keywords: green manufacturing (GM), product stewardship, process stewardship, clean production, eco-industrial parks (EIPs)

Procedia PDF Downloads 583
2290 Theoretical Modelling of Molecular Mechanisms in Stimuli-Responsive Polymers

Authors: Catherine Vasnetsov, Victor Vasnetsov

Abstract:

Context: Thermo-responsive polymers are materials that undergo significant changes in their physical properties in response to temperature changes. These polymers have gained significant attention in research due to their potential applications in various industries and medicine. However, the molecular mechanisms underlying their behavior are not well understood, particularly in relation to cosolvency, which is crucial for practical applications. Research Aim: This study aimed to theoretically investigate the phenomenon of cosolvency in long-chain polymers using the Flory-Huggins statistical-mechanical framework. The main objective was to understand the interactions between the polymer, solvent, and cosolvent under different conditions. Methodology: The research employed a combination of Monte Carlo computer simulations and advanced machine-learning methods. The Flory-Huggins mean field theory was used as the basis for the simulations. Spinodal graphs and ternary plots were utilized to develop an initial computer model for predicting polymer behavior. Molecular dynamic simulations were conducted to mimic real-life polymer systems. Machine learning techniques were incorporated to enhance the accuracy and reliability of the simulations. Findings: The simulations revealed that the addition of very low or very high volumes of cosolvent molecules resulted in smaller radii of gyration for the polymer, indicating poor miscibility. However, intermediate volume fractions of cosolvent led to higher radii of gyration, suggesting improved miscibility. These findings provide a possible microscopic explanation for the cosolvency phenomenon in polymer systems. Theoretical Importance: This research contributes to a better understanding of the behavior of thermo-responsive polymers and the role of cosolvency. The findings provide insights into the molecular mechanisms underlying cosolvency and offer specific predictions for future experimental investigations. The study also presents a more rigorous analysis of the Flory-Huggins free energy theory in the context of polymer systems. Data Collection and Analysis Procedures: The data for this study was collected through Monte Carlo computer simulations and molecular dynamic simulations. The interactions between the polymer, solvent, and cosolvent were analyzed using the Flory-Huggins mean field theory. Machine learning techniques were employed to enhance the accuracy of the simulations. The collected data was then analyzed to determine the impact of cosolvent volume fractions on the radii of gyration of the polymer. Question Addressed: The research addressed the question of how cosolvency affects the behavior of long-chain polymers. Specifically, the study aimed to investigate the interactions between the polymer, solvent, and cosolvent under different volume fractions and understand the resulting changes in the radii of gyration. Conclusion: In conclusion, this study utilized theoretical modeling and computer simulations to investigate the phenomenon of cosolvency in long-chain polymers. The findings suggest that moderate cosolvent volume fractions can lead to improved miscibility, as indicated by higher radii of gyration. These insights contribute to a better understanding of the molecular mechanisms underlying cosolvency in polymer systems and provide predictions for future experimental studies. The research also enhances the theoretical analysis of the Flory-Huggins free energy theory.
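For reference, the standard ternary Flory-Huggins free energy of mixing that underlies this kind of polymer/solvent/cosolvent analysis can be written (per lattice site, treating solvent and cosolvent as single-site species) as shown below; this is the textbook form, not the authors' specific extended expression.

```latex
\frac{\Delta F_{\mathrm{mix}}}{k_B T}
  = \frac{\phi_p}{N_p}\ln\phi_p + \phi_s\ln\phi_s + \phi_c\ln\phi_c
  + \chi_{ps}\,\phi_p\phi_s + \chi_{pc}\,\phi_p\phi_c + \chi_{sc}\,\phi_s\phi_c ,
\qquad \phi_p + \phi_s + \phi_c = 1
```

Here the φ's are volume fractions of polymer, solvent, and cosolvent, N_p is the polymer chain length, and the χ parameters encode the pairwise polymer-solvent, polymer-cosolvent, and solvent-cosolvent interactions whose balance drives the cosolvency behaviour discussed above.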

Keywords: molecular modelling, flory-huggins, cosolvency, stimuli-responsive polymers

Procedia PDF Downloads 72
2289 From the “Movement Language” to Communication Language

Authors: Mahmudjon Kuchkarov, Marufjon Kuchkarov

Abstract:

The origin of 'human language' is still a mystery and one of the most interesting subjects of historical linguistics. The core element is the nature of labeling or coding things or processes with symbols and sounds. In this paper, we investigate humans' involuntary Paired Sounds and Shape Production (PSSP) and its contribution to the development of early human communication. In a study of twenty-six volunteers who performed many physical movements of varying difficulty, the research team investigated the natural, repeatable, and paired sound and shape productions during human activities. The paper claims the involvement of Paired Sounds and Shape Production (PSSP) in the phonetic origin of some modern words and the existence of similarities between elements of PSSP and characters of the classical Latin alphabet. The results may be used not only to support existing theories but also to take a closer look at the fundamental nature of the origin of languages.

Keywords: body shape, body language, coding, Latin alphabet, merging method, movement language, movement sound, natural sound, origin of language, pairing, phonetics, sound and shape production, word origin, word semantic

Procedia PDF Downloads 256
2288 Prediction of Nonlinear Torsional Behavior of High Strength RC Beams

Authors: Woo-Young Jung, Minho Kwon

Abstract:

Seismic design criteria based on the performance of structures have recently been adopted by practicing engineers in response to destructive earthquakes. A simple but efficient structural-analysis tool capable of predicting both strength and ductility is needed to analyze reinforced concrete (RC) structures under such events. A three-dimensional lattice model is developed in this study to analyze torsion in high-strength RC members. Optimization techniques for determining the optimal variables in each lattice model are introduced. Pure torsion tests of RC members are performed to validate the proposed model. Correlation studies between the numerical and experimental results confirm that the proposed model is well capable of representing the salient features of the experimental results.

Keywords: torsion, non-linear analysis, three-dimensional lattice, high-strength concrete

Procedia PDF Downloads 351