Search results for: Signal Processing
28 Familiarity with Intercultural Conflicts and Global Work Performance: Testing a Theory of Recognition Primed Decision-Making
Authors: Thomas Rockstuhl, Kok Yee Ng, Guido Gianasso, Soon Ang
Abstract:
Two meta-analyses show that intercultural experience is not related to intercultural adaptation or performance in international assignments. These findings have prompted calls for a deeper grounding of research on international experience in the phenomenon of global work. Two issues, in particular, may limit current understanding of the relationship between international experience and global work performance. First, intercultural experience is too broad a construct to sufficiently capture the essence of global work, which to a large part involves sensemaking and managing intercultural conflicts. Second, the psychological mechanisms through which intercultural experience affects performance remain under-explored, resulting in a poor understanding of how experience is translated into learning and performance outcomes. Drawing on recognition primed decision-making (RPD) research, the current study advances a cognitive processing model that highlights the importance of intercultural conflict familiarity. Compared to intercultural experience, intercultural conflict familiarity is a more targeted construct that captures individuals’ previous exposure to dealing with intercultural conflicts. Drawing on RPD theory, we argue that individuals’ intercultural conflict familiarity enhances their ability to make accurate judgments and generate effective responses when intercultural conflicts arise. In turn, the ability to make accurate situation judgments and effective situation responses is an important predictor of global work performance. A relocation program within a multinational enterprise provided the context to test these hypotheses using a time-lagged, multi-source field study. Participants were 165 employees (46% female; with an average of 5 years of global work experience) from 42 countries who relocated from country to regional offices as part of a global restructuring program.
Within the first two weeks of transfer to the regional office, employees completed measures of their familiarity with intercultural conflicts, cultural intelligence, cognitive ability, and demographic information. They also completed an intercultural situational judgment test (iSJT) to assess their situation judgment and situation response. The iSJT comprised four validated multimedia vignettes of challenging intercultural work conflicts and prompted employees to provide protocols of their situation judgment and situation response. Two research assistants, trained in intercultural management but blind to the study hypotheses, coded the quality of employees’ situation judgment and situation response. Three months later, supervisors rated employees’ global work performance. Results using multilevel modeling (vignettes nested within employees) support the hypotheses that greater familiarity with intercultural conflicts is positively associated with better situation judgment, and that situation judgment mediates the effect of intercultural familiarity on situation response quality. Also, aggregated situation judgment and situation response quality both predicted supervisor-rated global work performance. Theoretically, our findings highlight the important but under-explored role of familiarity with intercultural conflicts, representing a shift in attention away from the general nature of international experience assessed in terms of the number and length of overseas assignments. Also, our cognitive approach premised on RPD theory offers a new theoretical lens for understanding the psychological mechanisms through which intercultural conflict familiarity affects global work performance.
Third, and importantly, our study contributes to the global talent identification literature by demonstrating that the cognitive processes engaged in resolving intercultural conflicts predict actual performance in the global workplace.
Keywords: intercultural conflict familiarity, job performance, judgment and decision making, situational judgment test
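The mediation claim above (familiarity improves situation judgment, which in turn improves situation response) is the kind of indirect effect commonly summarized as a product of path coefficients. The sketch below is purely illustrative, not the authors' analysis: the path estimates and standard errors are invented placeholders, and the Sobel test shown is only one simple way such an indirect effect can be assessed.

```python
# Illustrative sketch (not the study's multilevel analysis): an indirect
# (mediated) effect a*b and its Sobel test statistic. All numbers below are
# hypothetical placeholders, not values from the paper.
import math

def sobel_z(a, b, se_a, se_b):
    """Sobel test statistic for the indirect effect a*b."""
    se_ab = math.sqrt(b**2 * se_a**2 + a**2 * se_b**2)
    return (a * b) / se_ab

# Hypothetical paths: a = familiarity -> situation judgment,
# b = situation judgment -> situation response (controlling for familiarity).
a, se_a = 0.40, 0.10
b, se_b = 0.50, 0.12
indirect = a * b
print(round(indirect, 3), round(sobel_z(a, b, se_a, se_b), 3))
```

A |z| above roughly 1.96 would indicate a statistically significant indirect effect at the 5% level under the Sobel approximation.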
Procedia PDF Downloads 179
27 Linguistic Insights Improve Semantic Technology in Medical Research and Patient Self-Management Contexts
Authors: William Michael Short
Abstract:
‘Semantic Web’ technologies such as the Unified Medical Language System Metathesaurus, SNOMED-CT, and MeSH have been touted as transformational for the way users access online medical and health information, enabling both the automated analysis of natural-language data and the integration of heterogeneous health-related resources distributed across the Internet through the use of standardized terminologies that capture concepts and relationships between concepts that are expressed differently across datasets. However, the approaches that have so far characterized ‘semantic bioinformatics’ have not yet fulfilled the promise of the Semantic Web for medical and health information retrieval applications. This paper argues, from the perspective of cognitive linguistics and cognitive anthropology, that four features of human meaning-making must be taken into account before the potential of semantic technologies can be realized for this domain. First, many semantic technologies operate exclusively at the level of the word. However, texts convey meanings in ways beyond lexical semantics. For example, transitivity patterns (distributions of active or passive voice) and modality patterns (configurations of modal constituents like may, might, could, would, should) convey experiential and epistemic meanings that are not captured by single words. Language users also naturally associate stretches of text with discrete meanings, so that whole sentences can be ascribed senses similar to the senses of words (so-called ‘discourse topics’). Second, natural language processing systems tend to operate according to the principle of ‘one token, one tag’. For instance, occurrences of the word sound must be disambiguated for part of speech: in context, is sound a noun or a verb or an adjective? In syntactic analysis, deterministic annotation methods may be acceptable.
But because natural language utterances are typically characterized by polyvalency and ambiguities of all kinds (including intentional ambiguities), such methods leave the meanings of texts highly impoverished. Third, ontologies tend to be disconnected from everyday language use and so struggle in cases where single concepts are captured through complex lexicalizations that involve profile shifts or other embodied representations. More problematically, concept graphs tend to capture ‘expert’ technical models rather than ‘folk’ models of knowledge and so may not match users’ common-sense intuitions about the organization of concepts in prototypical structures rather than Aristotelian categories. Fourth, and finally, most ontologies do not recognize the pervasively figurative character of human language. However, since the time of Galen, the widespread use of metaphor in the linguistic usage of both medical professionals and lay persons has been recognized. In particular, metaphor is a well-documented linguistic tool for communicating experiences of pain. Because semantic medical knowledge-bases are designed to help capture variations within technical vocabularies – rather than the kinds of conventionalized figurative semantics that practitioners as well as patients actually utilize in clinical description and diagnosis – they fail to capture this dimension of linguistic usage. The failure of semantic technologies in these respects degrades the efficiency and efficacy not only of medical research, where information retrieval inefficiencies can lead to direct financial costs to organizations, but also of care provision, especially in contexts of patients’ self-management of complex medical conditions.
Keywords: ambiguity, bioinformatics, language, meaning, metaphor, ontology, semantic web, semantics
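The ‘one token, one tag’ limitation discussed in the second point can be sketched with a toy frequency-based tagger: a deterministic tagger must commit an ambiguous word such as "sound" to a single part of speech regardless of context. The lexicon and counts below are invented for illustration; they are not drawn from any real tagset or corpus.

```python
# Toy illustration of the 'one token, one tag' problem: a deterministic
# tagger assigns each word its single most frequent tag, so context-dependent
# words like "sound" are forced into one reading. Lexicon counts are invented.
LEXICON = {
    "sound": {"NOUN": 120, "VERB": 40, "ADJ": 30},  # genuinely ambiguous
    "the": {"DET": 1000},
    "alarms": {"NOUN": 50, "VERB": 20},
}

def tag_deterministic(word):
    """Pick the single most frequent tag for a word, ignoring context."""
    tags = LEXICON.get(word, {"X": 1})
    return max(tags, key=tags.get)

# "the sound of alarms" (noun) and "arguments that sound plausible" (verb)
# both receive NOUN for "sound" -- only one reading can ever be produced.
print(tag_deterministic("sound"))
```

The point of the sketch is that no amount of tuning the frequency counts lets a one-tag-per-token scheme preserve the intentional or contextual ambiguity the abstract describes.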
Procedia PDF Downloads 132
26 The Prospects of Optimized KOH/Cellulose 'Papers' as Hierarchically Porous Electrode Materials for Supercapacitor Devices
Authors: Dina Ibrahim Abouelamaiem, Ana Jorge Sobrido, Magdalena Titirici, Paul R. Shearing, Daniel J. L. Brett
Abstract:
Global warming and the scarcity of fossil fuels have had a radical impact on the world economy and ecosystem. The urgent need for alternative energy sources has hence elicited extensive research into efficient and sustainable means of energy conversion and storage. Among the various electrochemical systems, supercapacitors have attracted significant attention in the last decade due to their high power delivery, long cycle life compared to batteries, and simple mechanism. Recently, the performance of these devices has improved drastically, as the tuning of nanomaterials has provided efficient charge and storage mechanisms. Carbon materials, in various forms, are believed to be pioneering the next generation of supercapacitors due to their attractive properties, which include high electronic conductivity, high surface area, and easy processing and functionalization. Cellulose has eco-friendly attributes and is a feasible replacement for man-made fibers. The carbonization of cellulose yields carbons, including activated carbon and graphite fibers. Activated carbons, in turn, are the most exploited candidates for supercapacitor electrode materials and can be complemented with pseudocapacitive materials to achieve high energy and power densities. In this work, the optimum functionalization conditions of cellulose for supercapacitor electrode materials have been investigated. The precursor was treated with potassium hydroxide (KOH) at different KOH/cellulose ratios prior to carbonization in an inert nitrogen atmosphere at 850 °C. The chalky products were washed, dried and characterized with different techniques, including transmission electron microscopy (TEM), X-ray tomography and nitrogen adsorption-desorption isotherms. The morphological characteristics and their effect on electrochemical performance were investigated in two- and three-electrode systems.
The KOH/cellulose ratios of 0.5:1 and 1:1 exhibited the highest performances, owing to their unique hierarchical porous network structure, high surface areas and low cell resistances. Both samples gave the best results in three-electrode systems and coin cells, with specific gravimetric capacitances as high as 187 F g-1 and 20 F g-1 at a current density of 1 A g-1 and retention rates of 72% and 70%, respectively. This is attributed to the morphology of the samples, which consisted of a well-balanced micro-, meso- and macro-porosity network structure. This study reveals that electrochemical performance does not depend solely on high surface area but also on an optimum pore size distribution, specifically at low current densities. The micro- and meso-pore contribution to the final pore structure was found to dominate at low KOH loadings, reaching ‘equilibrium’ with macropores at the optimum KOH loading, after which macropores dictate the porous network. A wide range of pore sizes is instrumental for the mobility and penetration of electrolyte ions in the porous structures. These findings highlight the influence of various morphological factors on double-layer capacitance and high rate performance. In addition, they open a platform for investigating the optimized conditions for double-layer capacitance, which can be coupled with pseudocapacitive materials to yield higher energy densities and capacities.
Keywords: carbon, electrochemical performance, electrodes, KOH/cellulose optimized ratio, morphology, supercapacitor
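A gravimetric capacitance such as the 187 F g-1 reported above is conventionally extracted from a galvanostatic discharge curve as C = I·Δt/(m·ΔV). The sketch below uses that standard relation; the discharge time and voltage window are illustrative values chosen so the result matches the reported figure, and are not measurements from the study.

```python
# Standard galvanostatic-discharge relation: C = I * dt / (m * dV).
# Electrode mass, discharge time and voltage window below are hypothetical,
# chosen only so the result lands on the 187 F/g reported for the optimum
# sample at 1 A/g (i.e., 1 mA of current per mg of active material).
def gravimetric_capacitance(current_a, discharge_s, mass_g, delta_v):
    """Specific capacitance in F per gram of active material."""
    return (current_a * discharge_s) / (mass_g * delta_v)

c = gravimetric_capacitance(current_a=0.005,   # 5 mA on a 5 mg electrode
                            discharge_s=187.0, # hypothetical discharge time
                            mass_g=0.005,
                            delta_v=1.0)       # hypothetical 1 V window
retained = c * 0.72  # 72% retention reported for the 0.5:1 sample
print(round(c, 1), round(retained, 1))
```

The same relation explains why retention is quoted as a percentage of the low-rate capacitance: at higher current densities the usable discharge time Δt shrinks faster than 1/I when ion transport is limiting.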
Procedia PDF Downloads 219
25 [Keynote Speech]: Evidence-Based Outcome Effectiveness Longitudinal Study on Three Approaches to Reduce Proactive and Reactive Aggression in Schoolchildren: Group CBT, Moral Education, Bioneurological Intervention
Authors: Annis Lai Chu Fung
Abstract:
Because aggression shows high stability throughout developmental stages and across generations, it should be a top priority for researchers and frontline helping professionals to develop prevention and intervention programmes for aggressive children and children at risk of developing aggressive behaviours. Although there is a substantial number of anti-bullying programmes, they have yielded disappointingly small effect sizes. This negligible practical significance can be attributed to the overly simplistic categorisation of the individuals involved as bullies or victims. In the past three decades, the distinction between reactive and proactive aggression has been well established. As children displaying reactively aggressive behaviours have distinct social-information processing patterns from those showing proactively aggressive behaviours, it is critical to identify the unique needs of the two subtypes when designing an intervention. The onset of reactive aggression and proactive aggression has been observed as early as 4.4 and 6.8 years of age, respectively. Such findings call for differential early intervention targeting these high-risk children. However, to the best of the author’s knowledge, the author was the first to establish an evidence-based intervention programme against reactive and proactive aggression. With the largest samples in the world, the author, over the past 10 years, explored three different approaches and their effectiveness against aggression quantitatively and qualitatively with a longitudinal design. The three approaches presented are (a) a cognitive-behavioural approach, (b) moral education, with Chinese martial arts and ethics as the medium, and (c) bioneurological measures (omega-3 supplementation). The studies adopted a multi-informant approach with repeated measures before and after the intervention and at follow-up assessment. Participants were recruited from primary and secondary schools in Hong Kong.
In the cognitive-behavioural approach, 66 reactive aggressors and 63 proactive aggressors, aged 11 to 17, were identified from 10,096 secondary-school children by questionnaire and subsequent structured interview. Participants underwent 10 group sessions specifically designed for each subtype of aggressor. Results revealed significant declines in aggression levels from baseline to the follow-up assessment after 1 year. In moral education through the Chinese martial arts, 315 high-risk aggressive children, aged 6 to 12 years, were selected from 3,511 primary-school children and randomly assigned to four types of 10-session intervention group, namely martial-skills-only, martial-ethics-only, both martial-skills-and-ethics, and physical fitness (placebo). Results showed that only the martial-skills-and-ethics group had a significant reduction in aggression after treatment and 6 months after treatment compared with the placebo group. In the bioneurological approach, 218 children, aged 8 to 17, were randomly assigned to an omega-3 supplement group and a placebo group. Results revealed that, compared with the placebo group, the omega-3 supplement group had significant declines in aggression levels at the 6-month follow-up assessment. All three approaches were effective in reducing proactive and reactive aggression. Traditionally, intervention programmes against aggressive behaviour have often adopted a cognitive and/or behavioural approach. However, the cognitive-behavioural approach for children has recently been challenged for its demanding requirements on cognitive ability, and traditional cognitive interventions may not be as beneficial in young children as in older populations. The present study offers an insightful perspective on aggression reduction measures.
Keywords: intervention, outcome effectiveness, proactive aggression, reactive aggression
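Outcome comparisons like the baseline-versus-follow-up declines above are typically reported alongside a standardized effect size. The sketch below computes Cohen's d with a pooled standard deviation; the aggression scores are invented placeholders, not data from any of the three studies.

```python
# Hedged sketch: Cohen's d with pooled SD, one common way to summarize the
# magnitude of a baseline-vs-follow-up difference. Scores below are invented.
import math

def cohens_d(group_a, group_b):
    """Standardized mean difference using the pooled standard deviation."""
    n1, n2 = len(group_a), len(group_b)
    m1, m2 = sum(group_a) / n1, sum(group_b) / n2
    v1 = sum((x - m1) ** 2 for x in group_a) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group_b) / (n2 - 1)
    pooled = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled

baseline  = [12, 15, 11, 14, 13, 16]  # hypothetical aggression scores
follow_up = [9, 11, 8, 10, 9, 12]
print(round(cohens_d(baseline, follow_up), 2))
```

By convention, d values near 0.2, 0.5 and 0.8 are read as small, medium and large effects, which is the scale behind the abstract's complaint about "disappointingly small effect sizes" in anti-bullying programmes.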
Procedia PDF Downloads 222
24 Classification Using Worldview-2 Imagery of Giant Panda Habitat in Wolong, Sichuan Province, China
Authors: Yunwei Tang, Linhai Jing, Hui Li, Qingjie Liu, Xiuxia Li, Qi Yan, Haifeng Ding
Abstract:
The giant panda (Ailuropoda melanoleuca) is an endangered species that mainly lives in central China, where bamboos act as the main food source of wild giant pandas. Knowledge of the spatial distribution of bamboos therefore becomes important for identifying the habitat of giant pandas. There have been ongoing studies mapping bamboos and other tree species using remote sensing. WorldView-2 (WV-2) is the first high-resolution commercial satellite with eight Multi-Spectral (MS) bands. Recent studies demonstrated that WV-2 imagery has high potential for the classification of tree species. Advanced classification techniques are important for utilising high spatial resolution imagery. It is generally agreed that object-based image analysis is more desirable than pixel-based analysis in processing high spatial resolution remotely sensed data. Classifiers that use spatial information combined with spectral information are known as contextual classifiers, and it is suggested that contextual classifiers can achieve greater accuracy than non-contextual classifiers. Thus, spatial correlation can be incorporated into classifiers to improve classification results. The study area is located in the Wuyipeng area in Wolong, Sichuan Province. The complex environment makes information extraction difficult since bamboos are sparsely distributed, mixed with brushes, and covered by other trees. Extensive fieldwork in Wuyipeng was carried out twice. The first campaign, on 11th June 2014, aimed at sampling feature locations for geometric correction and collecting training samples for classification. The second, on 11th September 2014, served to test the classification results. In this study, spectral separability analysis was first performed to select appropriate MS bands for classification. The reflectance analysis also provided information for expanding sample points under the circumstance of knowing only a few.
Then, a spatially weighted object-based k-nearest neighbour (k-NN) classifier was applied to the selected MS bands to identify seven land cover types (bamboo, conifer, broadleaf, mixed forest, brush, bare land, and shadow), accounting for spatial correlation within classes using geostatistical modelling. The spatially weighted k-NN method was compared with three alternatives: the traditional k-NN classifier, the Support Vector Machine (SVM) method and the Classification and Regression Tree (CART). Through field validation, the classification result obtained using the spatially weighted k-NN method was shown to have the highest overall classification accuracy (77.61%) and Kappa coefficient (0.729); the producer’s accuracy and user’s accuracy reached 81.25% and 95.12% for the bamboo class, respectively, also higher than the other methods. Photos of tree crowns were taken at sample locations using a fisheye camera so that canopy density could be estimated. It was found that it is difficult to identify bamboo in areas with a large canopy density (over 0.70); it is possible to extract bamboo in areas with a medium canopy density (from 0.2 to 0.7) and in sparse forest (canopy density less than 0.2). In summary, this study explores the ability of WV-2 imagery for bamboo extraction in a mountainous region of Sichuan. The study successfully identified the bamboo distribution, providing supporting knowledge for assessing the habitats of giant pandas.
Keywords: bamboo mapping, classification, geostatistics, k-NN, WorldView-2
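The weighted-voting idea behind the classifier described above can be sketched in a few lines. This is an illustrative toy, not the paper's implementation: it weights each neighbour's vote by inverse distance as a stand-in for the spatial-correlation weights the study derives from geostatistical modelling, and the training points are invented.

```python
# Toy weighted k-NN: each of the k nearest training samples votes for its
# class with weight 1/d (a stand-in for the paper's geostatistically derived
# spatial weights). Training coordinates and labels are invented.
import math
from collections import defaultdict

def weighted_knn(train, query, k=3):
    """train: list of ((coords...), label); returns the weighted-vote winner."""
    nearest = sorted(
        (math.dist(feat, query), label) for feat, label in train
    )[:k]
    votes = defaultdict(float)
    for d, label in nearest:
        votes[label] += 1.0 / (d + 1e-9)  # closer neighbours count more
    return max(votes, key=votes.get)

train = [((0.0, 0.0), "bamboo"), ((0.2, 0.1), "bamboo"),
         ((1.0, 1.0), "conifer"), ((1.2, 0.9), "conifer")]
print(weighted_knn(train, (0.1, 0.1)))
```

With plain majority voting and a larger k, distant conifer samples could swamp the two nearby bamboo samples; distance (or spatial-correlation) weighting is what protects sparsely distributed classes like bamboo.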
Procedia PDF Downloads 313
23 A Parallel Cellular Automaton Model of Tumor Growth for Multicore and GPU Programming
Authors: Manuel I. Capel, Antonio Tomeu, Alberto Salguero
Abstract:
Tumor growth from a transformed cancer cell up to a clinically apparent mass spans a range of spatial and temporal magnitudes. Through computer simulations, Cellular Automata (CA) can accurately describe the complexity of tumor development. Tumor development prognosis could be made, without patients undergoing unpleasant medical examinations or painful invasive procedures, if we develop appropriate CA-based software tools. In silico testing mainly refers to Computational Biology research with application to clinical practice in Medicine. Establishing sound computer-based models of cellular behavior certainly reduces costs and saves precious time with respect to carrying out experiments in vitro in labs or in vivo with living cells and organisms. Such models aim to produce scientifically relevant results compared to traditional in vitro testing, which is slow, expensive, and does not generally have acceptable reproducibility under the same conditions. For speeding up computer simulations of cellular models, the literature shows recent proposals based on the CA approach that include advanced techniques, such as the clever use of efficient supporting data structures when modeling with deterministic or stochastic cellular automata. Multiparadigm and multiscale simulation of tumor dynamics is just beginning to be developed by the concerned research community. The use of stochastic cellular automata (SCA), whose parallel implementations can be pushed to yield high computational performance, is of much interest to explore up to its computational limits. There have been some optimization-based approaches to advancing multiparadigm models of tumor growth, which mainly pursue improved performance by guaranteeing efficient memory accesses or by considering the dynamic evolution of the memory space (grids, trees, …) that holds crucial data in simulations.
In our opinion, the optimizations mentioned above are not decisive enough to achieve the high-performance computing power that cell-behavior simulation programs actually need. The possibility of using multicore and GPU parallelism as a promising multiplatform framework for developing new programming techniques to speed up simulations has only begun to be explored in the last few years. This paper presents a model that incorporates parallel processing and identifies the synchronization necessary to speed up tumor growth simulations implemented in Java and C++ programming environments. The speed-up delivered by specific parallel constructs, such as executors (thread pools) in Java, is studied. The new parallel tumor growth model is evaluated using implementations in Java and C++ on two different platforms: an Intel Core iX chipset and an HPC cluster at our university. The parallelization of the Poleszczuk and Enderling model (commonly used by researchers in mathematical oncology) proposed here is analyzed with respect to performance gain. We intend to apply the model and overall parallelization technique presented here to solid tumors of specific affiliation such as prostate, breast, or colon. Our final objective is to set up a multiparadigm model capable of modelling angiogenesis, the growth inhibition induced by chemotaxis, and the effect of therapies based on the presence of cytotoxic/cytostatic drugs.
Keywords: cellular automaton, tumor growth model, simulation, multicore and manycore programming, parallel programming, high performance computing, speed up
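The parallelization pattern described above (thread-pool executors with a synchronization point between generations) can be sketched compactly, here in Python rather than the Java/C++ of the paper. The grid is split into row bands, each band's next state is computed from the current grid by a pool worker, and the barrier between generations is implicit in waiting for all bands before swapping grids. The toy proliferation rule is invented, not the Poleszczuk-Enderling dynamics.

```python
# Sketch of row-band parallelism for a synchronous CA step. The "become
# occupied if any neighbor is occupied" rule is a deliberately simple,
# invented proliferation rule; the pattern (partition -> pool -> barrier ->
# swap) is what mirrors the executor-based design described in the abstract.
from concurrent.futures import ThreadPoolExecutor

def step_band(grid, rows):
    """Next state for a band of rows, read from the *current* grid only."""
    n, m = len(grid), len(grid[0])
    out = []
    for i in rows:
        row = []
        for j in range(m):
            occupied_nbrs = sum(
                grid[(i + di) % n][(j + dj) % m]
                for di in (-1, 0, 1) for dj in (-1, 0, 1)
                if (di, dj) != (0, 0)
            )
            row.append(1 if grid[i][j] or occupied_nbrs > 0 else 0)
        out.append((i, row))
    return out

def step(grid, workers=2):
    n = len(grid)
    bands = [range(b, n, workers) for b in range(workers)]
    new = [None] * n
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # pool.map returning acts as the inter-generation barrier:
        for result in pool.map(lambda rows: step_band(grid, rows), bands):
            for i, row in result:
                new[i] = row
    return new

seed = [[0, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
print(sum(map(sum, step(seed))))  # the seeded cell plus its 8 neighbors: 9
```

Because every worker reads only the old grid and writes only its own rows of the new grid, no locking is needed within a generation; the only synchronization is the barrier, which is exactly the structure thread-pool executors make cheap in Java.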
Procedia PDF Downloads 244
22 Solid State Fermentation: A Technological Alternative for Enriching Bioavailability of Underutilized Crops
Authors: Vipin Bhandari, Anupama Singh, Kopal Gupta
Abstract:
Solid state fermentation (SSF), an eminent bioconversion technique for converting many biological substrates into value-added products, has a proven role in the biotransformation of crops by nutritionally enriching them. The present study deals with the nutritional enhancement of a composite flour based on underutilized crops, viz. barnyard millet, amaranthus and horsegram, using SSF as the principal bioconversion technique to convert the composite flour substrate into a nutritionally enriched value-added product. The grains were given pre-treatments before fermentation, and these pre-treatments proved quite effective in diminishing the level of antinutrients in the grains and improving their nutritional characteristics. Response surface methodology was used to design the experiments. The variables selected for the fermentation experiments were substrate particle size, substrate blend ratio, fermentation time, fermentation temperature and moisture content, each at three levels. Seventeen designed experiments were conducted in random order to find the effect of these variables on microbial count, reducing sugar, pH, total sugar, phytic acid and water absorption index. The data from all experiments were analyzed using Design Expert 8.0.6; response functions were developed using multiple regression analysis and second-order models were fitted for each response. Results revealed that the pre-treatments were very effective in diminishing the level of antinutrients and thus appreciably enhancing the nutritional value of the grains; for instance, there was about a 23% reduction in phytic acid levels after decortication of barnyard millet. The carbohydrate content of the decorticated barnyard millet increased from an initial value of 65.2% to 81.5%.
Similarly, popping of horsegram and puffing of amaranthus greatly reduced the trypsin inhibitor activity, and puffing of amaranthus also appreciably reduced the tannin content. Bacillus subtilis was used as the inoculating species since it is known to produce phytases in solid state fermentation systems; these phytases remarkably reduce the phytic acid content, which acts as a major antinutritional factor in food grains. Results of the solid state fermentation experiments revealed that phytic acid levels reduced appreciably when fermentation was allowed to continue for 72 hours at a temperature of 35°C. Particle size and substrate blend ratio also affected the responses positively. All the parameters, viz. substrate particle size, substrate blend ratio, fermentation time, fermentation temperature and moisture content, affected the responses, namely microbial count, reducing sugar, pH, total sugar, phytic acid and water absorption index, but the effect of fermentation time was found to be the most significant for all the responses. Statistical analysis yielded the optimum conditions (particle size 355 µm, substrate blend ratio 50:20:30 of barnyard millet, amaranthus and horsegram respectively, fermentation time 68 hrs, fermentation temperature 35°C and moisture content 47%) for maximum reduction in phytic acid. The model F-value was found to be highly significant at the 1% level of significance for all the responses; hence, second-order models could be fitted to predict all the dependent parameters.
Keywords: composite flour, solid state fermentation, underutilized crops, cereals, fermentation technology, food processing
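The response-surface optimization step can be illustrated with the simplest second-order model in one variable: y = b0 + b1·t + b11·t², whose stationary point is t* = -b1/(2·b11). The coefficients below are invented placeholders (deliberately chosen to land near the abstract's 68-hour optimum for fermentation time); they are not the fitted values from Design Expert.

```python
# Hedged sketch of the second-order (quadratic) response model used in RSM,
# reduced to one variable for clarity. Coefficients are hypothetical and were
# picked only so the stationary point falls near the reported 68 h optimum.
def phytic_acid(t, b0=2.0, b1=-0.03, b11=0.00022):
    """Hypothetical quadratic response: phytic acid level vs time t (hours)."""
    return b0 + b1 * t + b11 * t * t

def stationary_point(b1=-0.03, b11=0.00022):
    """t* = -b1 / (2*b11); a minimum of the response when b11 > 0."""
    return -b1 / (2 * b11)

t_opt = stationary_point()
print(round(t_opt, 1), round(phytic_acid(t_opt), 4))
```

In the actual study the fitted models have five variables, so the stationary point is found by solving the corresponding system of partial derivatives (which Design Expert does numerically), but the one-variable case above shows the shape of the calculation.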
Procedia PDF Downloads 327
21 Sinhala Sign Language to Grammatically Correct Sentences Using NLP
Authors: Anjalika Fernando, Banuka Athuraliya
Abstract:
This paper presents a comprehensive approach for converting Sinhala Sign Language (SSL) into grammatically correct sentences using Natural Language Processing (NLP) techniques in real time. While previous studies have explored various aspects of SSL translation, the research gap lies in the absence of grammar checking for SSL. This work aims to bridge this gap by proposing a two-stage methodology that leverages deep learning models to detect signs and translate them into coherent sentences, ensuring grammatical accuracy. The first stage of the approach involves the utilization of a Long Short-Term Memory (LSTM) deep learning model to recognize and interpret SSL signs. By training the LSTM model on a dataset of SSL gestures, it learns to accurately classify and translate these signs into textual representations. The LSTM model achieves a commendable accuracy rate of 94%, demonstrating its effectiveness in recognizing and translating SSL gestures. Building upon the successful recognition and translation of SSL signs, the second stage of the methodology focuses on improving the grammatical correctness of the translated sentences. The project employs a Neural Machine Translation (NMT) architecture, consisting of an encoder and decoder with LSTM components, to enhance the syntactic structure of the generated sentences. By training the NMT model on a parallel corpus of grammatically incorrect Sinhala sentences and their corresponding grammatically correct translations, it learns to generate coherent and grammatically accurate sentences. The NMT model achieves an impressive accuracy rate of 98%, affirming its capability to produce linguistically sound translations. The proposed approach offers significant contributions to the field of SSL translation and grammar correction.
Addressing the critical issue of grammar checking, it enhances the usability and reliability of SSL translation systems, facilitating effective communication between hearing-impaired and non-sign-language users. Furthermore, the integration of deep learning techniques, such as LSTM and NMT, ensures the accuracy and robustness of the translation process. This research holds great potential for practical applications, including educational platforms, accessibility tools, and communication aids for the hearing-impaired. It also lays the foundation for future advancements in SSL translation systems, fostering inclusive and equal opportunities for the deaf community. Future work includes expanding the existing datasets to further improve the accuracy and generalization of the SSL translation system. Additionally, the development of a dedicated mobile application would enhance the accessibility and convenience of SSL translation on handheld devices. Efforts will also be made to enhance the current application for educational purposes, enabling individuals to learn and practice SSL more effectively. Another area of future exploration involves enabling two-way communication, allowing seamless interaction between sign-language users and non-sign-language users. In conclusion, this paper presents a novel approach for converting Sinhala Sign Language gestures into grammatically correct sentences using NLP techniques in real time. The two-stage methodology, comprising an LSTM model for sign detection and translation and an NMT model for grammar correction, achieves high accuracy rates of 94% and 98%, respectively. By addressing the lack of grammar checking in existing SSL translation research, this work contributes significantly to the development of more accurate and reliable SSL translation systems, thereby fostering effective communication and inclusivity for the hearing-impaired community.
Keywords: Sinhala sign language, sign language, NLP, LSTM, NMT
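The two-stage pipeline described above can be mimicked at a purely conceptual level: stage 1 maps recognized signs to a gloss sequence (the LSTM's role in the paper), and stage 2 rewrites the gloss into a grammatical sentence (the NMT model's role). The lookup tables below are invented stand-ins for the trained models, and the exact-match accuracy function shows one plausible way figures like 94%/98% could be scored; neither is taken from the paper.

```python
# Illustrative stand-in for the two-stage pipeline (not the trained models):
# stage 1: sign IDs -> gloss words; stage 2: gloss sequence -> grammatical
# sentence. Both tables are invented toy data.
STAGE1_SIGNS = {"sign_01": "I", "sign_02": "school", "sign_03": "go"}
STAGE2_GRAMMAR = {
    ("I", "school", "go"): "I go to school",  # gloss order -> fluent sentence
}

def translate(sign_sequence):
    gloss = tuple(STAGE1_SIGNS[s] for s in sign_sequence)
    # Fall back to the raw gloss when stage 2 has no rewrite -- the failure
    # mode that motivates adding grammar correction in the first place.
    return STAGE2_GRAMMAR.get(gloss, " ".join(gloss))

def accuracy(predictions, references):
    """Exact-match accuracy over a test set."""
    hits = sum(p == r for p, r in zip(predictions, references))
    return hits / len(references)

out = translate(["sign_01", "sign_02", "sign_03"])
print(out, accuracy([out], ["I go to school"]))
```

The fallback branch makes the paper's point concrete: without the second stage, the output is a raw gloss ("I school go") rather than a grammatical sentence.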
Procedia PDF Downloads 104
20 Determination of the Phytochemicals Composition and Pharmacokinetics of Whole Coffee Fruit Caffeine Extract by Liquid Chromatography-Tandem Mass Spectrometry
Authors: Boris Nemzer, Nebiyu Abshiru, Z. B. Pietrzkowski
Abstract:
Coffee cherry is one of the most ubiquitous agricultural commodities, possessing nutritional and human-health-beneficial properties. Of the two most widely used coffee species, Coffea arabica (Arabica) and Coffea canephora (Robusta), Coffea arabica remains superior due to its sensory properties and therefore remains in great demand in the global coffee market. In this study, the phytochemical contents and pharmacokinetics of Coffeeberry® Energy (CBE), a commercially available Arabica whole coffee fruit caffeine extract, are investigated. For phytochemical screening, 20 mg of CBE was dissolved in an aqueous methanol solution for analysis by mass spectrometry (MS). Quantification of the caffeine and chlorogenic acid (CGA) contents of CBE was performed using HPLC. For the bioavailability study, serum samples were collected from human subjects before ingestion and at 1, 2 and 3 h post-ingestion of 150 mg of CBE extract. Protein precipitation and extraction were carried out using methanol. Identification of compounds was performed using an untargeted metabolomic approach on a Q-Exactive Orbitrap MS coupled to reversed-phase chromatography. Data processing was performed using Thermo Scientific Compound Discoverer 3.3 software. Phytochemical screening identified a total of 170 compounds, including organic acids, phenolic acids, CGAs, diterpenoids and hydroxytryptamine. Caffeine and CGAs make up more than 70% and 9% of the total CBE composition, respectively. For serum samples, a total of 82 metabolites, representing 32 caffeine- and 50 phenolic-derived metabolites, were identified. Volcano plot analysis revealed 32 differential metabolites (24 caffeine- and 8 phenolic-derived) that showed an increase in serum level post-CBE dosing. Caffeine, uric acid, and trimethyluric acid isomers exhibited a 4- to 10-fold increase in serum abundance post-dosing. 7-Methyluric acid, 1,7-dimethyluric acid, paraxanthine and theophylline exhibited at least a 1.5-fold increase in serum level.
Among the phenolic-derived metabolites, iso-feruloyl quinic acid isomers (3-, 4- and 5-iFQA) showed the highest increase in serum level. These compounds were essentially absent in serum collected before dosing. Interestingly, the iFQA isomers were not originally present in the CBE extract, as our phytochemical screen did not identify these compounds. This suggests the potential formation of the isomers during the digestion and absorption processes. Pharmacokinetic parameters (Cmax, Tmax and AUC0-3h) of caffeine- and phenolic-derived metabolites were also investigated. Caffeine was rapidly absorbed, reaching a maximum concentration (Cmax) of 10.95 µg/ml in just 1 hour. Thereafter, the caffeine level steadily dropped from the peak, although it did not return to baseline within the 3-hour sampling period. The disappearance of caffeine from circulation was mirrored by the rise in the concentration of its methylxanthine metabolites. Similarly, serum concentrations of the iFQA isomers steadily increased, reaching maxima (Cmax: 3-iFQA, 1.54 ng/ml; 4-iFQA, 2.47 ng/ml; 5-iFQA, 2.91 ng/ml) at a Tmax of 1.5 hours. The isomers remained well above baseline during the 3-hour sampling period, allowing them to remain in circulation long enough for absorption into the body. Overall, the current study provides evidence of the potential health benefits of a uniquely formulated whole coffee fruit product. Consumption of this product resulted in a distinct serum profile of bioactive compounds, as demonstrated by the more than 32 metabolites that exhibited a significant change in systemic exposure.
Keywords: phytochemicals, mass spectrometry, pharmacokinetics, differential metabolites, chlorogenic acids
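The reported pharmacokinetic parameters can be reproduced from a serum concentration–time profile with a simple trapezoidal calculation. The sketch below uses the study's sampling times but illustrative concentration values; only the caffeine Cmax of 10.95 µg/ml is taken from the abstract.

```python
import numpy as np

# Hypothetical serum caffeine profile (µg/ml) at the sampling times used
# in the study: pre-dose and 1, 2, 3 h post-ingestion. Values other than
# the peak are illustrative, not measured data.
times = np.array([0.0, 1.0, 2.0, 3.0])   # hours
conc = np.array([0.2, 10.95, 7.8, 4.9])  # µg/ml

cmax = conc.max()                         # peak concentration
tmax = times[conc.argmax()]               # time of the peak
# Trapezoidal AUC over 0–3 h, written out explicitly.
auc_0_3h = np.sum((conc[1:] + conc[:-1]) / 2 * np.diff(times))

print(cmax, tmax, round(float(auc_0_3h), 2))
```

The same three lines generalize to the phenolic-derived metabolites by swapping in their concentration vectors.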
19 Microencapsulation of Probiotic and Evaluation for Viability, Antimicrobial Property and Cytotoxic Activities of its Postbiotic Metabolites on MCF-7 Breast Cancer Cell Line
Authors: Nkechi V. Enwuru, Bullum Nkeki, Elizabeth A. Adekoya, Olumide A. Adebesin, Rebecca F. Peters, Victoria A. Aikhomu, Mendie E. U.
Abstract:
Background: Probiotics are live microbial feed supplements that benefit the host. Probiotics and their postbiotic products have been used to prevent or treat various health conditions. However, product cell viability is often low due to the harsh conditions encountered during processing, handling, storage, and gastrointestinal transit. These strongly influence probiotics’ benefits; thus, viability is essential for probiotics to produce health benefits for the host. Microencapsulation is a promising technique with considerable effects on probiotic survival. This study aimed to formulate a microencapsulated probiotic and evaluate its viability, antimicrobial efficacy, and the cytotoxic activity of its postbiotic on the MCF-7 breast cancer cell line. Method: Human and animal raw milk were sampled for lactic acid bacteria. The isolated bacteria were identified using conventional and VITEK 2 systems. The identified lactic acid bacterium was encapsulated using spray-drying and extrusion methods. The free, encapsulated, and chitosan-coated encapsulated probiotics were tested for viability in simulated gastrointestinal (SGI) fluid and under different storage conditions at refrigerated (4°C) and room (25°C) temperatures. The disintegration time and weight uniformity of the spray-dried hard gelatin capsules were tested. The antimicrobial property of free and encapsulated probiotics was tested against enteric pathogens isolated from antiretroviral therapy (ART)-treated HIV-positive patients. The postbiotic of the free cells was extracted, and its cytotoxic effect on the MCF-7 breast cancer cell line was tested through an MTT assay. Result: Lactobacillus plantarum was isolated from animal raw milk. Size-zero hard gelatin L. plantarum capsules with granules within a size range of 0.71–1.00 mm diameter were formulated. The disintegration time ranged from 2.14±0.045 to 2.91±0.293 minutes, while the average weight was 502.1 mg. 
The simulated gastric solution significantly affected the viability of both free and microencapsulated cells. However, the encapsulated cells were more protected and viable due to the impermeability of the microcapsules. Furthermore, the viability of free cells stored at 4°C and 25°C was less than 4 log CFU/g and 6 log CFU/g, respectively, after 12 weeks. The microcapsules stored at 4°C achieved the highest viability compared with the microcapsules stored at 25°C and the free cells stored at either temperature. Encapsulated cells were released in the simulated gastric fluid, remained viable, and were effective against the enteric pathogens tested. In particular, chitosan-coated calcium alginate encapsulated probiotics significantly inhibited Shigella flexneri, Candida albicans, and Escherichia coli. The postbiotic metabolites (PM) of L. plantarum produced a cytotoxic effect on the MCF-7 breast cancer cell line. The postbiotic showed significant cytotoxic activity similar to 5-FU, a standard antineoplastic agent. The concentration inhibiting 50% of growth (IC50) of postbiotic metabolite K3 was low and consistent with the IC50 of the positive control (cisplatin). Conclusions: Lactobacillus plantarum postbiotic exhibited a cytotoxic effect on the MCF-7 breast cancer cell line and could be used as a combined adjuvant therapy in breast cancer management. The microencapsulation technique protects the probiotics, improving their viability and delivery to the gastrointestinal tract. Chitosan enhances antibacterial efficacy; thus, chitosan-coated microencapsulated L. plantarum probiotics could be more effective and used as a combined therapy for opportunistic enteric infections in HIV management.
Keywords: probiotics, encapsulation, gastrointestinal conditions, antimicrobial effect, postbiotic, cytotoxicity effect
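Viability comparisons of this kind are conventionally expressed in log10 CFU/g, as in the thresholds quoted above. A minimal sketch of that arithmetic follows; all counts are hypothetical, chosen only to sit on the right side of the reported thresholds.

```python
import math

# Hypothetical viable counts (CFU/g): initial load and after 12 weeks of
# storage, for free vs. microencapsulated cells. Not study data.
initial = 1e9        # 9 log CFU/g at formulation
free_4c = 5e3        # free cells, 4 °C, week 12 (below 4 log CFU/g)
encap_4c = 2e8       # chitosan-coated microcapsules, 4 °C, week 12

def log_cfu(count):
    """Express a viable count as log10 CFU/g."""
    return math.log10(count)

def log_reduction(start, end):
    """Loss of viability in log10 cycles over the storage period."""
    return math.log10(start) - math.log10(end)

print(round(log_cfu(free_4c), 2))                    # free cells
print(round(log_reduction(initial, encap_4c), 2))    # encapsulated loss
```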
18 ARGO: An Open Designed Unmanned Surface Vehicle Mapping Autonomous Platform
Authors: Papakonstantinou Apostolos, Argyrios Moustakas, Panagiotis Zervos, Dimitrios Stefanakis, Manolis Tsapakis, Nektarios Spyridakis, Mary Paspaliari, Christos Kontos, Antonis Legakis, Sarantis Houzouris, Konstantinos Topouzelis
Abstract:
For years, unmanned and remotely operated robots have been used as tools in industry, research, and education. The rapid development and miniaturization of sensors that can be attached to remotely operated vehicles has allowed industry leaders and researchers to utilize them as an affordable means of data acquisition in air, on land, and at sea. Despite recent developments in ground and unmanned airborne vehicles, only a small number of Unmanned Surface Vehicle (USV) platforms target the mapping and monitoring of environmental parameters for research and industry purposes. The ARGO project developed an open-design USV equipped with a multi-level control hardware architecture and state-of-the-art sensors and payloads for the autonomous monitoring of environmental parameters in large sea areas. The proposed USV is a catamaran-type vessel controlled over a wireless radio link (5G) from a ground-based control station, giving it long-range mapping capabilities. The ARGO USV has propulsion control using two fully redundant electric trolling motors with active vector thrust for omnidirectional movement, navigation with an open-source autopilot system and a high-accuracy GNSS device, and communication over a 2.4 GHz digital link able to provide a 20 km Line of Sight (LoS) range. The 3-meter dual-hull design and composite structure offer well above 80 kg of usable payload capacity. Furthermore, solar and friction energy harvesting methods provide clean energy to the propulsion system. The design is highly modular: each component or payload can be replaced or modified according to the desired task (industrial or research). The system can be equipped with a multiparameter sonde measuring up to 20 water parameters simultaneously, such as conductivity, salinity, turbidity, and dissolved oxygen. Furthermore, a high-end multibeam echo sounder can be installed at a specific boat datum for shallow-water high-resolution seabed mapping. 
The system is designed to operate in the Aegean Sea. The developed USV is planned to be utilized as a system for autonomous data acquisition, mapping, and monitoring of bathymetry and various environmental parameters. The ARGO USV can operate in small or large ports, with high maneuverability and the endurance to map large geographical extents at sea. The system presents state-of-the-art solutions in the following areas: i) on-board/real-time data processing and analysis capabilities; ii) an energy-independent and environmentally friendly platform made entirely with the latest aeronautical and marine materials; iii) the integration of advanced technology sensors in one system (photogrammetric and radiometric footprint, as well as its connection with various environmental and inertial sensors); and iv) the information management application. The ARGO web-based application enables the system to depict the results of the data acquisition process in near real-time. All the recorded environmental variables and indices are presented, allowing users to remotely access all the raw and processed information using the implemented web-based GIS application.
Keywords: marine environment monitoring, unmanned surface vehicle, bathymetry mapping, sea environmental monitoring
17 Applying Concept Mapping to Explore Temperature Abuse Factors in the Processes of Cold Chain Logistics Centers
Authors: Marco F. Benaglia, Mei H. Chen, Kune M. Tsai, Chia H. Hung
Abstract:
As societal and family structures, consumer dietary habits, and awareness about food safety and quality continue to evolve in most developed countries, the demand for refrigerated and frozen foods has been growing, and the issues related to their preservation have gained increasing attention. A well-established cold chain logistics system is essential to avoid any temperature abuse; therefore, assessing potential disruptions in the operational processes of cold chain logistics centers becomes pivotal. This study preliminarily employs HACCP to find disruption factors in cold chain logistics centers that may cause temperature abuse. Then, concept mapping is applied: selected experts engage in brainstorming sessions to identify any further factors. The panel consists of ten experts, including four from logistics and home delivery, two from retail distribution, one from the food industry, two from low-temperature logistics centers, and one from the freight industry. Disruptions include equipment-related aspects, human factors, management aspects, and process-related considerations. The areas of observation encompass freezer rooms, refrigerated storage areas, loading docks, sorting areas, and vehicle parking zones. The experts also categorize the disruption factors based on perceived similarities and build a similarity matrix. Each factor is evaluated for its impact, frequency, and investment importance. Next, multidimensional scaling, cluster analysis, and other methods are used to analyze these factors. Simultaneously, key disruption factors are identified based on their impact and frequency; subsequently, the factors that companies prioritize and are willing to invest in are determined by assessing investors’ risk aversion behavior. Finally, Cumulative Prospect Theory (CPT) is applied to verify the risk patterns. 
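The grouping of the expert similarity matrix into clusters can be sketched in a few lines. The factor names, similarity values, and the single-linkage cutoff below are all illustrative stand-ins for the study's 66-factor matrix and its full multivariate analysis.

```python
# Toy expert similarity matrix for six hypothetical disruption factors
# (1 = always sorted into the same pile, 0 = never). Grouping is done by
# single-linkage clustering at a similarity cutoff, a simplification of
# the multidimensional scaling + cluster analysis used in the study.
factors = ["temp setting error", "sensor drift", "door left open",
           "poor packaging", "peak-period overload", "missed SOP step"]
S = [
    [1.0, 0.9, 0.8, 0.1, 0.2, 0.1],
    [0.9, 1.0, 0.7, 0.2, 0.1, 0.1],
    [0.8, 0.7, 1.0, 0.1, 0.1, 0.2],
    [0.1, 0.2, 0.1, 1.0, 0.8, 0.9],
    [0.2, 0.1, 0.1, 0.8, 1.0, 0.7],
    [0.1, 0.1, 0.2, 0.9, 0.7, 1.0],
]

def clusters(sim, cutoff=0.5):
    """Group items whose pairwise similarity chains exceed `cutoff`."""
    n = len(sim)
    label = list(range(n))               # start: every item its own cluster

    def find(i):
        while label[i] != i:
            label[i] = label[label[i]]   # path compression
            i = label[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if sim[i][j] >= cutoff:
                label[find(i)] = find(j)  # merge the two clusters
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(factors[i])
    return list(groups.values())

print(clusters(S))
```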
Sixty-six disruption factors are found and categorized into six clusters: (1) "Inappropriate Use and Maintenance of Hardware and Software Facilities", (2) "Inadequate Management and Operational Negligence", (3) "Product Characteristics Affecting Quality and Inappropriate Packaging", (4) "Poor Control of Operation Timing and Missing Distribution Processing", (5) "Inadequate Planning for Peak Periods and Poor Process Planning", and (6) "Insufficient Cold Chain Awareness and Inadequate Training of Personnel". This study also identifies five critical factors in the operational processes of cold chain logistics centers: "Lack of Personnel’s Awareness Regarding Cold Chain Quality", "Personnel Not Following Standard Operating Procedures", "Personnel’s Operational Negligence", "Management’s Inadequacy", and "Lack of Personnel’s Knowledge About Cold Chain". The findings show that cold chain operators prioritize prevention and improvement efforts in the "Inappropriate Use and Maintenance of Hardware and Software Facilities" cluster, particularly focusing on the factors "Temperature Setting Errors" and "Management’s Inadequacy". However, through the application of CPT, this study reveals that companies are not usually willing to invest in the improvement of factors related to this cluster because of its low likelihood of occurrence, even though they acknowledge the severity of the consequences if it does occur. Hence, the main implication is that the key disruption factors in cold chain logistics centers’ processes are associated with personnel issues; therefore, comprehensive training, periodic audits, and the establishment of reasonable incentives and penalties for both new employees and managers may significantly reduce disruption issues.
Keywords: concept mapping, cold chain, HACCP, cumulative prospect theory
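The Cumulative Prospect Theory model used to verify the risk patterns can be sketched with Tversky and Kahneman's (1992) value and probability-weighting functions. The parameter values are their original published estimates; the failure probability and loss magnitude below are illustrative, not taken from the study.

```python
# CPT value and probability-weighting functions (Tversky & Kahneman,
# 1992). Parameters: diminishing sensitivity (alpha, beta), loss
# aversion (lambda), and weighting curvature (gamma, gains form).
ALPHA, BETA, LAMBDA = 0.88, 0.88, 2.25
GAMMA = 0.61

def value(x):
    """CPT value of an outcome x relative to the reference point."""
    return x ** ALPHA if x >= 0 else -LAMBDA * ((-x) ** BETA)

def weight(p, gamma=GAMMA):
    """Inverse-S probability weighting: small p is overweighted."""
    return p ** gamma / ((p ** gamma + (1 - p) ** gamma) ** (1 / gamma))

# A rare but severe temperature-abuse event (illustrative numbers).
p_fail, loss = 0.01, -100.0
cpt_utility = weight(p_fail) * value(loss)
print(round(weight(p_fail), 3))   # ≈ 0.055 > 0.01: rare events loom large
```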
16 The Pro-Reparative Effect of Vasoactive Intestinal Peptide in Chronic Inflammatory Osteolytic Periapical Lesions
Authors: Michelle C. S. Azevedo, Priscila M. Colavite, Carolina F. Francisconi, Ana P. Trombone, Gustavo P. Garlet
Abstract:
VIP (vasoactive intestinal peptide) is known as a potential protective factor in view of its marked immunosuppressive properties. In this work, we investigated a possible association of VIP with the clinical status of experimental periapical granulomas, as well as its association with the expression of markers potentially involved in periapical lesion pathogenesis. C57BL/6WT mice were treated or not with recombinant VIP. Animals with active/progressive (N=40) or inactive/stable (N=70) periapical granulomas and controls (N=50) were anesthetized, and the right mandibular first molar was surgically opened, exposing the dental pulp. Endodontic pathogenic bacterial strains were inoculated: Porphyromonas gingivalis, Prevotella nigrescens, Actinomyces viscosus, and Fusobacterium nucleatum subsp. polymorphum. The cavity was not sealed after bacterial inoculation. During lesion development, animals were treated or not with recombinant VIP starting 3 days post-infection. Animals were killed after 3, 7, 14, and 21 days of infection, and the jaws were dissected. Total RNA was extracted from periodontal tissues, and the integrity of the samples was checked. qPCR reactions using TaqMan chemistry with inventoried primers were performed on ViiA7 equipment. The results, depicted as relative levels of gene expression, were calculated in reference to GAPDH and β-actin expression. Periodontal tissues from upper molars were harvested and incubated in supplemented RPMI, followed by processing with 0.05% DNase. Cell viability and counting were determined by Neubauer chamber analysis. For flow cytometry analysis, after counting, the cells were stained with the optimal dilution of each antibody: (PE)-conjugated and (FITC)-conjugated antibodies against CD4, CD25, FOXP3, IL-4, IL-17 and IFN-γ, as well as their respective isotype controls. Cells were analyzed with FACScan and CellQuest software. 
Results are presented as the number of cells in the periodontal tissues or the number of positive cells for each marker in the CD4+FOXp3+, CD4+IL-4+, CD4+IFNg+ and CD4+IL-17+ subpopulations. mRNA levels were measured by qPCR. VIP expression predominated in inactive lesions, as did part of the clusters of cytokine/Th markers identified as protective factors, and a negative correlation between VIP expression and lesion evolution was observed. A quantitative analysis of IL1β, IL17, TNF, IFN, MMP2, RANKL, OPG, IL10, TGFβ, CTLA4, COL5A1, CTGF, CXCL11, FGF7, ITGA4, ITGA5, SERP1 and VTN expression was performed in experimental periapical lesions treated with VIP at 7 and 14 days after lesion induction and in healthy animals. After 7 days, all targets presented a significant increase in comparison to untreated animals. Migration kinetics, the profile of chemokine receptor expression of T CD4+ subsets, and the phenotypes of Treg, Th1, Th2 and Th17 cells during the course of experimental periodontal disease were evaluated by flow cytometry and depicted as the number of positive cells for each marker. CD4+IFNg+ and CD4+FOXp3+ cell migration was significantly increased 7 days post VIP treatment. CD4+IL17+ cell migration was significantly increased 7 and 14 days post VIP treatment, and CD4+IL4+ cell migration was significantly increased 14 and 21 days post VIP treatment compared to the control group. In conclusion, our experimental data support VIP involvement in determining the inactivity of periapical lesions. Financial support: FAPESP #2015/25618-2.
Keywords: chronic inflammation, cytokines, osteolytic lesions, VIP (Vasoactive Intestinal Peptide)
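Relative expression referenced to GAPDH and β-actin, as described above, is conventionally computed with the 2^-ΔΔCt method. The sketch below uses hypothetical Ct values, since the abstract does not report raw Ct data.

```python
# 2^-ΔΔCt relative-expression sketch for the qPCR readout described
# above: a target gene is normalized to the mean of the GAPDH and
# β-actin reference genes, then compared to an untreated control.
def relative_expression(ct_target, ct_gapdh, ct_actin,
                        ct_target_ctrl, ct_gapdh_ctrl, ct_actin_ctrl):
    """Fold change of a target gene vs. the untreated control."""
    d_ct = ct_target - (ct_gapdh + ct_actin) / 2
    d_ct_ctrl = ct_target_ctrl - (ct_gapdh_ctrl + ct_actin_ctrl) / 2
    return 2 ** -(d_ct - d_ct_ctrl)

# Hypothetical Ct values: the VIP-treated sample amplifies the target
# two cycles earlier than the control, i.e. a 4-fold upregulation.
fold = relative_expression(24.0, 18.0, 18.4, 26.0, 18.0, 18.4)
print(fold)
```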
15 Carbon Nanotube-Based Catalyst Modification to Improve Proton Exchange Membrane Fuel Cell Interlayer Interactions
Authors: Ling Ai, Ziyu Zhao, Zeyu Zhou, Xiaochen Yang, Heng Zhai, Stuart Holmes
Abstract:
Optimizing the catalyst layer structure is crucial for enhancing the performance of proton exchange membrane fuel cells (PEMFCs) with low platinum (Pt) loading. Current work has focused on the utilization, durability, and site activity of Pt particles on supports, and performance enhancement has been achieved by loading Pt onto porous supports with different morphologies, such as graphene, carbon fiber, and carbon black. Some schemes have also incorporated cost considerations to achieve lower Pt loading. However, the design of the catalyst layer (CL) structure in the membrane electrode assembly (MEA) must consider the interactions between the layers. Addressing the crucial aspects of water management, low contact resistance, and the establishment of an effective three-phase boundary for the MEA, multi-walled carbon nanotubes (MWCNTs) are a promising CL support due to their intrinsically high hydrophobicity, high axial electrical conductivity, and potential for ordered alignment. However, the drawbacks of MWCNTs, such as strong agglomeration, chemical inertness of the wall surface, and unopened ends, are unfavorable for Pt nanoparticle loading, which is detrimental to MEA processing and leads to inhomogeneous CL surfaces. This further deteriorates the utilization of Pt and increases the contact resistance. Robust chemical oxidation or nitrogen doping can introduce polar functional groups onto the surface of MWCNTs, facilitating the creation of open tube ends and inducing defects in the tube walls. This improves dispersibility and loading capacity but reduces length and conductivity. Consequently, a trade-off exists between maintaining the intrinsic properties of MWCNTs and their degree of functionalization. In this work, MWCNTs were modified based on the operational requirements of the MEA from the viewpoint of interlayer interactions, including the search for the optimal degree of oxidation, N-doping, and micro-arrangement. 
MWCNTs were functionalized by oxidation, N-doping, and micro-alignment to achieve lower contact resistance between the CL and the proton exchange membrane (PEM), better hydrophobicity, and enhanced performance. Furthermore, this work expects to construct a more continuously distributed three-phase boundary by aligning MWCNTs to form a locally ordered structure, which is essential for the efficient utilization of Pt active sites. Unlike other chemical oxidation schemes that use an HNO3:H2SO4 (1:3) acid mixture to strongly oxidize MWCNTs, this scheme adopts pure HNO3 to partially oxidize MWCNTs at a lower reflux temperature (80 °C) and shorter treatment times (0 to 10 h) to preserve the morphology and intrinsic conductivity of the MWCNTs. A maximum power density of 979.81 mW cm-2 was achieved by loading Pt on MWCNTs oxidized for 6 h (Pt-MWCNT6h). This represents a 59.53% improvement over the 614.17 mW cm-2 of the commercial Pt/C catalyst. In addition, owing to its stronger electrical conductivity, the charge transfer resistance of Pt-MWCNT6h in the electrochemical impedance spectroscopy (EIS) test was 0.09 Ohm cm-2, 48.86% lower than that of Pt/C. This study will discuss the developed catalysts and their efficacy in a working fuel cell system. This research will validate the impact of low-functionalization modification of MWCNTs on the performance of PEMFCs, which simplifies the preparation of the CL and contributes to the widespread commercial application of PEMFCs at a larger scale.
Keywords: carbon nanotubes, electrocatalyst, membrane electrode assembly, proton exchange membrane fuel cell
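The headline comparisons against the Pt/C baseline are simple relative changes, which can be checked directly. Note that the baseline charge-transfer resistance is derived here from the reported 48.86% drop; it is not stated in the abstract.

```python
# Quick check of the reported performance comparisons (numbers from the
# abstract): peak power density and charge-transfer resistance of the
# Pt-MWCNT6h catalyst vs. the commercial Pt/C baseline.
def pct_change(new, old):
    """Relative change of `new` with respect to `old`, in percent."""
    return (new - old) / old * 100.0

# Peak power density, mW cm^-2.
gain = pct_change(979.81, 614.17)
print(round(gain, 2))             # matches the stated 59.53 % improvement

# Charge-transfer resistance: a 48.86 % drop from the Pt/C baseline
# implies a baseline of ~0.176 (derived, not stated in the abstract).
baseline_rct = 0.09 / (1 - 0.4886)
print(round(baseline_rct, 3))
```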
14 EcoTeka, an Open-Source Software for Urban Ecosystem Restoration through Technology
Authors: Manon Frédout, Laëtitia Bucari, Mathias Aloui, Gaëtan Duhamel, Olivier Rovellotti, Javier Blanco
Abstract:
Ecosystems must be resilient to ensure cleaner air, better water and soil quality, and thus healthier citizens. Technology can be an excellent tool to support urban ecosystem restoration projects, especially when based on open source software and promoting open data. This is the goal of the ecoTeka application: a single digital tool for tree management that allows decision-makers to improve their urban forestry practices, enabling more responsible urban planning and climate change adaptation. EcoTeka provides city councils with three main functionalities tackling three of their challenges: easier biodiversity inventories, better green space management, and more efficient planning. To answer the cities’ need for reliable tree inventories, the application was first built with open data coming from OpenStreetMap and OpenTrees, but it will also soon include the possibility of creating new data. To achieve this, a multi-source algorithm will be elaborated, based on the existing DeepForest artificial intelligence model, integrating open-source satellite images, 3D representations from LiDAR, and street views from Mapillary. This data processing will permit identifying each individual tree's position, height, crown diameter, and taxonomic genus. To support urban forestry management, ecoTeka offers a dashboard for monitoring the city’s tree inventory and triggers alerts about upcoming due interventions. This tool was co-constructed with the green space departments of the French cities of Alès, Marseille, and Rouen. The third functionality of the application is a decision-making tool for urban planning, promoting biodiversity and landscape connectivity metrics to drive ecosystem restoration roadmaps. Based on landscape graph theory, we are currently experimenting with new methodological approaches to scale down regional ecological connectivity principles to local biodiversity conservation and urban planning policies. 
This methodological framework will couple a graph-theoretic approach with biological data, mainly biodiversity occurrence (presence/absence) data available on international (e.g., GBIF), national (e.g., Système d’Information Nature et Paysage) and local (e.g., Atlas de la Biodiversité Communale) biodiversity data sharing platforms, in order to help inform new decisions for ecological network conservation and restoration in urban areas. An experiment on this subject is currently ongoing with Montpellier Méditerranée Métropole. These projects and studies have shown that only 26% of tree inventory data is currently geo-localized in France; the rest is still kept on paper or in Excel sheets. It seems that technology is not yet used enough to enrich the knowledge city councils have about biodiversity in their city, and that existing biodiversity open data (e.g., occurrence, telemetry, or genetic data), species distribution models, and landscape graph connectivity metrics are still underexploited for making rational decisions in landscape and urban planning projects. This is the goal of ecoTeka: to support easier inventories of urban biodiversity and better management of urban spaces through rational planning and decisions relying on open databases. Future studies and projects will focus on the development of tools for reducing soil artificialization, selecting plant species adapted to climate change, and highlighting the need for ecosystem and biodiversity services in cities.
Keywords: digital software, ecological design of urban landscapes, sustainable urban development, urban ecological corridor, urban forestry, urban planning
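Landscape graph connectivity metrics of the kind mentioned above can be computed on a patch graph. The sketch below implements the Integral Index of Connectivity (IIC); the patch names, areas, and links are illustrative, not ecoTeka data.

```python
from collections import deque

# Toy habitat graph: patches (area in ha) and links between patches
# within dispersal range. All values are illustrative.
areas = {"park": 12.0, "cemetery": 4.0, "riverbank": 6.0, "garden": 1.5}
links = {
    "park": ["riverbank"],
    "riverbank": ["park", "cemetery"],
    "cemetery": ["riverbank"],
    "garden": [],                       # isolated patch
}

def hops(start):
    """BFS shortest-path length (in links) from `start` to every patch."""
    dist = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for nxt in links[node]:
            if nxt not in dist:
                dist[nxt] = dist[node] + 1
                queue.append(nxt)
    return dist

def iic(landscape_area):
    """Integral Index of Connectivity (Pascual-Hornal & Saura, 2006)."""
    total = 0.0
    for i in areas:
        reachable = hops(i)
        for j, nl in reachable.items():  # unreachable pairs contribute 0
            total += areas[i] * areas[j] / (1 + nl)
    return total / landscape_area ** 2

print(iic(100.0))
```

Adding a link to the isolated "garden" patch and recomputing shows how a proposed ecological corridor raises the index, which is the decision-support pattern described above.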
13 Mobi-DiQ: A Pervasive Sensing System for Delirium Risk Assessment in Intensive Care Unit
Authors: Subhash Nerella, Ziyuan Guan, Azra Bihorac, Parisa Rashidi
Abstract:
Intensive care units (ICUs) provide care to critically ill patients in severe and life-threatening conditions. However, patient monitoring in the ICU is limited by the time and resource constraints imposed on healthcare providers. Many critical care indices, such as mobility, are still manually assessed, which can be subjective, prone to human error, and lacking in granularity. Other important aspects, such as environmental factors, are not monitored at all. For example, critically ill patients often experience circadian disruptions due to the absence of effective environmental “timekeepers” such as the light/dark cycle and the systemic effect of acute illness on chronobiologic markers. Although the occurrence of delirium is associated with circadian disruption risk factors, these factors are not routinely monitored in the ICU. Hence, there is a critical unmet need to develop systems for precise and real-time assessment through novel enabling technologies. We have developed the mobility and circadian disruption quantification system (Mobi-DiQ) by augmenting biomarker and clinical data with pervasive sensing data to generate cues related to mobility, nightly disruptions, and light and noise exposure. We hypothesize that Mobi-DiQ can provide accurate mobility and circadian cues that correlate with bedside clinical mobility assessments and circadian biomarkers, ultimately important for delirium risk assessment and prevention. The collected multimodal dataset consists of depth images, electromyography (EMG) data, patient extremity movement captured by accelerometers, ambient light levels, Sound Pressure Level (SPL), and indoor air quality measured by volatile organic compounds and the equivalent CO₂ concentration. 
For delirium risk assessment, the system recognizes mobility cues (axial body movement features and body key points) and circadian cues, including nightly disruptions, ambient SPL, and light intensity, as well as other environmental factors such as indoor air quality. The Mobi-DiQ system consists of three major components: the pervasive sensing system, a data storage and analysis server, and a data annotation system. For data collection, six local pervasive sensing systems were deployed, each including a local computer and sensors. A video recording tool with a graphical user interface (GUI), developed in Python, was used to capture depth image frames for analyzing patient mobility. All sensor data is encrypted and then automatically uploaded to the Mobi-DiQ server through a secured VPN connection. Several data pipelines were developed to automate the data transfer, curation, and data preparation for annotation and model training. The data curation and post-processing are performed on the server. A custom secure annotation tool with a GUI was developed to annotate depth activity data. The annotation tool is linked to a MongoDB database to record the data annotation and to provide summarization. Docker containers are also utilized to manage services and pipelines running on the server in an isolated manner. The processed clinical data and annotations are used to train and develop real-time pervasive sensing systems to augment clinical decision-making and promote targeted interventions. In the future, we intend to evaluate our system in a clinical implementation trial, as well as to refine and validate it using other data sources, including neurological data obtained through continuous electroencephalography (EEG).
Keywords: deep learning, delirium, healthcare, pervasive sensing
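The ambient Sound Pressure Level channel reduces to a standard decibel conversion from an RMS pressure reading; the sketch below shows that conversion with illustrative pressures, not Mobi-DiQ measurements.

```python
import math

# Sound Pressure Level (SPL) in dB from an RMS pressure reading. The
# 20 µPa reference is the standard threshold of hearing in air.
P_REF = 20e-6  # Pa

def spl_db(p_rms):
    """SPL in decibels for an RMS sound pressure in pascals."""
    return 20.0 * math.log10(p_rms / P_REF)

# A quiet room at night vs. an alarm event (illustrative pressures).
print(round(spl_db(0.002), 1))   # 0.002 Pa -> 40.0 dB
print(round(spl_db(0.2), 1))     # 0.2 Pa   -> 80.0 dB
```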
12 A Case Report on the Course and Outcome of a Patient Diagnosed with Trichotillomania and Major Depressive Disorder
Authors: Ziara Carmelli G. Tan, Irene Carmelle S. Tan
Abstract:
Background: Trichotillomania (TTM) and Major Depressive Disorder (MDD) are two psychiatric conditions that frequently co-occur, presenting a significant challenge for treatment due to their complex interplay. TTM involves repetitive hair-pulling, leading to noticeable hair loss and distress, while MDD is characterized by persistent low mood and loss of interest or pleasure, leading to dysfunction. This case report examines the intricate relationship between TTM and MDD in a young adult female, emphasizing the need for a comprehensive, multifaceted therapeutic approach to address both disorders effectively. Case Presentation: The patient is a 21-year-old female college student and youth church leader who presented with chronic hair-pulling and depressive symptoms. Her premorbid personality was marked by low self-esteem and a strong need for external validation. Despite her academic and social responsibilities and achievements, she struggled to manage her emotional distress, which was exacerbated by her family dynamics and her role within her church community. Her hair-pulling and mood symptoms were particularly triggered by self-esteem threats and feelings of inadequacy. She was diagnosed with Trichotillomania (Scalp) and Major Depressive Disorder. Intervention/Management: The patient’s treatment plan was comprehensive, incorporating both pharmacological and non-pharmacological interventions. Initial pharmacologic management was Fluoxetine 20 mg/day, titrated to 40 mg/day with no improvement; she was therefore shifted to Escitalopram 20 mg/day and started on N-acetylcysteine 600 mg/day, with significant improvement in symptoms. Psychotherapeutic strategies played a crucial role in her treatment. These included supportive-expressive psychodynamic psychotherapy, which helped her explore and understand underlying emotional conflicts. Cognitive-behavioral techniques were employed to modify her maladaptive thoughts and behaviors. 
Grief processing was integrated to help her cope with significant losses. Family therapy was conducted to address conflicts and engage the family in the treatment process. Psychoeducation was provided to enhance her understanding of her condition and to empower her in her treatment journey. A suicide safety plan was developed to ensure her safety during critical periods. An interprofessional approach, which involved coordination with the Dermatology service for co-management, was also a key component of her treatment. Outcome: Over the course of 15 therapy sessions, the patient demonstrated significant improvement in both her depressive symptoms and hair-pulling behavior. Her active engagement in therapy, combined with pharmacological support, facilitated better emotional regulation and a more cohesive sense of self. Her adherence to the treatment plan, along with the collaborative efforts of the interprofessional team, contributed to her positive outcomes. Discussion: This case underscores the importance of addressing both TTM and its comorbid conditions to achieve effective treatment outcomes. The intricate interplay between TTM and MDD in this patient’s case highlights the importance of a comprehensive treatment plan that includes both pharmacological and psychotherapeutic approaches. Supportive-expressive psychodynamic psychotherapy, cognitive-behavioral techniques, and family therapy were particularly beneficial in addressing the complex emotional and behavioral aspects of her condition. The involvement of an interprofessional team, including dermatology co-management, was crucial in providing holistic care. Future practice should consider the benefits of such a multidisciplinary approach to managing complex cases like this, ensuring that both the psychological and physiological aspects of the disorders are adequately addressed.
Keywords: cognitive-behavioral therapy, interprofessional approach, major depressive disorder, psychodynamic psychotherapy, trichotillomania
Procedia PDF Downloads 30
11 The Use of Rule-Based Cellular Automata to Track and Forecast the Dispersal of Classical Biocontrol Agents at Scale, with an Application to the Fopius arisanus Fruit Fly Parasitoid
Authors: Agboka Komi Mensah, John Odindi, Elfatih M. Abdel-Rahman, Onisimo Mutanga, Henri Ez Tonnang
Abstract:
Ecosystems are networks of organisms and populations that form a community of various species interacting within their habitats. Such habitats are defined by abiotic and biotic conditions that establish the initial limits to a population's growth, development, and reproduction. The habitat's conditions set the context in which species interact to access resources such as food, water, space, shelter, and mates, allowing for feeding, dispersal, and reproduction. Dispersal is an essential life-history strategy that affects gene flow, resource competition, population dynamics, and species distributions. Despite its importance for population dynamics and survival, the mechanisms underpinning the dispersal of organisms remain challenging to understand. For instance, when an organism moves into an ecosystem for survival and resource competition, its progression is strongly influenced by factors such as its physiological state, climatic variables and its ability to evade predation. Greater spatial detail is therefore necessary to understand organism dispersal dynamics. Organism dispersal can be studied using empirical and mechanistic modelling approaches, with the chosen approach depending on the study's purpose. Cellular automata (CA) are one such approach and have been successfully used in biological studies to analyze the dispersal of living organisms. A cellular automaton can be briefly described as a grid of cells, each of which may be occupied by an individual, whose states evolve according to a set of rules defined over neighbouring cells. However, for modelling the dispersal of individual organisms at the landscape scale, we lack user-friendly tools that do not require expertise in mathematical modelling and computing, such as a visual analytics framework for tracking and forecasting the dispersal behaviour of organisms.
The term "visual analytics" (VA) describes a semiautomated approach to electronic data processing that is guided by users who can interact with the data via an interface. Essentially, VA converts large amounts of quantitative or qualitative data into graphical formats that can be customized to the operator's needs. This approach can also enhance the ability of users from various backgrounds to understand data, communicate results, and disseminate information across a wide range of disciplines. To support effective analysis of organism dispersal at the landscape scale, we therefore designed Pydisp, a free visual analytics tool for spatiotemporal dispersal modelling built in Python. Its user interface allows users to perform quick, interactive spatiotemporal analyses of species dispersal using bioecological and climatic data. Pydisp enables reuse and upgrading through the use of simple principles such as fuzzy cellular automata algorithms. The potential of the dispersal modelling is demonstrated in a case study predicting the dispersal of Fopius arisanus (Sonan), an endoparasitoid used to control Bactrocera dorsalis (Hendel) (Diptera: Tephritidae), in Kenya. The results obtained from our example clearly illustrate the parasitoid's dispersal process at the landscape level and confirm that dynamic processes in an agroecosystem are better understood when designed using mechanistic modelling approaches. Furthermore, as demonstrated in the example, the software is highly effective in portraying the dispersal of organisms despite the unavailability of detailed data on the species' dispersal mechanisms.
Keywords: cellular automata, fuzzy logic, landscape, spatiotemporal
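The fuzzy cellular automata principle behind such dispersal models can be illustrated with a minimal sketch; the grid layout, spread fraction, suitability weighting and absorbing boundary below are illustrative assumptions, not Pydisp's published rules:

```python
# Minimal fuzzy cellular automaton for organism dispersal.
# Each cell holds a fuzzy occupancy degree in [0, 1]. At every step a
# fixed fraction of each cell's occupancy diffuses to its four von
# Neumann neighbours; arrivals are weighted by a habitat-suitability
# layer, and occupancy leaving the grid edge is lost (absorbing boundary).

def step(occupancy, suitability, spread=0.2):
    """Advance the automaton one time step and return a new grid."""
    rows, cols = len(occupancy), len(occupancy[0])
    nxt = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            nxt[r][c] += occupancy[r][c] * (1.0 - spread)  # residents stay
            share = occupancy[r][c] * spread / 4.0
            for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    # Arrivals establish in proportion to the
                    # suitability of the destination cell.
                    nxt[rr][cc] += share * suitability[rr][cc]
    # Fuzzy occupancy is capped at full membership (1.0).
    return [[min(1.0, v) for v in row] for row in nxt]

if __name__ == "__main__":
    occ = [[0.0] * 5 for _ in range(5)]
    occ[2][2] = 1.0                       # single release point
    suit = [[1.0] * 5 for _ in range(5)]  # uniform habitat
    for _ in range(2):
        occ = step(occ, suit)
    print(round(occ[2][2], 3), round(occ[2][3], 3))
```

Replacing the uniform suitability grid with one derived from bioecological and climatic layers reproduces the habitat-dependent spread described in the abstract.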
Procedia PDF Downloads 77
10 Establishment of a Classifier Model for Early Prediction of Acute Delirium in Adult Intensive Care Unit Using Machine Learning
Authors: Pei Yi Lin
Abstract:
Objective: The objective of this study is to use machine learning methods to build an early prediction classifier model for acute delirium, in order to improve the quality of medical care for intensive care patients. Background: Delirium is a common acute and sudden disturbance of consciousness in critically ill patients. Once it occurs, it tends to prolong the length of hospital stay and increase medical costs and mortality. In 2021, the incidence of delirium in the internal medicine intensive care unit was as high as 59.78%, which indirectly prolonged the average length of hospital stay by 8.28 days; the associated mortality rate was about 2.22% over the past three years. Therefore, this study aims to build a delirium prediction classifier through big-data analysis and machine learning to detect delirium early. Method: This is a retrospective study that extracts delirium-related characteristic factors for intensive care unit patients from an artificial intelligence big-data database and uses them for machine learning. The study included patients aged over 20 who were admitted to the intensive care unit between May 1, 2022, and December 31, 2022, excluding those with a GCS score below 4, an ICU stay shorter than 24 hours, or no CAM-ICU evaluation. Each CAM-ICU delirium assessment, performed every 8 hours within 30 days of hospitalization, is regarded as an event, and the cumulative data from ICU admission to the prediction time point are used to predict the possibility of delirium occurring in the next 8 hours. A total of 63,754 case events were collected, and 12 features were selected to train the model, including age, sex, average ICU stay in hours, visual and auditory abnormalities, RASS score, APACHE-II score, number of indwelling invasive catheters, restraint use, and sedative and hypnotic drugs.
After feature cleaning, processing and KNN-imputation of missing values, a total of 54,595 case events were available for machine learning analysis. Events from May 1 to November 30, 2022, were used for model development, with 80% forming the training set and 20% the internal validation set; events from December 1 to December 31, 2022, formed the external validation set. Model inference and performance evaluation were then carried out, and the models were retrained with adjusted hyperparameters. Results: Four machine learning models were analyzed and compared: XGBoost, Random Forest, Logistic Regression, and Decision Tree. Internal validation performance was highest for Random Forest (AUC = 0.86); on external validation, Random Forest and XGBoost tied for the highest AUC (0.86); and average cross-validation accuracy was highest for Random Forest (ACC = 0.77). Conclusion: Clinically, medical staff usually conduct CAM-ICU assessments at the bedside of critically ill patients, but machine learning classification methods to support real-time assessment are lacking, so clinical staff do not have objective, continuous monitoring data to help them identify and predict delirium accurately. It is hoped that developing predictive models through machine learning will enable early, immediate prediction of delirium, support clinical decisions at the best time, and, in combination with PADIS delirium care measures, provide individualized non-pharmacological interventions that maintain patient safety and improve the quality of care.
Keywords: critically ill patients, machine learning methods, delirium prediction, classifier model
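The event construction described above (cumulative features from ICU admission up to each 8-hourly CAM-ICU assessment, labelled by whether delirium occurs in the next window) can be sketched as follows; the field names and feature choices are simplified assumptions, not the study's actual schema:

```python
# Sketch of rolling 8-hour prediction events: each assessment becomes
# one training example whose features summarise everything observed so
# far, and whose label is the delirium outcome of the NEXT assessment.

def build_events(assessments):
    """assessments: time-ordered list of dicts, each with assumed keys
    'rass' (sedation score), 'restrained' (0/1) and 'delirium' (0/1)."""
    events = []
    cum_rass, cum_restraint = 0.0, 0
    for i, a in enumerate(assessments[:-1]):          # need a "next" window
        cum_rass += a["rass"]
        cum_restraint += a["restrained"]
        events.append({
            "icu_hours": (i + 1) * 8,                 # 8 h between assessments
            "mean_rass": cum_rass / (i + 1),          # cumulative average
            "restraint_count": cum_restraint,
            "label": assessments[i + 1]["delirium"],  # delirium in next 8 h
        })
    return events
```

The resulting feature/label rows can then be fed to any of the four classifiers compared in the study.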
Procedia PDF Downloads 75
9 Sustainable Antimicrobial Biopolymeric Food & Biomedical Film Engineering Using Bioactive AMP-Ag+ Formulations
Authors: Eduardo Lanzagorta Garcia, Chaitra Venkatesh, Romina Pezzoli, Laura Gabriela Rodriguez Barroso, Declan Devine, Margaret E. Brennan Fournet
Abstract:
New antimicrobial interventions are urgently required to combat rising global health and medical infection challenges. Here, an innovative antimicrobial technology, providing price-competitive alternatives to antibiotics and readily integrable with current technological systems, is presented. Two cutting-edge antimicrobial materials, antimicrobial peptides (AMPs) and uncompromised sustained Ag+ action from triangular silver nanoplate (TSNP) reservoirs, are merged for versatile, effective antimicrobial action where current approaches fail. Antimicrobial peptides (AMPs) exist widely in nature and have recently been demonstrated to have broad-spectrum activity against bacteria, viruses, and fungi. TSNPs are highly discrete, homogeneous and readily functionalisable Ag+ nanoreservoirs with proven amenability for operation within a wide range of bio-based settings. In a design for advanced antimicrobial sustainable plastics, antimicrobial TSNPs are formulated for processing within biodegradable biopolymers. Histone H5 AMP was selected for its reported strong antimicrobial action and functionalized with the TSNPs (AMP-TSNP) in a similar fashion to previously reported TSNP biofunctionalisation methods. A synergy between the propensity of biopolymers for degradation and Ag+ release, combined with AMP activity, provides a novel mechanism for the sustained antimicrobial action of biopolymeric thin films. Nanoplates are transferred from the aqueous phase to an organic solvent in order to facilitate integration within hydrophobic polymers. Extrusion is used in combination with calendering rolls to create thin polymeric films with the nanoplates embedded in the surface. The resultant antibacterial functional films are suitable for adaptation to food packaging and biomedical applications. TSNPs were synthesized by adapting a previously reported seed-mediated approach.
TSNP synthesis was scaled up to litre-scale batch production and subsequently concentrated to 43 ppm using thermally controlled H2O removal. The nanoplates were transferred from the aqueous phase to an organic solvent to facilitate integration within hydrophobic polymers. This was accomplished by functionalizing the TSNPs with thiol-terminated polyethylene glycol and transferring them to chloroform by centrifugation. Polycaprolactone (PCL) and polylactic acid (PLA) were individually processed through extrusion, and TSNP and AMP-TSNP solutions were sprayed onto the polymer immediately after it exited the die. Calendering rolls were used to disperse and incorporate TSNP and AMP-TSNP onto the surface of the extruded films. Observation of the characteristic blue colour confirms the integrity of the TSNPs within the films. Antimicrobial tests were performed by incubating Gram-positive and Gram-negative strains with treated and non-treated films to evaluate whether bacterial growth was reduced by the presence of the TSNPs. The resulting films successfully incorporated TSNP and AMP-TSNP. Reduced bacterial growth was observed for both Gram-positive and Gram-negative strains with both TSNP and AMP-TSNP films compared with untreated films, indicating antimicrobial action. The largest growth reduction was observed for AMP-TSNP-treated films, demonstrating the additional antimicrobial activity due to the presence of the AMPs. The potential of this technology to impede bacterial activity on food industry and medical surfaces will forge new confidence in the battle against antibiotic-resistant bacteria, serving to greatly inhibit infections and facilitate patient recovery.
Keywords: antimicrobial, biodegradable, peptide, polymer, nanoparticle
Procedia PDF Downloads 116
8 Microfabrication and Non-Invasive Imaging of Porous Osteogenic Structures Using Laser-Assisted Technologies
Authors: Irina Alexandra Paun, Mona Mihailescu, Marian Zamfirescu, Catalin Romeo Luculescu, Adriana Maria Acasandrei, Cosmin Catalin Mustaciosu, Roxana Cristina Popescu, Maria Dinescu
Abstract:
A major concern in bone tissue engineering is to develop complex 3D architectures that mimic the cells' natural environment, facilitate cell growth in a defined manner and allow the flow transport of nutrients and metabolic waste. In particular, porous structures of controlled pore size and positioning are indispensable for growing human-like bone structures. Another concern is to monitor both the structures and the seeded cells with high spatial resolution and without interfering with the cells' natural environment. The present approach relies on laser-based technologies employed for fabricating porous biomimetic structures that support the growth of osteoblast-like cells and for their non-invasive 3D imaging. Specifically, the porous structures were built by two-photon polymerization direct writing (2PP-DW) of the commercially available photoresist IP-L780, using the Photonic Professional 3D lithography system. The structures consist of vertical tubes with micrometer-sized heights and diameters in a honeycomb-like spatial arrangement. These were fabricated by irradiating the IP-L780 photoresist with focused laser pulses with a wavelength centered at 780 nm, 120 fs pulse duration and 80 MHz repetition rate. The samples were precisely scanned in 3D by piezo stages, with coarse positioning done by XY motorized stages. The scanning path was programmed through a writing language (GWL) script developed by Nanoscribe. Following laser irradiation, the unexposed regions of the photoresist were washed out by immersing the samples in propylene glycol monomethyl ether acetate (PGMEA). The porous structures were seeded with osteoblast-like MG-63 cells, and their osteogenic potential was tested in vitro. The cell-seeded structures were analyzed in 3D using the digital holographic microscopy (DHM) technique. DHM is a marker-free, high-spatial-resolution imaging tool in which hologram acquisition is performed non-invasively, i.e.,
without interfering with the cells' natural environment. Following hologram recording, a digital algorithm provided a 3D image of the sample, as well as information about its refractive index, which is correlated with the intracellular content. The axial resolution of the images went down to the nanoscale, while the temporal scales ranged from milliseconds up to hours. Hologram acquisition did not involve sample scanning, and the whole image was available in a single frame covering a 200 μm field of view. The digital hologram processing provided 3D quantitative information on the porous structures and allowed a quantitative analysis of the cellular response with respect to the porous architectures. The cellular shape and dimensions were found to be influenced by the underlying micro-relief. Furthermore, the intracellular content gave evidence of the beneficial role of the porous structures in promoting osteoblast differentiation. Overall, the proposed laser-based protocol emerges as a promising tool for the fabrication and non-invasive imaging of porous constructs for bone tissue engineering. Acknowledgments: This work was supported by a grant of the Romanian Authority for Scientific Research and Innovation, CNCS-UEFISCDI, project PN-II-RU-TE-2014-4-2534 (contract 97 from 01/10/2015) and by UEFISCDI PN-II-PT-PCCA no. 6/2012. A part of this work was performed in the CETAL laser facility, supported by the National Program PN 16 47 - LAPLAS IV.
Keywords: biomimetic, holography, laser, osteoblast, two photon polymerization
Procedia PDF Downloads 273
7 Supplier Carbon Footprint Methodology Development for Automotive Original Equipment Manufacturers
Authors: Nur A. Özdemir, Sude Erkin, Hatice K. Güney, Cemre S. Atılgan, Enes Huylu, Hüseyin Y. Altıntaş, Aysemin Top, Özak Durmuş
Abstract:
Carbon emissions produced during a product's life cycle, from the extraction of raw materials up to waste disposal and market consumption activities, are the major contributors to global warming. In the light of the science-based targets (SBT) leading the way to a zero-carbon economy for the sustainable growth of companies, carbon footprint reporting of purchased goods has become critical for identifying hotspots and best practices for emission reduction opportunities. In line with Ford Otosan's corporate sustainability strategy, research was conducted to evaluate the carbon footprint of purchased products in accordance with Scope 3 of the Greenhouse Gas Protocol (GHG). The purpose of this paper is to develop a systematic and transparent methodology to calculate the carbon footprint of products produced by automotive OEMs (Original Equipment Manufacturers) within the context of automotive supply chain management. To begin with, primary material data were collected through IMDS (International Material Data System) corresponding to the company's three distinct vehicle types: Light Commercial Vehicle (Courier), Medium Commercial Vehicle (Transit and Transit Custom) and Heavy Commercial Vehicle (F-MAX). The obtained material data were classified as metals, plastics, liquids, electronics, and others to get insights into the overall material distribution of the produced vehicles, and matched to the SimaPro ecoinvent 3 database, one of the most extensive databases for modelling material data related to the product life cycle. The product life cycle analysis was carried out within the framework of the ISO 14040-14044 standards, addressing their requirements and procedures. A comprehensive literature review and cooperation with suppliers were undertaken to identify the production methods of the parts used in the vehicles and to find out the amount of scrap generated during part production.
Cumulative weight and material information, together with the associated production processes for the components, were compiled and scaled by current sales figures. The results of the study provide a key model of the carbon footprint of products and processes, based on a scientific approach to drive sustainable growth by setting straightforward, science-based emission reduction targets. Hence, this study aims to identify the hotspots and, correspondingly, to inform how carbon footprint estimates can be integrated into the company's supply chain management by defining convenient actions in line with climate science. According to the emission values arising from the production phase, including raw material extraction and material processing, for the Ford Otosan vehicles studied here, GHG emissions from the production of the metals used for the HCV, MCV and LCV account for more than half of the carbon footprint of vehicle production. Correspondingly, aluminium and steel have the largest share among all material types, and achieving carbon neutrality in the steel and aluminium industries is of great significance to the world, with an immense impact on the automobile industry. A strategic product sustainability plan, which includes the use of secondary materials, conversion to green energy and low-energy process design, is required to reduce the emissions of steel, aluminium, and plastics, given the projected increase in total volume by 2030.
Keywords: automotive, carbon footprint, IMDS, scope 3, SimaPro, sustainability
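The per-part roll-up described above can be sketched as follows; the emission factors and part data are invented placeholders for illustration, not ecoinvent or Ford Otosan values:

```python
# Cradle-to-gate footprint roll-up: per-part emissions = part mass,
# grossed up for production scrap, times the material's kg-CO2e/kg
# factor; vehicle totals are then scaled by sales volume.

FACTORS = {"steel": 2.1, "aluminium": 8.2, "plastic": 3.0}  # assumed kg CO2e / kg

def part_footprint(mass_kg, material, scrap_rate):
    """Cradle-to-gate kg CO2e for one part, including scrapped material."""
    gross_mass = mass_kg / (1.0 - scrap_rate)  # material actually processed
    return gross_mass * FACTORS[material]

def vehicle_footprint(bom, sales_volume=1):
    """bom: list of (mass_kg, material, scrap_rate) tuples for one vehicle."""
    per_vehicle = sum(part_footprint(*p) for p in bom)
    return per_vehicle * sales_volume
```

Summing such totals per material class is what surfaces the metal-dominated hotspots reported in the study.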
Procedia PDF Downloads 108
6 Industrial Waste to Energy Technology: Engineering Biowaste as High Potential Anode Electrode for Application in Lithium-Ion Batteries
Authors: Pejman Salimi, Sebastiano Tieuli, Somayeh Taghavi, Michela Signoretto, Remo Proietti Zaccaria
Abstract:
The growing quantity of industrial waste leads to numerous environmental and economic challenges, such as climate change, soil and water contamination, and human disease. Energy recovery from waste can be applied to produce heat or electricity. This strategy reduces the energy produced using coal or other fuels and directly reduces greenhouse gas emissions. Among manufacturing sectors, the leather industry plays a very important role worldwide from a socio-economic point of view. Even though the leather industry uses a by-product of the meat industry as its raw material, it is considered an activity demanding integrated prevention and control of pollution. Along the entire process from raw skins/hides to finished leather, a huge amount of solid and water waste is generated. Solid wastes include fleshings, raw trimmings, shavings, buffing dust, etc. One of the most abundant solid wastes generated throughout leather tanning is shaving waste. Leather shaving is a mechanical process that aims at reducing the tanned skin to a specific thickness before subsequent processing and finishing. This waste consists mainly of collagen and tanning agent. At present, most of the world's leather processing is chrome-tanned. Consequently, large amounts of chromium-containing shaving waste need to be treated. The major concern in the management of this kind of solid waste is its chromium content, which makes conventional disposal methods, such as landfilling and incineration, impracticable. Therefore, many efforts have been made in recent decades to promote eco-friendly/alternative leather production and more effective waste management.
Herein, shaving waste resulting from a metal-free tanning technology is proposed as a low-cost precursor for the preparation of carbon materials as anodes for lithium-ion batteries (LIBs). In line with the philosophy of reduced environmental impact, fully sustainable and environmentally friendly LIB anodes were prepared using deionized water and carboxymethyl cellulose (CMC) as alternatives to the toxic/teratogenic N-methyl-2-pyrrolidone (NMP) and the biologically hazardous polyvinylidene fluoride (PVdF), respectively. Furthermore, moving towards reduced cost, we employed the water solvent and the fluoride-free bio-derived CMC binder together with LiFePO₄ (LFP) when a full cell was considered. These actions bring the 2030 goal of green LIBs at 100 $ kW h⁻¹ closer. Besides, the preparation of the water-based electrodes does not require a controlled environment, and because of the higher vapour pressure of water in comparison with NMP, water-based electrode drying is much faster. This has an important consequence, namely reduced energy consumption during electrode preparation. The electrode derived from leather waste demonstrated a discharge capacity of 735 mAh g⁻¹ after 1000 charge and discharge cycles at 0.5 A g⁻¹. This promising performance is ascribed to the synergistic effect of defects, interlayer spacing, heteroatom doping (N, O, and S), high specific surface area, and the hierarchical micro/mesopore structure of the biochar. Interestingly, these features of activated biochars derived from the leather industry open the way for possible applications in other EESDs as well.
Keywords: biowaste, lithium-ion batteries, physical activation, waste management, leather industry
Procedia PDF Downloads 170
5 Non Pharmacological Approach to IBS (Irritable Bowel Syndrome)
Authors: A. Aceranti, L. Moretti, S. Vernocchi, M. Colorato, P. Caristia
Abstract:
Irritable bowel syndrome (IBS) is the association of abdominal pain, abdominal distension and intestinal dysfunction over recurring periods. About 10% of the world's population has IBS at any given time in their life, and about 200 people per 100,000 receive an initial diagnosis of IBS each year. Persistent pain is recognized as one of the most pervasive and challenging problems facing the medical community today, and is considered more a complex pathophysiological, diagnostic and therapeutic situation than a persistent symptom. The low efficacy of conventional drug treatments has led many doctors to take an interest in non-drug alternative treatments for IBS, especially for more severe cases. Patients and providers are often dissatisfied with the available drug remedies and often seek complementary and alternative medicine (CAM), a unique and holistic approach to treatment that is not a typical component of conventional medicine. Osteopathic treatment may be of specific interest in patients with IBS. Osteopathy is a complementary health approach that emphasizes the role of the musculoskeletal system in health and promotes optimal function of the body's tissues, using a variety of manual techniques to improve body function. Osteopathy has been defined as a patient-centered health discipline based on the principles of the interrelation between body structure and function, the body's innate capacity for self-healing and the adoption of a whole-person approach to health, practiced mainly through manual techniques. Studies have reported that osteopathic manual treatment (OMT) reduced IBS symptoms, such as abdominal pain, constipation and diarrhea, and improved general well-being. The focus in the treatment of IBS with osteopathy has gone beyond simple spinal alignment, to directly address the abnormal physiology of the body using a series of direct and indirect techniques.
The topic of this study was chosen for several reasons: the large number of people who suffer from this disorder, and the dysfunction itself, since there is still little clarity today about the best type of treatment and, above all, about its origin. The visceral component in the osteopathic field is still a world to be discovered; although it is relevant to a large proportion of patients and touches on numerous disciplines, it remains an enigma yet to be solved. The study originated in teaching practice, out of curiosity about a topic that, even today, no one is able to explain fully or, above all, cure definitively. The main purpose of this study is to lay a sound foundation in the osteopathic discipline for subsequent, more exhaustive studies, resolving some doubts about which treatment modality is most relevant. The study was structured so that three types of osteopathic treatment were used on three groups of people, selected after completing a questionnaire that deemed them suitable for the study. They were divided into three groups: - the first group was given a visceral osteopathic treatment; - the second group was given a manual osteopathic treatment of neurological stimulation; - the third group received a placebo treatment. At the end of the treatment, the questionnaires were re-administered one week after the session and one month after the treatment, and the resulting data were collected to assess the effectiveness, or otherwise, of the treatment received. The sample of 50 patients examined underwent an oral interview to evaluate the inclusion and exclusion criteria for participation in the study. Of the 50 patients interviewed, 17 people who underwent the different osteopathic techniques were eligible for the study.
Comparing the data from the first assessment of tenderness and symptom frequency with the data from the first follow-up shows a significant improvement in the scores assigned to the different questions, especially in the neurogenic and visceral groups. We are aware that this is a study performed on a small sample of patients, which is a limiting factor. We remain convinced, however, that having obtained good results in terms of subjective improvement in the subjects' quality of life, it would be very interesting to repeat the study on a larger sample and fill the gaps.
Keywords: IBS, osteopathy, colon, intestinal inflammation
Procedia PDF Downloads 101
4 Hybrid GNN Based Machine Learning Forecasting Model For Industrial IoT Applications
Authors: Atish Bagchi, Siva Chandrasekaran
Abstract:
Background: According to World Bank national accounts data, the estimated global manufacturing value-added output in 2020 was 13.74 trillion USD. These manufacturing processes are monitored, modelled, and controlled by advanced, real-time, computer-based systems, e.g., Industrial IoT, PLC, SCADA, etc. These systems measure and manipulate a set of physical variables, e.g., temperature, pressure, etc. Despite the use of IoT, SCADA, etc., in manufacturing, studies suggest that unplanned downtime leads to economic losses of approximately 864 billion USD each year. Therefore, real-time, accurate detection, classification and prediction of machine behaviour are needed to minimise financial losses. Although vast literature exists on time-series data processing using machine learning, the challenges faced by industries that lead to unplanned downtimes are: the current algorithms do not efficiently handle the high-volume streaming data from industrial IoT sensors and were tested on static and simulated datasets; while the existing algorithms can detect significant 'point' outliers, most do not handle contextual outliers (e.g., values within the normal range but occurring at an unexpected time of day) or subtle changes in machine behaviour; and machines are revamped periodically as part of planned maintenance programmes, which changes the assumptions on which the original AI models were created and trained. Aim: This research study aims to deliver a Graph Neural Network (GNN)-based hybrid forecasting model that interfaces with the real-time machine control system and can detect and predict machine behaviour and behavioural changes (anomalies) in real time. This research will help manufacturing industries and utilities, e.g., water, electricity, etc., reduce unplanned downtimes and the consequential financial losses.
Method: The data stored within a process control system, e.g., Industrial IoT or a Data Historian, are generally sampled during data acquisition from the sensor (source) and when persisting in the Data Historian, to optimise storage and query performance. The sampling may inadvertently discard values that contain subtle aspects of behavioural changes in machines. This research proposes a hybrid forecasting and classification model which combines the expressive and extrapolation capability of a GNN, enhanced with estimates of entropy and spectral changes in the sampled data and additional temporal context, to reconstruct the likely temporal trajectory of machine behavioural changes. The proposed real-time model belongs to the deep learning category of machine learning and interfaces with the sensors directly or through a 'Process Data Historian', SCADA, etc., to perform forecasting and classification tasks. Results: The model was interfaced with a Data Historian holding time-series data from four flow sensors within a water treatment plant for 45 days. The recorded sampling interval for a sensor varied from 10 sec to 30 min. Approximately 65% of the available data was used for training the model, 20% for validation, and the rest for testing. The model identified the anomalies within the water treatment plant and predicted the plant's performance. These results were compared with the data reported by the plant's SCADA-Historian system and the official data reported by the plant authorities. The model's accuracy was much higher (by 20%) than that reported by the SCADA-Historian system and matched the validated results declared by the plant auditors. Conclusions: The research demonstrates that a hybrid GNN-based approach enhanced with entropy calculation and spectral information can effectively detect and predict a machine's behavioural changes.
The model can interface with a plant's process control system in real time to perform forecasting and classification tasks, aiding asset management engineers in operating their machines more efficiently and reducing unplanned downtimes. A series of trials of this model is planned in other manufacturing industries in the future.
Keywords: GNN, entropy, anomaly detection, industrial time-series, AI, IoT, Industry 4.0, machine learning
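The entropy and spectral-change estimates that augment the GNN in the hybrid model above can be sketched as follows; the window handling, binning and change criterion are illustrative assumptions, not the study's tuned implementation:

```python
# Shannon entropy and dominant-frequency estimates over a window of
# sensor readings; a sharp move in either signals a behavioural change.
import cmath
import math

def shannon_entropy(window, bins=8):
    """Entropy (bits) of a histogram of the window's values."""
    lo, hi = min(window), max(window)
    if hi == lo:
        return 0.0  # constant signal carries no surprise
    counts = [0] * bins
    for v in window:
        idx = min(int((v - lo) / (hi - lo) * bins), bins - 1)
        counts[idx] += 1
    n = len(window)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

def dominant_frequency(window):
    """DFT bin index with the largest non-DC magnitude (naive O(n^2) DFT)."""
    n = len(window)
    mags = []
    for k in range(1, n // 2 + 1):
        s = sum(window[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        mags.append(abs(s))
    return 1 + mags.index(max(mags))

def behaviour_shift(prev, curr, entropy_threshold=1.0):
    """Flag a change when either estimate moves sharply between windows."""
    return (abs(shannon_entropy(curr) - shannon_entropy(prev)) > entropy_threshold
            or dominant_frequency(curr) != dominant_frequency(prev))
```

In the full model such per-window features would be fed to the GNN alongside the sampled values, restoring behavioural detail that sampling may have discarded.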
Procedia PDF Downloads 150
3 The Integration of Digital Humanities into the Sociology of Knowledge Approach to Discourse Analysis
Authors: Gertraud Koch, Teresa Stumpf, Alejandra Tijerina García
Abstract:
Discourse analysis research approaches belong to the central research strategies applied throughout the humanities; they focus on the countless forms and ways digital texts and images shape present-day notions of the world. Despite the constantly growing number of relevant digital, multimodal discourse resources, digital humanities (DH) methods are thus far not systematically developed and accessible for discourse analysis approaches. Specifically, the significance of multimodality and meaning plurality modelling are yet to be sufficiently addressed. In order to address this research gap, the D-WISE project aims to develop a prototypical working environment as digital support for the sociology of knowledge approach to discourse analysis and new IT-analysis approaches for the use of context-oriented embedding representations. Playing an essential role throughout our research endeavor is the constant optimization of hermeneutical methodology in the use of (semi)automated processes and their corresponding epistemological reflection. Among the discourse analyses, the sociology of knowledge approach to discourse analysis is characterised by the reconstructive and accompanying research into the formation of knowledge systems in social negotiation processes. The approach analyses how dominant understandings of a phenomenon develop, i.e., the way they are expressed and consolidated by various actors in specific arenas of discourse until a specific understanding of the phenomenon and its socially accepted structure are established. This article presents insights and initial findings from D-WISE, a joint research project running since 2021 between the Institute of Anthropological Studies in Culture and History and the Language Technology Group of the Department of Informatics at the University of Hamburg. 
As an interdisciplinary team, we develop central innovations with regard to the availability of relevant DH applications by building up a uniform working environment, which supports the procedure of the sociology of knowledge approach to discourse analysis within open corpora and heterogeneous, multimodal data sources for researchers in the humanities. We are hereby expanding the existing range of DH methods by developing contextualized embeddings for improved modelling of the plurality of meaning and the integrated processing of multimodal data. The alignment of this methodological and technical innovation is based on the epistemological working methods according to grounded theory as a hermeneutic methodology. In order to systematically relate, compare, and reflect on the approaches of structural-IT and hermeneutic-interpretative analysis, the discourse analysis is carried out both manually and digitally. Using the example of current discourses on digitization in the healthcare sector and the associated issues regarding data protection, we have manually built an initial data corpus in which the relevant actors and discourse positions are analysed in conventional qualitative discourse analysis. At the same time, we are building an extensive digital corpus on the same topic based on the use and further development of entity-centered research tools such as topic crawlers and automated newsreaders. In addition to the text material, this consists of multimodal sources such as images, video sequences, and apps. In a blended reading process, the data material is filtered, annotated, and finally coded with the help of NLP tools such as dependency parsing, named entity recognition, co-reference resolution, entity linking, sentiment analysis, and other project-specific tools that are being adapted and developed. The coding process is carried out (semi-)automatically by programs that propose coding paradigms based on the calculated entities and their relationships.
Simultaneously, these can be specifically trained by manual coding in a closed reading process and specified according to the content issues. Overall, this approach enables purely qualitative, fully automated, and semi-automated analyses to be compared and reflected upon.
Keywords: entanglement of structural IT and hermeneutic-interpretative analysis, multimodality, plurality of meaning, sociology of knowledge approach to discourse analysis
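The shape of the (semi-)automated coding step, i.e. programs proposing codes from calculated entities and their relationships for a human to confirm, can be sketched with a deliberately naive toy. The D-WISE project uses proper NER and entity linking; the regex "entity extractor", corpus sentences, and function names below are illustrative assumptions only:

```python
import re
from collections import Counter
from itertools import combinations

def naive_entities(sentence):
    """Toy stand-in for named entity recognition: capitalised word spans."""
    return re.findall(r"\b(?:[A-Z][a-z]+(?:\s[A-Z][a-z]+)*)", sentence)

def propose_codes(sentences, min_support=2):
    """Propose coding candidates from entity co-occurrence counts,
    mimicking a program that suggests codes for manual confirmation."""
    pair_counts = Counter()
    for s in sentences:
        ents = sorted(set(naive_entities(s)))
        pair_counts.update(combinations(ents, 2))
    return [pair for pair, n in pair_counts.items() if n >= min_support]

corpus = [
    "Germany's Health Ministry announced rules on Data Protection.",
    "Data Protection advocates criticised the Health Ministry draft.",
    "Hospitals welcomed the Digitization push.",
]
print(propose_codes(corpus))  # entity pairs seen in >= 2 sentences
```

In the blended reading workflow, such proposals would then be accepted, rejected, or refined during the manual closed-reading pass.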
Procedia PDF Downloads 226
2 Enhancing Disaster Resilience: Advanced Natural Hazard Assessment and Monitoring
Authors: Mariza Kaskara, Stella Girtsou, Maria Prodromou, Alexia Tsouni, Christodoulos Mettas, Stavroula Alatza, Kyriaki Fotiou, Marios Tzouvaras, Charalampos Kontoes, Diofantos Hadjimitsis
Abstract:
Natural hazard assessment and monitoring are crucial in managing the risks associated with fires, floods, and geohazards, particularly in regions prone to these natural disasters, such as Greece and Cyprus. Recent advancements in technology, developed by the BEYOND Center of Excellence of the National Observatory of Athens, have been successfully applied in Greece and are now set to be transferred to Cyprus. The implementation of these advanced technologies in Greece has significantly improved the country's ability to respond to these natural hazards. For wildfire risk assessment, a scalar wildfire occurrence risk index is created based on the predictions of machine learning models. Predicting fire danger is crucial for the sustainable management of forest fires as it provides essential information for designing effective prevention measures and facilitating response planning for potential fire incidents. A reliable forecast of fire danger is a key component of integrated forest fire management and is heavily influenced by various factors that affect fire ignition and spread. The fire risk model is validated using sensitivity and specificity metrics. For flood risk assessment, a multi-faceted approach is employed, including the application of remote sensing techniques, the collection and processing of data from the most recent population and building census, technical studies and field visits, as well as hydrological and hydraulic simulations. All input data are used to create precise flood hazard maps according to various flooding scenarios, detailed flood vulnerability and flood exposure maps, which will finally produce the flood risk map. Critical points are identified, and mitigation measures are proposed for the worst-case scenario, namely, refuge areas are defined, and escape routes are designed. Flood risk maps can assist in raising awareness and saving lives.
Validation is carried out through historical flood events using remote sensing data and records from the civil protection authorities. For geohazards monitoring (e.g., landslides, subsidence), Synthetic Aperture Radar (SAR) and optical satellite imagery are combined with geomorphological and meteorological data and other landslide/ground deformation contributing factors. To monitor critical infrastructures, including dams, advanced InSAR methodologies are used for identifying surface movements through time. Monitoring these hazards provides valuable information for understanding processes and could lead to early warning systems to protect people and infrastructure. Validation is carried out through both geotechnical expert evaluations and visual inspections. The success of these systems in Greece has paved the way for their transfer to Cyprus to enhance Cyprus's capabilities in natural hazard assessment and monitoring. This transfer is being made through capacity building activities, fostering continuous collaboration between Greek and Cypriot experts. Apart from the knowledge transfer, small demonstration actions are implemented to showcase the effectiveness of these technologies in real-world scenarios. In conclusion, the transfer of advanced natural hazard assessment technologies from Greece to Cyprus represents a significant step forward in enhancing the region's resilience to disasters. The EXCELSIOR project funds knowledge exchange, demonstration actions, and capacity-building activities and is committed to empowering Cyprus with the tools and expertise to effectively manage and mitigate the risks associated with these natural hazards. Acknowledgement: The authors acknowledge the 'EXCELSIOR' (ERATOSTHENES: Excellence Research Centre for Earth Surveillance and Space-Based Monitoring of the Environment) H2020 Widespread Teaming project.
Keywords: earth observation, monitoring, natural hazards, remote sensing
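The sensitivity and specificity metrics used to validate the fire risk model are standard confusion-matrix quantities and can be sketched as follows; the sample labels are invented for illustration, not the project's data:

```python
def sensitivity_specificity(y_true, y_pred):
    """Validation metrics for a binary fire-danger classifier:
    sensitivity = TP / (TP + FN), specificity = TN / (TN + FP)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# 1 = fire occurred, 0 = no fire (illustrative values)
observed  = [1, 1, 1, 0, 0, 0, 0, 1]
predicted = [1, 1, 0, 0, 0, 1, 0, 1]
sens, spec = sensitivity_specificity(observed, predicted)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
```

High sensitivity means few missed fire days; high specificity means few false alarms, which is why both are reported together for fire-danger forecasting.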
Procedia PDF Downloads 38
1 Impacts of Transformational Leadership: Petronas Stations in Sabah, Malaysia
Authors: Lizinis Cassendra Frederick Dony, Jirom Jeremy Frederick Dony, Cyril Supain Christopher
Abstract:
The purpose of this paper is to improve devotion to leadership through HR practices implementation at PETRONAS stations. It emphasizes the importance of personal grooming and customer care hospitality training for front-line individuals and teams at PETRONAS stations in Sabah. Based on the International Leadership Journal (Thomas Edison State College), theory, research, education, and development practice and application to all organizational phenomena may affect or be affected by leadership. FINDINGS – PETRONAS, short for Petroliam Nasional Berhad, is a Malaysian oil and gas company that was founded on August 17, 1974. Wholly owned by the Government of Malaysia, the corporation is vested with the entire oil and gas resources in Malaysia and is entrusted with the responsibility of developing and adding value to these resources. Fortune ranked PETRONAS as the 68th largest company in the world in 2012, the 12th most profitable company in the world, and the most profitable in Asia. As of the end of March 2005, the PETRONAS Group comprised 103 wholly owned subsidiaries, 19 partly owned outfits, and 57 associated companies. The group is engaged in a wide spectrum of petroleum activities, from upstream exploration and production of oil and gas to downstream oil refining; marketing and distribution of petroleum products; trading; gas processing and liquefaction; gas transmission pipeline network operations; marketing of liquefied natural gas; petrochemical manufacturing and marketing; shipping; automotive engineering; and property investment. PETRONAS has been growing its marketing channels in a competitive market, combining resources to pursue common goals. PETRONAS provides university students in Malaysia the opportunity to carry out 6-8 month Industrial Training job placements.
The Industrial Training exposes students to a real working environment, acting on behalf of the General Manager for almost one year. Thus, management education and reward incentive schemes have inspired the working teams to develop good leadership. Furthermore, knowledge and experience are very important in human capital development and transformation. PETRONAS's achievement was analysed in SPSS using 280 questionnaires, with a further 81 questionnaires tabulated in Excel, distributed through face-to-face interviews with customers, PETRONAS dealers, and front-desk staff at 17 stations in Kota Kinabalu, Sabah. Hence, this research study aims to improve service quality innovation and optimize business sustainability performance. ORIGINALITY / VALUE – The impact of transformational leadership practices has influenced the working teams' behaviour as brand ambassadors of PETRONAS. Finally, the correlation findings indicated that PETRONAS stations need more HR practices to deploy more customer care retention resources in mitigating the business challenges in the oil and gas industry. Therefore, as the business operates amid stiff global competition (Cooper, 2006; Marques and Simon, 2006), it is crucial that team management be capable of minimizing noise risk, financial risk, and any other risks as a whole at the optimum level. CONCLUSION – This research found that both transformational and transactional contingent reward leadership were positively correlated with ratings of platoon potency, and ratings of leadership for the platoon leader and sergeant were moderately intercorrelated. Based on these findings, we recommend that PETRONAS management offer quality team management at PETRONAS stations through a broader variety of leadership training specializations to improve operational efficiency in front-desk customer care hospitality.
By establishing the reliability and validity of job experiences, PETRONAS leverages diverse teamwork and cross-collaboration. Beyond this leveraging factor, PETRONAS will also strengthen the effectiveness of interpersonal front liners and enhance the quality of interaction through effective communication. Finally, numerous CSR correlation studies regress PETRONAS performance on corporate social performance and several control variables. CSR model activities can be mis-specified if they are not controlled for R&D, as is evident in various feedback collected from the local communities; the younger generation is inclined to higher financial expectations of PETRONAS. Nevertheless, CSR has created a huge impact on nation building as part of the company's social adaptability, reaching beyond its business stakeholders' satisfaction in Sabah.
Keywords: human resources practices implementation (hrpi), source of competitive advantage in people's development (socaipd), corporate social responsibility (csr), service quality at front desk stations (sqafd), impacts of petronas leadership (iopl)
Procedia PDF Downloads 349