Search results for: black rice bran extract

242 Utilizing Topic Modelling for Assessing mHealth App's Risks to Users' Health before and during the COVID-19 Pandemic

Authors: Pedro Augusto Da Silva E Souza Miranda, Niloofar Jalali, Shweta Mistry

Abstract:

BACKGROUND: Software developers utilize automated solutions to scrape users' reviews to extract meaningful knowledge to identify problems (e.g., bugs, compatibility issues) and possible enhancements (e.g., users' requests) to their solutions. However, most of these solutions do not consider the health risk aspects for users. Recent works have shed light on the importance of including health risk considerations in the development cycle of mHealth apps to prevent harm to their users. PROBLEM: The COVID-19 pandemic in Canada (and the world) is currently forcing physical distancing upon the general population. This new lifestyle has made the usage of mHealth applications more essential than ever, with a projected market forecast of 332 billion dollars by 2025. However, this surge in mHealth usage comes with possible risks to users' health due to mHealth app problems (e.g., a wrong insulin dosage indication due to a UI error). OBJECTIVE: This work aims to raise awareness amongst mHealth developers of the importance of considering risks to users' health within their development lifecycle. Moreover, this work also aims to help mHealth developers with a Proof-of-Concept (POC) solution to understand, process, and identify possible health risks to users of mHealth apps based on users' reviews. METHODS: We conducted a mixed-method study design. We developed a crawler to mine the negative reviews of two sample mHealth apps (my fitness, medisafe) from Google Play Store users. For each mHealth app, we performed the following steps: (1) the reviews were divided into two groups, before COVID-19 (submission date before 15 Feb 2019) and during COVID-19 (submission date from 16 Feb 2019 till Dec 2020); (2) for each period, the Latent Dirichlet Allocation (LDA) topic model was used to identify the different clusters of reviews based on similar topics; (3) the topics before and during COVID-19 were compared, and significant differences in the frequency and severity of similar topics were identified. RESULTS: We successfully scraped, filtered, processed, and identified health-related topics in both qualitative and quantitative approaches. The results demonstrated the similarity between topics before and during COVID-19.
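A minimal sketch of the period-split LDA step described above, assuming scikit-learn and pandas; the file name, column names, cutoff handling, and topic count are illustrative assumptions rather than details taken from the study:

```python
import pandas as pd
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical export of the review crawler: one row per negative review.
reviews = pd.read_csv("negative_reviews.csv", parse_dates=["date"])
cutoff = pd.Timestamp("2019-02-15")
periods = {
    "before_covid": reviews[reviews["date"] <= cutoff],
    "during_covid": reviews[reviews["date"] > cutoff],
}

for name, subset in periods.items():
    # Bag-of-words document-term matrix for this period's reviews.
    vectorizer = CountVectorizer(stop_words="english", max_features=5000)
    dtm = vectorizer.fit_transform(subset["review_text"])

    # One LDA model per period; 10 topics is an arbitrary illustrative choice.
    lda = LatentDirichletAllocation(n_components=10, random_state=0)
    lda.fit(dtm)

    terms = vectorizer.get_feature_names_out()
    for k, weights in enumerate(lda.components_):
        top = [terms[i] for i in weights.argsort()[-8:][::-1]]
        print(f"{name} topic {k}: " + ", ".join(top))
```

Comparing the top terms of the two fitted models, topic by topic, is what supports the frequency and severity comparison described in the abstract.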

Keywords: natural language processing (NLP), topic modeling, mHealth, COVID-19, software engineering, telemedicine, health risks

Procedia PDF Downloads 121
241 Perspectives and Challenges of a Functional Bread With Yeast Extract to Improve the Human Diet

Authors: Cláudia Patrocínio, Beatriz Fernandes, Ana Filipa Pires

Abstract:

Background: Mirror therapy (MT) is used to improve motor function after stroke. During MT, a mirror is placed between the two upper limbs (UL), thus reflecting movements of the non-affected side as if it were the affected side. Objectives: The aim of this review is to analyze the evidence on the effectiveness of MT in the recovery of UL function in individuals with chronic stroke. Methods: The literature search was carried out in PubMed, ISI Web of Science, and the PEDro database. Inclusion criteria: a) studies that include individuals diagnosed with stroke for at least 6 months; b) intervention with MT in the UL or comparing it with other interventions; c) articles published until 2023; d) articles published in English or Portuguese; e) randomized controlled studies. Exclusion criteria: a) animal studies; b) studies that do not provide a detailed description of the intervention; c) studies using central electrical stimulation. The methodological quality of the included studies was assessed using the Physiotherapy Evidence Database (PEDro) scale. Studies with < 4 on the PEDro scale were excluded. Eighteen studies met all the inclusion criteria. Main results and conclusions: The quality of the studies varied between 5 and 8. One article compared muscular strength training (MST) with vs. without MT, and four articles compared the use of MT vs. conventional therapy (CT); one study compared extracorporeal shock therapy (EST) with and without MT, and another study compared functional electrical stimulation (FES), MT, and biofeedback; three studies compared MT with Mesh Glove (MG) or sham therapy, five articles compared performing bimanual exercises with and without MT, and three studies compared MT with virtual reality (VR) or robot training (RT). The assessment of changes in function and structure (International Classification of Functioning, Disability and Health parameter) was carried out in each article mainly using the Fugl-Meyer Assessment-Upper Limb scale; activity and participation (International Classification of Functioning, Disability and Health parameters) were evaluated using different scales in each study. Overall, positive results were seen in these parameters. Results suggest that MT combined with other therapies is more effective for motor recovery and function of the affected UL than these techniques alone, although the results were modest in most of the included studies. There is also a more significant improvement in the distal movements of the affected hand than in the rest of the UL.

Keywords: physical therapy, mirror therapy, chronic stroke, upper limb, hemiplegia

Procedia PDF Downloads 42
240 Optimization of Perfusion Distribution in Custom Vascular Stent-Grafts Through Patient-Specific CFD Models

Authors: Scott M. Black, Craig Maclean, Pauline Hall Barrientos, Konstantinos Ritos, Asimina Kazakidi

Abstract:

Aortic aneurysms and dissections are leading causes of death in cardiovascular disease. Both inevitably lead to hemodynamic instability without surgical intervention in the form of vascular stent-graft deployment. An accurate description of the aortic geometry and blood flow in patient-specific cases is vital for treatment planning and long-term success of such grafts, as they must generate physiological branch perfusion and in-stent hemodynamics. The aim of this study was to create patient-specific computational fluid dynamics (CFD) models through a multi-modality, multi-dimensional approach with boundary condition optimization to predict branch flow rates and in-stent hemodynamics in custom stent-graft configurations. Three-dimensional (3D) thoracoabdominal aortae were reconstructed from four-dimensional flow-magnetic resonance imaging (4D Flow-MRI) and computed tomography (CT) medical images. The former employed a novel approach to generate and enhance vessel lumen contrast via through-plane velocity at discrete, user-defined cardiac time steps post-hoc. To produce patient-specific boundary conditions (BCs), the aortic geometry was reduced to a one-dimensional (1D) model. Thereafter, a zero-dimensional (0D) 3-Element Windkessel model (3EWM) was coupled to each terminal branch to represent the distal vasculature. In this coupled 0D-1D model, the 3EWM parameters were optimized to yield branch flow waveforms which are representative of the 4D Flow-MRI-derived in-vivo data. Thereafter, a 0D-3D CFD model was created, utilizing the optimized 3EWM BCs and a 4D Flow-MRI-obtained inlet velocity profile. A sensitivity analysis on the effects of stent-graft configuration and BC parameters was then undertaken using multiple stent-graft configurations and a range of distal vasculature conditions. 4D Flow-MRI granted unparalleled visualization of blood flow throughout the cardiac cycle in both the pre- and post-surgical states. Segmentation and reconstruction of healthy and stented regions from retrospective 4D Flow-MRI images also generated 3D models with geometries which were successfully validated against their CT-derived counterparts. 0D-1D coupling efficiently captured branch flow and pressure waveforms, while 0D-3D models also enabled 3D flow visualization and quantification of clinically relevant hemodynamic parameters for in-stent thrombosis and graft limb occlusion. It was apparent that changes in 3EWM BC parameters had a pronounced effect on perfusion distribution and near-wall hemodynamics. Results show that the 3EWM parameters could be iteratively changed to simulate a range of graft limb diameters and distal vasculature conditions for a given stent-graft to determine the optimal configuration prior to surgery. To conclude, this study outlined a methodology to aid in the prediction of post-surgical branch perfusion and in-stent hemodynamics in patient-specific cases for the implementation of custom stent-grafts.
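For reference, the 3-Element Windkessel model coupled to each terminal branch relates the outlet flow Q(t) and pressure P(t) through a proximal (characteristic) resistance R_p, a distal resistance R_d, and a compliance C. In its standard textbook form (the study's exact parameterisation and coupling scheme may differ):

\[
\left(1 + \frac{R_p}{R_d}\right) Q(t) + R_p C \,\frac{dQ(t)}{dt} \;=\; \frac{P(t)}{R_d} + C\,\frac{dP(t)}{dt}
\]

Tuning (R_p, R_d, C) for each branch is what allows the simulated flow waveforms to be matched to the 4D Flow-MRI-derived targets.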

Keywords: 4D flow-MRI, computational fluid dynamics, vascular stent-grafts, windkessel

Procedia PDF Downloads 167
239 Moderate Electric Field Influence on Carotenoids Extraction Time from Heterochlorella luteoviridis

Authors: Débora P. Jaeschke, Eduardo A. Merlo, Rosane Rech, Giovana D. Mercali, Ligia D. F. Marczak

Abstract:

Carotenoids are high value-added pigments that can be alternatively extracted from some microalgae species. However, the application of carotenoids synthetized by microalgae is still limited due to the utilization of toxic organic solvents. In this context, studies involving alternative extraction methods have been conducted with more sustainable solvents to replace and reduce the solvent volume and the extraction time. The aim of the present work was to evaluate the extraction time of carotenoids from the microalga Heterochlorella luteoviridis using moderate electric field (MEF) as a pre-treatment to the extraction. The extraction methodology consisted of a pre-treatment in the presence of MEF (180 V) and ethanol (25 %, v/v) for 10 min, followed by a diffusive step performed for 50 min using a higher ethanol concentration (75 %, v/v). The extraction experiments were conducted at 30 °C and, to keep the temperature at this value, an extraction cell with a water jacket connected to a water bath was used. Also, to enable the evaluation of the MEF effect on the extraction, control experiments were performed using the same cell and conditions without voltage application. During the extraction experiments, samples were withdrawn at 1, 5 and 10 min of the pre-treatment and at 1, 5, 30, 40 and 50 min of the diffusive step. Samples were then centrifuged, and carotenoid analyses were performed on the supernatant. Furthermore, an exhaustive extraction with ethyl acetate and methanol was performed, and the carotenoid content found in this analysis was considered as the total carotenoid content of the microalga. The results showed that the application of MEF as a pre-treatment to the extraction influenced the extraction yield and the extraction time during the diffusive step; after the MEF pre-treatment and 50 min of the diffusive step, it was possible to extract up to 60 % of the total carotenoid content. Also, the carotenoid concentrations of the extracts withdrawn at 5 and 30 min of the diffusive step did not present a statistical difference, meaning that carotenoid diffusion occurs mainly in the very beginning of the extraction. On the other hand, the results for the control experiments showed that carotenoid diffusion occurs mostly during the first 30 min of the diffusive step, which evidenced the MEF effect on the extraction time. Moreover, the carotenoid concentrations in samples withdrawn during the pre-treatment (1, 5 and 10 min) were below the quantification limit of the analysis, indicating that the extraction occurred in the diffusive step, when ethanol (75 %, v/v) was added to the medium. It is possible that MEF promoted cell membrane permeabilization and, when ethanol (75 %) was added, carotenoids interacted with the solvent and diffusion occurred more easily. Based on the results, it is possible to infer that MEF promoted the decrease of the carotenoid extraction time due to the increased permeability of the cell membrane, which facilitates diffusion from the cell to the medium.

Keywords: moderate electric field (MEF), pigments, microalgae, ethanol

Procedia PDF Downloads 446
238 Analysis of Brownfield Soil Contamination Using Local Government Planning Data

Authors: Emma E. Hellawell, Susan J. Hughes

Abstract:

Brownfield sites are currently being redeveloped for residential use. Information on soil contamination on these former industrial sites is collected as part of the planning process by the local government. This research project analyses this untapped resource of environmental data, using site investigation data submitted to a local Borough Council in Surrey, UK. Over 150 site investigation reports were collected and interrogated to extract relevant information. This study involved three phases. Phase 1 was the development of a database for soil contamination information from local government reports. This database contained information on the source, history, and quality of the data together with the chemical information on the soil that was sampled. Phase 2 involved obtaining site investigation reports for development within the study area and extracting the required information for the database. Phase 3 was the data analysis and interpretation of key contaminants to evaluate typical levels of contaminants, their distribution within the study area, and to relate these results to current guideline levels of risk for future site users. Preliminary results for a pilot study using a sample of the dataset have been obtained. This pilot study showed there is some inconsistency in the quality of the reports and measured data, and careful interpretation of the data is required. Analysis of the information has found high levels of lead in shallow soil samples, with mean and median levels exceeding the current guidance for residential use. The data also showed elevated (but below guidance) levels of potentially carcinogenic polyaromatic hydrocarbons. Of particular concern from the data was the high detection rate for asbestos fibers. These were found at low concentrations in 25% of the soil samples tested (however, the sample set was small). Contamination levels of the remaining chemicals tested were all below the guidance level for residential site use. These preliminary pilot study results will be expanded, and results for the whole local government area will be presented at the conference. The pilot study has demonstrated the potential for this extensive dataset to provide greater information on local contamination levels. This can help inform regulators and developers and lead to more targeted site investigations, improved risk assessments, and better-informed brownfield development.
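A minimal sketch of the Phase 3 comparison against residential guideline levels, assuming pandas; the column names and guideline numbers are illustrative placeholders, not the actual UK screening values used in the study:

```python
import pandas as pd

# Hypothetical Phase 1/2 database export: one row per soil sample.
samples = pd.read_csv("site_investigation_db.csv")

# Placeholder guideline values (mg/kg) for residential use; NOT the actual
# UK screening values applied in the study.
guidelines = {"lead": 200.0, "benzo_a_pyrene": 5.0}

for chem, limit in guidelines.items():
    conc = samples[chem].dropna()
    print(chem,
          f"mean={conc.mean():.1f}",
          f"median={conc.median():.1f}",
          f"exceedance_rate={(conc > limit).mean():.1%}")
```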

Keywords: Brownfield development, contaminated land, local government planning data, site investigation

Procedia PDF Downloads 131
237 A Multi-Criteria Decision Making Approach for Disassembly-To-Order Systems under Uncertainty

Authors: Ammar Y. Alqahtani

Abstract:

In order to minimize the negative impact on the environment, it is essential to properly manage the waste generated from the premature disposal of end-of-life (EOL) products. Consequently, governments and international organizations have introduced new policies and regulations to minimize the amount of waste being sent to landfills. Moreover, consumers' environmental awareness has forced original equipment manufacturers to consider being more environmentally conscious. Therefore, manufacturers have thought of different ways to deal with waste generated from EOL products, viz., remanufacturing, reusing, recycling, or disposing of EOL products. Manufacturers can reduce the rate of depletion of virgin natural resources and their dependency on them when EOL products are remanufactured, reused, or recycled; this also cuts the amount of harmful waste sent to landfills. Disposal of EOL products, however, contributes to the problem and is therefore used as a last option. The number of EOL products needs to be estimated in order to fulfill the component demand. Then, a disassembly process needs to be performed to extract individual components and subassemblies. Smart products are built with embedded sensors and network connectivity that enable the collection and exchange of data; these sensors are implanted into products during production. The sensors allow remanufacturers to predict an optimal warranty policy and time period that should be offered to customers who purchase remanufactured components and products. Sensor-provided data can help to evaluate the overall condition of a product, as well as the remaining lives of product components, prior to performing a disassembly process. In this paper, a multi-period disassembly-to-order (DTO) model is developed that takes into consideration the different system uncertainties. The DTO model is solved using Nonlinear Programming (NLP) over multiple periods. A DTO system is considered where a variety of EOL products are purchased for disassembly. The model's main objective is to determine the best combination of EOL products to be purchased from every supplier in each period that maximizes the total profit of the system while satisfying the demand. This paper also addresses the impact of sensor-embedded products on the cost of warranties. Lastly, it presents and analyzes a case study involving various simulation conditions to illustrate the applicability of the model.
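A highly simplified, single-period, deterministic sketch of a DTO formulation is given below for illustration; the notation is introduced here, and the paper's actual model is multi-period, stochastic, and nonlinear:

\[
\max_{x_{sp}\,\ge\,0}\;\; \sum_{c} r_c\, d_c \;-\; \sum_{s}\sum_{p} \left(a_{sp} + t_p\right) x_{sp}
\qquad \text{s.t.}\quad \sum_{s}\sum_{p} y_{pc}\, x_{sp} \;\ge\; d_c \quad \forall c,
\]

where \(x_{sp}\) is the number of EOL products of type \(p\) purchased from supplier \(s\), \(a_{sp}\) the acquisition cost, \(t_p\) the disassembly cost, \(y_{pc}\) the expected yield of component \(c\) from product \(p\), \(d_c\) the component demand, and \(r_c\) the unit revenue.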

Keywords: closed-loop supply chains, environmentally conscious manufacturing, product recovery, reverse logistics

Procedia PDF Downloads 128
236 Weaving Social Development: An Exploratory Study of Adapting Traditional Textiles Using Indigenous Organic Wool for the Modern Interior Textiles Market

Authors: Seema Singh, Puja Anand, Alok Bhasin

Abstract:

The interior design profession aims to create aesthetically pleasing design solutions for human habitats but of late, growing awareness about depleting environmental resources, both tangible and intangible, and damages to the eco-system led to the quest for creating healthy and sustainable interior environments. The paper proposes adapting traditionally produced organic wool textiles for the mainstream interior design industry. This can create sustainable livelihoods whereby eco-friendly bridges can be built between Interior designers and consumers and pastoral communities. This study focuses on traditional textiles produced by two pastoral communities from India that use organic wool from indigenous sheep varieties. The Gaddi communities of Himachal Pradesh use wool from the Gaddi sheep breed to create Pattu (a multi-purpose textile). The Kurumas of Telangana weave a blanket called the Gongadi, using wool from the Black Deccani variety of sheep. These communities have traditionally reared indigenous sheep breeds for their wool and produce hand-spun and hand-woven textiles for their own consumption, using traditional processes that are chemical free. Based on data collected personally from field visits and documentation of traditional crafts of these pastoral communities, and using traditionally produced indigenous organic wool, the authors have developed innovative textile samples by including design interventions and exploring dyeing and weaving techniques. As part of the secondary research, the role of pastoralism in sustaining the eco-systems of Himachal Pradesh and Telangana was studied, and also the role of organic wool in creating healthy interior environments. The authors found that natural wool from indigenous sheep breeds can be used to create interior textiles that have the potential to be marketed to an urban audience, and this will help create earnings for pastoral communities. Literature studies have shown that organic & sustainable wool can reduce indoor pollution & toxicity levels in interiors and further help in creating healthier interior environments. Revival of indigenous breeds of sheep can further help in rejuvenating dying crafts, and promotion of these indigenous textiles can help in sustaining traditional eco-systems and the pastoral communities whose way of life is endangered today. Based on research and findings, the authors propose that adapting traditional textiles can have potential for application in Interiors, creating eco-friendly spaces. Interior textiles produced through such sustainable processes can help reduce indoor pollution, give livelihood opportunities to traditional economies, and leave almost zero carbon foot-print while being in sync with available natural resources, hence ultimately benefiting the society. The win-win situation for all the stakeholders in this eco-friendly model makes it pertinent to re-think how we design lifestyle textiles for interiors. This study illustrates a specific example from the two pastoral communities and can be used as a model that can work equally well in any community, regardless of geography.

Keywords: design intervention, eco-friendly, healthy interiors, indigenous, organic wool, pastoralism, sustainability

Procedia PDF Downloads 152
235 High Altitude Glacier Surface Mapping in Dhauliganga Basin of Himalayan Environment Using Remote Sensing Technique

Authors: Aayushi Pandey, Manoj Kumar Pandey, Ashutosh Tiwari, Kireet Kumar

Abstract:

Glaciers play an important role in climate change and are sensitive indicators of the global climate change scenario. Glaciers in the Himalayas are unique as they are predominantly valley type and are located in tropical, high-altitude regions. These glaciers are often covered with debris, which greatly affects the ablation rate of glaciers and works as a sensitive indicator of glacier health. The aim of this study is to map a high-altitude glacier surface, with a focus on glacial lake and debris estimation, using different techniques in the Nagling glacier of the Dhauliganga basin in the Himalayan region. Different image classification techniques, i.e., thresholding on different band ratios and supervised classification using the maximum likelihood classifier (MLC), have been used on high-resolution Sentinel-2A Level-1C satellite imagery of 14 October 2017. Here, the Near Infrared (NIR)/Shortwave Infrared (SWIR) ratio image was used to extract the glaciated classes (snow, ice, ice-mixed debris) from the other non-glaciated terrain classes. The SWIR/Blue ratio image was used to map valley rock and debris, while the Green/NIR ratio image was found most suitable for mapping the glacial lake. Accuracy assessment was performed using high-resolution (3 m) PlanetScope imagery and 60 stratified random points. The overall accuracy of MLC was 85%, while the accuracy of the band ratios was 96.66%. According to the band ratio technique, the total areal extent of glaciated classes (snow, ice, IMD) in the Nagling glacier was 10.70 km², nearly 38.07% of the study area, comprising 30.87% snow-covered area, 3.93% ice, and 3.27% IMD-covered area. Non-glaciated classes (vegetation, glacial lake, debris and valley rock) covered 61.93% of the total area, of which valley rock is dominant with 33.83% coverage, followed by debris covering 27.7% of the area in the Nagling glacier. The glacial lake and debris were accurately mapped using the band ratio technique. Hence, the band ratio approach appears to be useful for the mapping of debris-covered glaciers in the Himalayan region.
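A minimal sketch of the band-ratio thresholding step, assuming rasterio and NumPy on Sentinel-2 bands (NIR = B08, SWIR = B11); the file names and the threshold value are illustrative assumptions, not the values tuned to the 14 October 2017 scene:

```python
import numpy as np
import rasterio

def read_band(path):
    """Read the first band of a raster as float32."""
    with rasterio.open(path) as src:
        return src.read(1).astype("float32")

# Sentinel-2 band files (hypothetical paths); B11 is assumed to have been
# resampled to the 10 m grid of B08 beforehand.
nir = read_band("B08_10m.jp2")    # near infrared
swir = read_band("B11_10m.jp2")   # shortwave infrared

# NIR/SWIR ratio highlights snow and ice against debris and valley rock.
ratio = nir / np.maximum(swir, 1e-6)
glaciated = ratio > 2.0            # placeholder threshold, tuned per scene
print("glaciated fraction of the scene:", glaciated.mean())
```

The SWIR/Blue and Green/NIR ratios used for valley rock, debris, and the glacial lake would be thresholded in the same way on their respective band pairs.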

Keywords: band ratio, Dhauliganga basin, glacier mapping, Himalayan region, maximum likelihood classifier (MLC), Sentinel-2 satellite image

Procedia PDF Downloads 216
234 Impact Of Anthropogenic Pressures On The Water Quality Of Hammams In The Municipality Of Dar Bouazza, Morocco

Authors: Nihad Chakri, Btissam El Amrani, Faouzi Berrada, Halima Jounaid, Fouad Amraoui

Abstract:

Public baths or hammams play an essential role in the Moroccan urban and peri-urban fabric, constituting part of the cultural heritage. Urbanization in Morocco has led to a significant increase in the number of these traditional hammams: between 6,000 and 15,000 units (to be updated) operate with a traditional heating system. Numerous studies on energy consumption indicate that a hammam consumes between 60 and 120 m³ of water and one to two tons of wood per day. On average, one ton of wood costs 650 Moroccan dirhams (approximately 60 Euros), resulting in a daily fuel cost of around 1300 Moroccan dirhams (about 120 Euros). These high consumption levels result in significant environmental nuisances generated by: (i) wastewater, which, in the case of hammams located on the outskirts of Casablanca such as our study area, the Municipality of Dar Bouazza, is mostly discharged directly into the receiving environment without prior treatment because the hammams are not connected to the sanitation network; and (ii) emissions of black smoke and ashes produced by the often incomplete combustion of wood. Reducing the liquid and gas emissions generated by these hammams thus poses an environmental and sustainable development challenge that needs to be addressed. In this context, we initiated the Eco-hammam project with the objective of implementing innovative and locally adapted solutions to limit the negative impacts of hammams on the environment and reduce water and wood energy consumption. This involves treating and reusing wastewater through a compact system with heat recovery and using alternative energy sources to increase and enhance the energy efficiency of these traditional hammams. To achieve this, on-site surveys of hammams in the Dar Bouazza Municipality were conducted, and statistical approaches were applied to the results of the physico-chemical and bacteriological characterization of the incoming and outgoing water from these units. This allowed us to establish an environmental diagnosis of these entities. In conclusion, the analysis of the well water used by Dar Bouazza's hammams revealed the presence of certain parameters that could be hazardous to public health, such as total germs, total coliforms, sulfite-reducing spores, chromium, nickel, and nitrates. Therefore, this work primarily focuses on prospecting upstream of our study area to verify whether other sources of pollution influence the quality of the well water.

Keywords: public baths, hammams, cultural heritage, urbanization, water consumption, wood consumption, environmental nuisances, wastewater, environmental challenge, sustainable development, Eco-hammam project, innovative solutions, local adaptation, negative impacts, water conservation, wastewater treatment, heat recovery, alternative energy sources, on-site surveys, Dar Bouazza Municipality, statistical approaches, physico-chemical characterization, bacteriological characterization, environmental diagnosis, well water analysis, public health, pollution sources, well water quality

Procedia PDF Downloads 59
233 Evaluation of the CRISP-DM Business Understanding Step: An Approach for Assessing the Predictive Power of Regression versus Classification for the Quality Prediction of Hydraulic Test Results

Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter

Abstract:

Digitalisation in production technology is a driver for the application of machine learning methods. Through the application of predictive quality, the great potential for reducing the required quality control effort can be exploited through the data-based prediction of product quality and states. However, the serial use of machine learning applications is often prevented by various problems. Fluctuations occur in real production data sets, which are reflected in trends and systematic shifts over time. To counteract these problems, data preprocessing includes rule-based data cleaning, the application of dimensionality reduction techniques, and the identification of comparable data subsets to extract stable features. Successful process control of the target variables aims to centre the measured values around a mean and minimise variance. Competitive leaders claim to have mastered their processes. As a result, much of the real data has a relatively low variance. For the training of prediction models, the highest possible generalisability is required, which is at least made more difficult by this data availability. The implementation of a machine learning application can be interpreted as a production process. The CRoss Industry Standard Process for Data Mining (CRISP-DM) is a process model with six phases that describes the life cycle of data science. As in any process, the costs to eliminate errors increase significantly with each advancing process phase. For the quality prediction of hydraulic test steps of directional control valves, the question arises in the initial phase whether a regression or a classification is more suitable. In the context of this work, the initial phase of the CRISP-DM, the business understanding, is critically compared for the use case at Bosch Rexroth with regard to regression and classification. The use of cross-process production data along the value chain of hydraulic valves is a promising approach to predict the quality characteristics of workpieces. Suitable methods for leakage volume flow regression and for the classification of the inspection decision are applied. Impressively, classification is clearly superior to regression and achieves promising accuracies.
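A minimal sketch of the business-understanding comparison, assuming scikit-learn: predicting the leakage volume flow directly (regression) versus predicting the final inspection decision (classification) on the same features. The file and column names are hypothetical; this is not the Bosch Rexroth data set:

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestRegressor, RandomForestClassifier
from sklearn.metrics import r2_score, accuracy_score

# Hypothetical test-bench export: numeric features plus two possible targets.
df = pd.read_csv("hydraulic_test_steps.csv")
X = df.drop(columns=["leakage_flow", "passed_inspection"])
y_reg, y_clf = df["leakage_flow"], df["passed_inspection"]

X_tr, X_te, yr_tr, yr_te, yc_tr, yc_te = train_test_split(
    X, y_reg, y_clf, test_size=0.2, random_state=0)

# Same feature set, two framings of the business question.
reg = RandomForestRegressor(random_state=0).fit(X_tr, yr_tr)
clf = RandomForestClassifier(random_state=0).fit(X_tr, yc_tr)

print("regression R^2:", r2_score(yr_te, reg.predict(X_te)))
print("classification accuracy:", accuracy_score(yc_te, clf.predict(X_te)))
```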

Keywords: classification, CRISP-DM, machine learning, predictive quality, regression

Procedia PDF Downloads 133
232 Leadership and Corporate Social Responsibility: The Role of Spiritual Intelligence

Authors: Meghan E. Murray, Carri R. Tolmie

Abstract:

This study aims to identify potential factors and widely applicable best practices that can contribute to improving corporate social responsibility (CSR) and corporate performance for firms by exploring the relationship between transformational leadership, spiritual intelligence, and emotional intelligence. Corporate social responsibility is when companies are cognizant of the impact of their actions on the economy, their communities, the environment, and the world as a whole while executing business practices accordingly. The prevalence of CSR has continuously strengthened over the past few years and is now a common practice in the business world, with such efforts coinciding with what stakeholders and the public now expect from corporations. Because of this, it is extremely important to be able to pinpoint factors and best practices that can improve CSR within corporations. One potential factor that may lead to improved CSR is spiritual intelligence (SQ), or the ability to recognize and live with a purpose larger than oneself. Spiritual intelligence is a measurable skill, just like emotional intelligence (EQ), and can be improved through purposeful and targeted coaching. This research project consists of two studies. Study 1 is a case study comparison of a benefit corporation and a non-benefit corporation. This study will examine the role of SQ and EQ as moderators in the relationship between the transformational leadership of employees within each company and the perception of each firm’s CSR and corporate performance. Project methodology includes creating and administering a survey comprised of multiple pre-established scales on transformational leadership, spiritual intelligence, emotional intelligence, CSR, and corporate performance. Multiple regression analysis will be used to extract significant findings from the collected data. Study 2 will dive deeper into spiritual intelligence itself by analyzing pre-existing data and identifying key relationships that may provide value to companies and their stakeholders. This will be done by performing multiple regression analysis on anonymized data provided by Deep Change, a company that has created an advanced, proprietary system to measure spiritual intelligence. Based on the results of both studies, this research aims to uncover best practices, including the unique contribution of spiritual intelligence, that can be utilized by organizations to help enhance their corporate social responsibility. If it is found that high spiritual and emotional intelligence can positively impact CSR effort, then corporations will have a tangible way to enhance their CSR: providing targeted employees with training and coaching to increase their SQ and EQ.
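A minimal sketch of the Study 1 moderation test, assuming statsmodels: perceived CSR is regressed on transformational leadership with SQ and EQ entered as moderators via interaction terms. The column names are hypothetical survey scores, not the actual scales:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical survey export: one row per respondent, columns are scale scores.
survey = pd.read_csv("survey_responses.csv")

# tl * sq expands to tl + sq + tl:sq; a significant tl:sq (or tl:eq)
# coefficient indicates a moderation effect on perceived CSR.
model = smf.ols("csr ~ tl * sq + tl * eq", data=survey).fit()
print(model.summary())
```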

Keywords: corporate social responsibility, CSR, corporate performance, emotional intelligence, EQ, spiritual intelligence, SQ, transformational leadership

Procedia PDF Downloads 115
231 Biflavonoids from Selaginellaceae as Epidermal Growth Factor Receptor Inhibitors and Their Anticancer Properties

Authors: Adebisi Adunola Demehin, Wanlaya Thamnarak, Jaruwan Chatwichien, Chatchakorn Eurtivong, Kiattawee Choowongkomon, Somsak Ruchirawat, Nopporn Thasana

Abstract:

The epidermal growth factor receptor (EGFR) is a transmembrane glycoprotein involved in cellular signalling processes and, its aberrant activity is crucial in the development of many cancers such as lung cancer. Selaginellaceae are fern allies that have long been used in Chinese traditional medicine to treat various cancer types, especially lung cancer. Biflavonoids, the major secondary metabolites in Selaginellaceae, have numerous pharmacological activities, including anti-cancer and anti-inflammatory. For instance, amentoflavone induces a cytotoxic effect in the human NSCLC cell line via the inhibition of PARP-1. However, to the best of our knowledge, there are no studies on biflavonoids as EGFR inhibitors. Thus, this study aims to investigate the EGFR inhibitory activities of biflavonoids isolated from Selaginella siamensis and Selaginella bryopteris. Amentoflavone, tetrahydroamentoflavone, sciadopitysin, robustaflavone, robustaflavone-4-methylether, delicaflavone, and chrysocauloflavone were isolated from the ethyl-acetate extract of the whole plants. The structures were determined using NMR spectroscopy and mass spectrometry. In vitro study was conducted to evaluate their cytotoxicity against A549, HEPG2, and T47D human cancer cell lines using the MTT assay. In addition, a target-based assay was performed to investigate their EGFR inhibitory activity using the kinase inhibition assay. Finally, a molecular docking study was conducted to predict the binding modes of the compounds. Robustaflavone-4-methylether and delicaflavone showed the best cytotoxic activity on all the cell lines with IC50 (µM) values of 18.9 ± 2.1 and 22.7 ± 3.3 on A549, respectively. Of these biflavonoids, delicaflavone showed the most potent EGFR inhibitory activity with an 84% relative inhibition at 0.02 nM using erlotinib as a positive control. Robustaflavone-4-methylether showed a 78% inhibition at 0.15 nM. The docking scores obtained from the molecular docking study correlated with the kinase inhibition assay. Robustaflavone-4-methylether and delicaflavone had a docking score of 72.0 and 86.5, respectively. The inhibitory activity of delicaflavone seemed to be linked with the C2”=C3” and 3-O-4”’ linkage pattern. Thus, this study suggests that the structural features of these compounds could serve as a basis for developing new EGFR-TK inhibitors.

Keywords: anticancer, biflavonoids, EGFR, molecular docking, Selaginellaceae

Procedia PDF Downloads 188
230 Handling, Exporting and Archiving Automated Mineralogy Data Using TESCAN TIMA

Authors: Marek Dosbaba

Abstract:

Within the mining sector, SEM-based Automated Mineralogy (AM) has been the standard application for quickly and efficiently handling mineral processing tasks. Over the last decade, the trend has been to analyze larger numbers of samples, often with a higher level of detail. This has necessitated a shift from interactive sample analysis performed by an operator using a SEM, to an increased reliance on offline processing to analyze and report the data. In response to this trend, TESCAN TIMA Mineral Analyzer is designed to quickly create a virtual copy of the studied samples, thereby preserving all the necessary information. Depending on the selected data acquisition mode, TESCAN TIMA can perform hyperspectral mapping and save an X-ray spectrum for each pixel or segment, respectively. This approach allows the user to browse through elemental distribution maps of all elements detectable by means of energy dispersive spectroscopy. Re-evaluation of the existing data for the presence of previously unconsidered elements is possible without the need to repeat the analysis. Additional tiers of data such as a secondary electron or cathodoluminescence images can also be recorded. To take full advantage of these information-rich datasets, TIMA utilizes a new archiving tool introduced by TESCAN. The dataset size can be reduced for long-term storage and all information can be recovered on-demand in case of renewed interest. TESCAN TIMA is optimized for network storage of its datasets because of the larger data storage capacity of servers compared to local drives, which also allows multiple users to access the data remotely. This goes hand in hand with the support of remote control for the entire data acquisition process. TESCAN also brings a newly extended open-source data format that allows other applications to extract, process and report AM data. This offers the ability to link TIMA data to large databases feeding plant performance dashboards or geometallurgical models. The traditional tabular particle-by-particle or grain-by-grain export process is preserved and can be customized with scripts to include user-defined particle/grain properties.
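As a downstream illustration only, a short sketch of processing a tabular particle-by-particle export with pandas; the column names are hypothetical, since the actual TIMA export schema is configurable and may differ:

```python
import pandas as pd

# Hypothetical particle-by-particle export (one row per particle/grain).
particles = pd.read_csv("tima_particle_export.csv")

# Modal mineralogy by area percent of the dominant mineral per particle.
modal = (particles.groupby("dominant_mineral")["area_um2"]
         .sum()
         .pipe(lambda s: 100 * s / s.sum())
         .sort_values(ascending=False))
print(modal.head(10))
```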

Keywords: Tescan, electron microscopy, mineralogy, SEM, automated mineralogy, database, TESCAN TIMA, open format, archiving, big data

Procedia PDF Downloads 101
229 DenseNet and Autoencoder Architecture for COVID-19 Chest X-Ray Image Classification and Improved U-Net Lung X-Ray Segmentation

Authors: Jonathan Gong

Abstract:

Purpose: AI-driven solutions are at the forefront of many pathology and medical imaging methods. Using algorithms designed to better the experience of medical professionals within their respective fields, the efficiency and accuracy of diagnosis can improve. In particular, X-rays are a fast and relatively inexpensive test that can diagnose diseases. In recent years, X-rays have not been widely used to detect and diagnose COVID-19. The underuse of X-rays is mainly due to the low diagnostic accuracy and confounding with pneumonia, another respiratory disease. However, research in this field has expressed a possibility that artificial neural networks can successfully diagnose COVID-19 with high accuracy. Models and Data: The dataset used is the COVID-19 Radiography Database. This dataset includes images and masks of chest X-rays under the labels of COVID-19, normal, and pneumonia. The classification model developed uses an autoencoder and a pre-trained convolutional neural network (DenseNet201) to provide transfer learning to the model. The model then uses a deep neural network to finalize the feature extraction and predict the diagnosis for the input image. This model was trained on 4035 images and validated on 807 images separate from the ones used for training. The images used to train the classification model include an important feature: the pictures are cropped beforehand to eliminate distractions when training the model. The image segmentation model uses an improved U-Net architecture. This model is used to extract the lung mask from the chest X-ray image. The model is trained on 8577 images and validated on a validation split of 20%. These models are then evaluated using the external dataset for validation. The models' accuracy, precision, recall, F1-score, IoU, and loss are calculated. Results: The classification model achieved an accuracy of 97.65% and a loss of 0.1234 when differentiating COVID-19-infected, pneumonia-infected, and normal lung X-rays. The segmentation model achieved an accuracy of 97.31% and an IoU of 0.928. Conclusion: The models proposed can detect COVID-19, pneumonia, and normal lungs with high accuracy and derive the lung mask from a chest X-ray with similarly high accuracy. The hope is for these models to elevate the experience of medical professionals and provide insight into the future of the methods used.
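A minimal sketch of the transfer-learning classifier, assuming TensorFlow/Keras: a frozen DenseNet201 backbone with a small dense head for the three classes. The autoencoder stage and the improved U-Net segmentation model described above are omitted, and the hyperparameters are illustrative:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

# Pre-trained DenseNet201 backbone used as a fixed feature extractor.
backbone = tf.keras.applications.DenseNet201(
    include_top=False, weights="imagenet", input_shape=(224, 224, 3))
backbone.trainable = False   # transfer learning: keep ImageNet features frozen

model = models.Sequential([
    backbone,
    layers.GlobalAveragePooling2D(),
    layers.Dense(256, activation="relu"),
    layers.Dropout(0.3),
    layers.Dense(3, activation="softmax"),   # COVID-19 / pneumonia / normal
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=20)
```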

Keywords: artificial intelligence, convolutional neural networks, deep learning, image processing, machine learning

Procedia PDF Downloads 118
228 Revealing the Intersections: Theater, Mythology, and Cross-Cultural Psychology in Creative Expression

Authors: Nadia K. Thalji

Abstract:

In the timeless tapestry of human culture, theater, mythology, and psychology intersect to weave narratives that transcend temporal and spatial boundaries. For millennia, actors have stood as guardians of intuitive wisdom, their craft serving as a conduit for the collective unconscious. This paper embarks on a journey through the realms of creative expression, melding the insights of cross-cultural psychology with the mystical allure of serendipity and synchronicity. At the nexus of these disciplines lies the enigmatic process of active imagination, a gateway to the depths of the psyche elucidated by Jung. Within the hallowed confines of the black box theater at the Department of Performing Arts, UFRGS University in Brazil, this study unfolds. Over the span of four months, a cadre of artists embarked on a voyage of exploration, harnessing the powers of imagery, movement, sound, and dreams to birth a performance that resonated with the echoes of ancient wisdom. Drawing inspiration from the fabled Oracle of Delphi and the priestesses who once dwelled within its sacred precincts, the production delves into the liminal spaces where myth and history intertwine. Through the alchemy of storytelling, participants navigate the labyrinthine corridors of cultural memory, unraveling the threads that bind the past to the present. Central to this endeavor is the phenomenon of synchronicity, wherein seemingly disparate elements coalesce in a dance of cosmic resonance. Serendipity becomes a guiding force, leading actors and audience alike along unexpected pathways of discovery. As the boundaries between performer and spectator blur, the performance becomes a crucible wherein individual narratives merge to form a collective tapestry of shared experience. Yet, beneath the surface of spectacle lies a deeper truth: the exploration of the spiritual dimensions of artistic expression. Through intuitive inquiry and embodied practice, artists tap into reservoirs of insight that transcend rational comprehension. In the communion of minds and bodies, the stage becomes a sacred space wherein the numinous unfolds in all its ineffable glory. In essence, this paper serves as a testament to the transformative power of the creative act. Across cultures and epochs, the theater has served as a crucible wherein humanity grapples with the mysteries of existence. Through the lens of cross-cultural psychology, we glimpse the universal truths that underlie the myriad manifestations of human creativity. As we navigate the turbulent currents of modernity, the wisdom of the ancients beckons us to heed the call of the collective unconscious. In the synthesis of myth and meaning, we find solace amidst the chaos, forging connections that transcend the boundaries of time and space. And in the sacred precincts of the theater, we discover the eternal truth that art is, and always shall be, the soul's journey into the unknown.

Keywords: theater, mythology, cross-cultural, synchronicity, creativity, serendipity, spiritual

Procedia PDF Downloads 51
227 Using Hemicellulosic Liquor from Sugarcane Bagasse to Produce Second Generation Lactic Acid

Authors: Regiane A. Oliveira, Carlos E. Vaz Rossell, Rubens Maciel Filho

Abstract:

Lactic acid, besides being a valuable chemical, may be considered a platform for other chemicals. In fact, the feasibility of hemicellulosic sugars as a feedstock for the lactic acid production process may remove some of the barriers to second-generation bioproducts, especially bearing in mind the 5-carbon sugars from the pre-treatment of sugarcane bagasse. With this in mind, the purpose of this study was to use the hemicellulosic liquor from sugarcane bagasse as a substrate to produce lactic acid by fermentation. To release the sugars from hemicellulose, a pre-treatment with dilute sulfuric acid was performed in order to obtain a xylose-rich liquor with a low concentration of fermentation-inhibiting compounds (≈ 67% xylose, ≈ 21% glucose, ≈ 10% cellobiose and arabinose, and around 1% of inhibiting compounds such as furfural, hydroxymethylfurfural and acetic acid). The hemicellulosic sugars associated with 20 g/L of yeast extract were used in a fermentation process with Lactobacillus plantarum to produce lactic acid. The fermentation pH was controlled with automatic injection of Ca(OH)2 to keep the pH at 6.00. The lactic acid concentration remained stable from the time when the glucose was depleted (48 hours of fermentation), with no further production. While lactic acid is produced, the concomitant consumption of xylose and glucose occurs. The yield of the fermentation was 0.933 g lactic acid/g sugars. Besides, no by-products were detected, which suggests that the microorganism uses homolactic fermentation to produce its own energy through the pentose-phosphate pathway. Through facultative heterofermentative metabolism, bacteria such as L. plantarum consume pentoses, but the energy efficiency for the cell is lower than during hexose consumption. This implies both slower cell growth and a reduction in lactic acid productivity compared with the use of hexoses. Also, L. plantarum was shown to have the capacity for lactic acid production from hemicellulosic hydrolysate without detoxification, which is very attractive in terms of robustness for an industrial process. Xylose from hydrolyzed bagasse without detoxification is consumed, although the hydrolysate inhibitors (especially aromatic inhibitors) affect the productivity and yield of lactic acid. The use of these sugars and the lack of need for detoxification of the C5 liquor from hydrolyzed sugarcane bagasse are crucial factors for the economic viability of second-generation processes. Taking this information into account, the production of second-generation lactic acid using sugars from hemicellulose appears to be a good option for the complete utilization of the sugarcane plant, directing molasses and cellulosic carbohydrates to produce 2G-ethanol, and hemicellulosic carbohydrates to produce 2G-lactic acid.

Keywords: fermentation, lactic acid, hemicellulosic sugars, sugarcane

Procedia PDF Downloads 361
226 Unlocking Health Insights: Studying Data for Better Care

Authors: Valentina Marutyan

Abstract:

Healthcare data mining is a rapidly developing field at the intersection of technology and medicine that has the potential to change our understanding of and approach to providing healthcare. Healthcare data mining is the process of examining huge amounts of data to extract useful information that can be applied in order to improve patient care, treatment effectiveness, and overall healthcare delivery. This field looks for patterns, trends, and correlations in a variety of healthcare datasets, such as electronic health records (EHRs), medical imaging, patient demographics, and treatment histories. To accomplish this, it uses advanced analytical approaches. Predictive analysis using historical patient data is a major area of interest in healthcare data mining. This enables doctors to get involved early to prevent problems or improve results for patients. It also assists in early disease detection and customized treatment planning for every person. Doctors can customize a patient's care by looking at their medical history, genetic profile, and current and previous therapies. In this way, treatments can be more effective and have fewer negative consequences. Moreover, beyond helping patients, it improves the efficiency of hospitals. It helps them determine the number of beds or doctors they require in regard to the number of patients they expect. This project uses models such as logistic regression, random forests, and neural networks for predicting diseases and analyzing medical images. Patients were grouped by clustering algorithms such as k-means, and connections between treatments and patient responses were identified by association rule mining. Time series techniques helped in resource management by predicting patient admissions. These methods improved healthcare decision-making and personalized treatment. Healthcare data mining must also deal with difficulties such as poor data quality, privacy challenges, managing large and complicated datasets, ensuring the reliability of models, managing biases, limited data sharing, and regulatory compliance. Finally, data mining in healthcare helps medical professionals and hospitals make better decisions, treat patients more effectively, and work more efficiently. It ultimately comes down to using data to improve treatment, make better choices, and simplify hospital operations for all patients.
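A minimal sketch of two of the steps named above, assuming scikit-learn: a random-forest disease predictor and k-means patient grouping. The EHR file and feature names are hypothetical, and the data is assumed to be numeric and pre-cleaned:

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import roc_auc_score

# Hypothetical EHR extract: one row per patient.
ehr = pd.read_csv("ehr_extract.csv")
features = ehr[["age", "bmi", "systolic_bp", "hba1c"]]
target = ehr["diagnosed"]

# Supervised prediction of a diagnosis label.
X_tr, X_te, y_tr, y_te = train_test_split(features, target, test_size=0.2,
                                          random_state=0, stratify=target)
clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))

# Unsupervised grouping of patients into 4 clusters on standardized features.
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(features))
print(ehr.assign(cluster=clusters).groupby("cluster")["diagnosed"].mean())
```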

Keywords: data mining, healthcare, big data, large amounts of data

Procedia PDF Downloads 59
225 Analyzing the Commentator Network Within the French YouTube Environment

Authors: Kurt Maxwell Kusterer, Sylvain Mignot, Annick Vignes

Abstract:

To the best of our knowledge, YouTube is the largest video hosting platform in the world. A high number of creators, viewers, subscribers and commentators act in this specific ecosystem, which generates huge sums of money. Views, subscribers, and comments help to increase the popularity of content creators. The most popular creators are sponsored by brands and participate in marketing campaigns. For a few of them, this becomes a financially rewarding profession. This is made possible through the YouTube Partner Program, which shares revenue among creators based on their popularity. We believe that the role of comments in increasing popularity deserves to be emphasized. In what follows, YouTube is considered as a bipartite network between the videos and the commentators. Analyzing a detailed data set focused on French YouTubers, we consider each comment as a link between a commentator and a video. Our research question asks which predominant features of a video give it the highest probability of being commented on. Following on from this question, how can we use these features to predict the action of an agent in commenting on one video instead of another, considering the characteristics of the commentators, videos, topics, channels, and recommendations. We expect to see that the videos of more popular channels generate higher viewer engagement and thus are more frequently commented on. The interest lies in discovering features which have not classically been considered as markers for popularity on the platform. A quick view of our data set shows that 96% of the commentators comment only once on a given video. Thus, we study a non-weighted bipartite network between commentators and videos built on the sub-sample of 96% of unique comments. A link exists between two nodes when a commentator makes a comment on a video. We run an Exponential Random Graph Model (ERGM) approach to evaluate which characteristics influence the probability of commenting on a video. The creation of a link will be explained in terms of common video features, such as duration, quality, number of likes, number of views, etc. Our data is relevant for the period 2020-2021 and focuses on the French YouTube environment. From this set of 391 588 videos, we extract the channels which can be monetized according to YouTube regulations (channels with at least 1000 subscribers and more than 4000 hours of viewing time during the last twelve months). In the end, we have a data set of 128 462 videos spanning 4093 channels. Based on these videos, we have a data set of 1 032 771 unique commentators, with a mean of 2 comments per commentator, a minimum of 1 comment each, and a maximum of 584 comments.
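A minimal sketch of the bipartite commentator-video network built from the unique-comment subsample, assuming pandas and NetworkX; file and column names are hypothetical, and the ERGM estimation itself is not shown here:

```python
import pandas as pd
import networkx as nx

# Hypothetical file: one row per (commentator, video) unique comment.
comments = pd.read_csv("unique_comments.csv")

G = nx.Graph()
# Commentator and video identifiers are assumed to be disjoint label sets.
G.add_nodes_from(comments["commentator_id"].unique(), bipartite="commentator")
G.add_nodes_from(comments["video_id"].unique(), bipartite="video")
G.add_edges_from(zip(comments["commentator_id"], comments["video_id"]))

# Comments received per video; video covariates (duration, likes, views, ...)
# would enter the ERGM as node and edge attributes.
videos = [n for n, d in G.nodes(data=True) if d["bipartite"] == "video"]
video_degree = pd.Series({v: G.degree(v) for v in videos})
print(video_degree.describe())
```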

Keywords: YouTube, social networks, economics, consumer behaviour

Procedia PDF Downloads 59
224 A Prospective Neurosurgical Registry Evaluating the Clinical Care of Traumatic Brain Injury Patients Presenting to Mulago National Referral Hospital in Uganda

Authors: Benjamin J. Kuo, Silvia D. Vaca, Joao Ricardo Nickenig Vissoci, Catherine A. Staton, Linda Xu, Michael Muhumuza, Hussein Ssenyonjo, John Mukasa, Joel Kiryabwire, Lydia Nanjula, Christine Muhumuza, Henry E. Rice, Gerald A. Grant, Michael M. Haglund

Abstract:

Background: Traumatic Brain Injury (TBI) is disproportionately concentrated in low- and middle-income countries (LMICs), with the odds of dying from TBI in Uganda more than 4 times higher than in high-income countries (HICs). The disparities in injury incidence and outcome between LMICs and resource-rich settings have led to increased health outcomes research on TBIs and their associated risk factors in LMICs. While there has been an increasing number of TBI studies in LMICs over the last decade, there is still a need for more robust prospective registries. In Uganda, a trauma registry implemented in 2004 at the Mulago National Referral Hospital (MNRH) showed that road traffic injury (RTI) is the major contributor (60%) to overall mortality in the casualty department. While the prior registry provides information on injury incidence and burden, it is limited in scope and does not follow patients longitudinally throughout their hospital stay, nor does it focus specifically on TBIs. Although these retrospective analyses are helpful for benchmarking TBI outcomes, they make it hard to identify specific quality improvement initiatives. The relationships among epidemiology, patient risk factors, clinical care, and TBI outcomes are still relatively unknown at MNRH. Objective: The objectives of this study are to describe the processes of care and determine risk factors predictive of poor outcomes for TBI patients presenting to a single tertiary hospital in Uganda. Methods: Prospective data were collected for 563 TBI patients presenting to a tertiary hospital in Kampala from 1 June – 30 November 2016. Research Electronic Data Capture (REDCap) was used to systematically collect variables spanning 8 categories. Univariate and multivariate analyses were conducted to determine significant predictors of mortality. Results: 563 TBI patients were enrolled from 1 June – 30 November 2016. 102 patients (18%) received surgery, 29 patients (5.1%) intended for surgery failed to receive it, and 251 patients (45%) received non-operative management. Overall mortality was 9.6%, which ranged from 4.7% for mild and moderate TBI to 55% for severe TBI patients with GCS 3-5. Within each TBI severity category, mortality differed by management pathway. Variables predictive of mortality were TBI severity, more than one intracranial bleed, failure to receive surgery, high dependency unit admission, ventilator support outside of surgery, and hospital arrival delayed by more than 4 hours. Conclusions: The overall mortality rate of 9.6% in Uganda for TBI is high and likely underestimates the true TBI mortality. Furthermore, the wide-ranging mortality (3-82%), high ICU fatality, and negative impact of care delays suggest shortcomings with the current triaging practices. Lack of surgical intervention when needed was highly predictive of mortality in TBI patients. Further research into the determinants of surgical interventions, quality of step-up care, and prolonged care delays is needed to better understand the complex interplay of variables that affect patient outcomes. These insights guide the development of future interventions and resource allocation to improve patient outcomes.
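A minimal sketch of the multivariate step, assuming statsmodels: a logistic regression of in-hospital mortality on the predictors reported above. The variable names are hypothetical stand-ins for the REDCap fields:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical registry export: one row per enrolled TBI patient.
tbi = pd.read_csv("tbi_registry.csv")

# Binary outcome regressed on severity category and care-pathway variables.
model = smf.logit(
    "died ~ C(severity) + multiple_bleeds + surgery_received "
    "+ icu_admission + arrival_delay_over_4h",
    data=tbi).fit()
print(model.summary())   # coefficients are log-odds; exponentiate for ORs
```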

Keywords: care continuum, global neurosurgery, Kampala Uganda, LMIC, Mulago, prospective registry, traumatic brain injury

Procedia PDF Downloads 219
223 The Lifecycle of a Heritage Language: A Comparative Case Study of Volga German Descendants in North America

Authors: Ashleigh Dawn Moeller

Abstract:

This is a comparative case study which examines the language attitudes and behaviors of descendants of Volga German immigrants in North America and how these attitudes, combined with surrounding social conditions, have caused their heritage language to develop differently within each community. Of particular interest for this study are the accounts of second- and third-generation descendants in Oregon, Kansas, and North Dakota regarding their parents' and grandparents' attitudes toward their language and how this correlates with the current sentiment as well as visibility of their heritage language and culture. This study discusses the point at which cultural identity could diverge from language identity and what elements play a role in this development, establishing the potential for environments (linguistic landscapes) which uphold their heritage yet have detached from the language itself. Emigrating from Germany in the 1700s, these families settled for over a hundred years along the Volga Region of Imperial Russia. Subsequently, many descendants of these settlers immigrated to the Americas in the 1800-1900s. Identifying neither as German nor Russian, they called themselves Wolgadeutsche (Volga Germans). During their time in Russia, the German language was maintained relatively homogenously, yet the use and status of their heritage language diverged considerably upon settlement across the Americas. Data shows that specific conditions, such as community isolation, size, religion, and location, as well as language policy established prior to and following the Volga German immigration to North America, have had a substantial impact on the maintenance of their heritage language, causing complete loss in some areas and peripheral use or even full rebirth in others. These past conditions combined with the family accounts correlate directly with the general attitudes and ideologies of the descendants toward their heritage language. Data also shows that in many locations, despite a strong presence of German within the linguistic landscape, minimal to no German is spoken or understood; the attitude toward the language is indifferent while a staunch holding to the heritage is maintained and boasted. Data for this study was gathered from historical accounts, archived records and newspapers, and published biographies as well as from formal interviews with second- and third-generation descendants of Volga German immigrants conducted in Oregon and Kansas. Through the interviews, members of the community have shared and provided their family genealogies as well as biographies published by family members. These have helped to trace their relatives back to specific locations, thus allowing for comparisons within the same families residing in distinctly different areas of North America. This study is part of a larger ongoing project which researches the immigration of Volga and Black Sea Germans to North America and diachronically examines the over-arching sociological factors which have directly impacted the maintenance, loss, or rebirth of their heritage language. This project follows specific families who settled in areas of Colorado, Kansas, Nebraska, Illinois, Minnesota, North and South Dakota, Saskatchewan, and Manitoba, and who later had relatives move west to areas of Oregon and Washington State. Interviews for the larger project will continue into the following year.

Keywords: heritage language, immigrant language, language change, language contact, linguistic landscape, Volga Germans, Wolgadeutsche

Procedia PDF Downloads 113
222 From Biowaste to Biobased Products: Life Cycle Assessment of VALUEWASTE Solution

Authors: Andrés Lara Guillén, José M. Soriano Disla, Gemma Castejón Martínez, David Fernández-Gutiérrez

Abstract:

The world population is increasing exponentially, driving rising demand for food, energy, and non-renewable resources. These demands must be addressed from a circular economy point of view. Under this approach, obtaining strategic products from biowaste is crucial if society is to keep its current lifestyle while reducing the environmental and social issues linked to the linear economy. This is the main objective of the VALUEWASTE project. VALUEWASTE valorizes urban biowaste into proteins for food and feed and into biofertilizers, closing the loop of this waste stream. To achieve this objective, the project validates three value chains, each beginning with the anaerobic digestion of the biowaste. From the anaerobic digestion, three by-products are obtained: i) methane, which is used by microorganisms that are then transformed into microbial proteins; ii) digestate, which is used by the black soldier fly to produce insect proteins; and iii) a nutrient-rich effluent, which is transformed into biofertilizers. VALUEWASTE is an innovative solution that combines different technologies to valorize the biowaste entirely. However, it must also be demonstrated that the solution is greener than traditional technologies (baseline systems). On the one hand, the proteins from microorganisms and insects are compared with reference protein production systems (gluten, whey, and soybean). On the other hand, the biofertilizers are compared to the production of mineral fertilizers (ammonium sulphate and synthetic struvite). Therefore, the aim of this study is to demonstrate that biowaste valorization can reduce the environmental impacts linked to both traditional protein manufacturing processes and mineral fertilizers, not only at pilot scale but also at industrial scale. In the present study, both the baseline systems and the VALUEWASTE solution are evaluated through an Environmental Life Cycle Assessment (E-LCA). The E-LCA is based on the ISO 14040 and 14044 standards, and the Environmental Footprint methodology was used to evaluate the environmental impacts. The results for the baseline cases show that food proteins from whey have the highest environmental impact on ecosystems compared with the other protein sources: 7.5 and 15.9 times higher than soybean and gluten, respectively. Comparing feed soybean and gluten, soybean has an environmental impact on human health 195.1 times higher. In the case of the fertilizers, synthetic struvite has higher impacts than ammonium sulphate: 15.3 times (ecosystems) and 11.8 times (human health), respectively. The results shown in the present study will be used as a reference to demonstrate the better environmental performance of the bio-based products obtained through the VALUEWASTE solution. The E-LCA performed in the VALUEWASTE project also has direct implications for investment and policy. On the one hand, better environmental performance, backed by the E-LCA, will help remove the barriers linked to these kinds of technologies and boost investment. On the other hand, it will serve as a seed for designing new policies that foster these types of solutions in order to achieve two key targets of the European Community: becoming self-sustainable and carbon neutral.
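The fold comparisons above reduce to ratios of characterized impact scores per functional unit. As a minimal illustration only, the Python sketch below computes such fold differences from hypothetical endpoint scores; the numbers, names, and units are placeholders chosen only so the ratios match those quoted in the abstract, not values from the VALUEWASTE E-LCA.

```python
# Minimal sketch (not the project's actual LCA model): given characterized
# impact scores per functional unit, compute the fold differences used to
# compare baseline systems. The scores below are placeholders chosen only so
# the ratios match those quoted in the abstract.

def fold_difference(impact_a: float, impact_b: float) -> float:
    """How many times larger impact_a is than impact_b."""
    return impact_a / impact_b

# Hypothetical ecosystem-damage scores per kg of protein (placeholder units).
baseline_ecosystems = {"whey": 7.95, "soybean": 1.06, "gluten": 0.50}

for ref in ("soybean", "gluten"):
    ratio = fold_difference(baseline_ecosystems["whey"], baseline_ecosystems[ref])
    print(f"whey vs {ref}: {ratio:.1f}-fold higher ecosystem impact")
```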

Keywords: anaerobic digestion, biofertilizers, circular economy, nutrients recovery

Procedia PDF Downloads 83
221 Sensitivity Analysis of the Heat Exchanger Design in Net Power Oxy-Combustion Cycle for Carbon Capture

Authors: Hirbod Varasteh, Hamidreza Gohari Darabkhani

Abstract:

Global warming and its impact on climate change is one of the main challenges of the current century. Global warming is mainly due to the emission of greenhouse gases (GHG), and carbon dioxide (CO2) is known to be the major contributor to the GHG emission profile. Whilst the energy sector is the primary source of CO2 emissions, Carbon Capture and Storage (CCS) is believed to be the solution for controlling these emissions. Oxyfuel combustion (oxy-combustion) is one of the major technologies for capturing CO2 from power plants. For gas turbines, several oxy-combustion power cycles (oxyturbine cycles) have been investigated by means of thermodynamic analysis. The NetPower cycle is one of the leading oxyturbine power cycles, with almost full carbon capture capability from a natural gas fired power plant. In this manuscript, a sensitivity analysis of the heat exchanger design in the NetPower cycle is completed by means of process modelling. Heat capacity variation and supercritical CO2 with gaseous admixtures are considered in a multi-zone analysis with Aspen Plus software. It is found that the heat exchanger design plays a major role in increasing the efficiency of the NetPower cycle. A pinch-point analysis is performed to extract the composite and grand composite curves for the heat exchanger. In this paper, the relationship between the cycle efficiency and the minimum approach temperature (∆Tmin) of the heat exchanger is also evaluated. An increase in ∆Tmin causes a decrease in the temperature of the recycled flue gases (RFG) and an overall decrease in the required power for the recycled gas compressor. The main challenge in the design of heat exchangers in power plants is the trade-off between capital and operational costs. To achieve a lower ∆Tmin, a larger heat exchanger is required; this means a higher capital cost but better heat recovery and a lower operational cost. To balance these, ∆Tmin is selected at the minimum point of the combined capital and operational cost curves. This study provides an insight into the NetPower oxy-combustion cycle's performance analysis and operational conditions based on its heat exchanger design.
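To make the ∆Tmin trade-off concrete, the sketch below assumes illustrative cost curves (capital cost roughly inversely proportional to ∆Tmin, operating cost roughly proportional to it) and picks the approach temperature that minimizes their sum. The cost models are assumptions for illustration only, not the authors' Aspen Plus results.

```python
import numpy as np

# Illustrative Delta-T_min trade-off: smaller approach temperatures need more
# heat-transfer area (higher capital cost) but recover more heat (lower
# operating cost). Both cost curves below are placeholder assumptions.

dT_min = np.linspace(2.0, 30.0, 200)       # candidate approach temperatures, K

capital_cost = 5.0e5 / dT_min              # annualized capital cost, $/yr (assumed ~ 1/Delta-T)
operating_cost = 8.0e3 * dT_min            # lost heat recovery -> extra compressor duty, $/yr (assumed)
total_cost = capital_cost + operating_cost

best = dT_min[np.argmin(total_cost)]
print(f"Cost-optimal Delta-T_min for these assumed curves: {best:.1f} K")
```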

Keywords: carbon capture and storage, oxy-combustion, netpower cycle, oxy turbine cycles, zero emission, heat exchanger design, supercritical carbon dioxide, oxy-fuel power plant, pinch point analysis

Procedia PDF Downloads 195
220 Conjugated Linoleic Acid Effect on Body Weight and Body Composition in Women: Systematic Review and Meta-Analysis

Authors: Hanady Hamdallah, H. Elyse Ireland, John H. H. Williams

Abstract:

Conjugated linoleic acid (CLA) is a food supplement reported to have multiple beneficial health effects, including anti-carcinogenic, anti-inflammatory, and anti-obesity activity. Animal studies have shown a significant anti-obesity effect of CLA, but results in humans have been inconsistent: some studies found an anti-obesity effect, while others failed to find any decline in obesity markers after CLA supplementation. This meta-analysis aimed to determine whether oral CLA supplementation reduces obesity-related markers in women. PubMed, the Cochrane Library, and Google Scholar were used to identify eligible trials using two main search strategies: the first searched for eligible trials using the keywords 'conjugated linoleic acid', 'CLA', and 'women', and the second extracted eligible trials from previously published systematic reviews and meta-analyses. The eligible trials were placebo-controlled trials in which women were supplemented with a CLA mixture in the form of oral capsules for 6 months or less. These trials also provided information about body composition expressed as body weight (BW), body mass index (BMI), total body fat (TBF), percentage body fat (BF%), and/or lean body mass (LBM). The quality of each included study was assessed using both the Jadad scale and an adapted CONSORT checklist. A meta-analysis of 8 eligible trials showed that CLA supplementation was significantly associated with reduced BW (mean ± SD, 1.2 ± 0.26 kg, p < 0.001), BMI (0.6 ± 0.13 kg/m², p < 0.001), and TBF (0.76 ± 0.26 kg, p = 0.003) in women when supplemented over 6-16 weeks. A subgroup meta-analysis demonstrated a significant reduction in BW (1.29 ± 0.31 kg, p < 0.001), BMI (0.60 ± 0.14 kg/m², p < 0.001), and TBF (0.82 ± 0.28 kg, p = 0.003) in the trials that had recruited overweight or obese women. A second subgroup meta-analysis, which considered the menopausal status of the participants, found that CLA was significantly associated with reduced BW (1.35 ± 0.37 kg, p < 0.001; 1.05 ± 0.36 kg, p = 0.003) and BMI (0.50 ± 0.17 kg/m², p = 0.003; 0.75 ± 0.2 kg/m², p < 0.001) in both pre- and post-menopausal women, respectively. A reduction in TBF (1.09 ± 0.37 kg, p = 0.003) was only significant in post-menopausal women. Interestingly, CLA supplementation was associated with a significant reduction in BW (1.05 ± 0.35 kg, p < 0.003), BMI (0.73 ± 0.2 kg/m², p < 0.001), and TBF (1.07 ± 0.36 kg, p = 0.003) in the trials without lifestyle monitoring or interventions. No significant effect of CLA on LBM was detected in this meta-analysis. This meta-analysis suggests a moderate anti-obesity effect of CLA on BW, BMI, and TBF in women when supplemented over 6-16 weeks, particularly in overweight or obese women and post-menopausal women. However, this finding requires careful interpretation due to several issues in the designs of the available CLA supplementation trials. More well-designed trials are required to confirm these results.
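For readers unfamiliar with how such pooled estimates are formed, the sketch below shows a fixed-effect inverse-variance pooling of per-trial mean differences. The trial values are hypothetical placeholders, and the actual review may have used a different model (for example, random effects).

```python
import math

# Fixed-effect inverse-variance meta-analysis sketch. The per-trial mean
# differences (kg) and standard errors are hypothetical placeholders, not the
# data extracted in this review.
trials = [  # (mean difference in body weight, standard error)
    (-1.4, 0.6),
    (-0.9, 0.5),
    (-1.6, 0.8),
]

weights = [1.0 / se**2 for _, se in trials]                 # weight = 1 / variance
pooled = sum(w * md for (md, _), w in zip(trials, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))
ci_low, ci_high = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se

print(f"Pooled mean difference: {pooled:.2f} kg (95% CI {ci_low:.2f} to {ci_high:.2f})")
```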

Keywords: body composition, body mass index, body weight, conjugated linoleic acid

Procedia PDF Downloads 278
219 Carlos Guillermo 'Cubena' Wilson's Literary Texts as Platforms for Social Commentary and Critique of Panamanian Society

Authors: Laverne Seales

Abstract:

When most people think of Panama, they immediately think of the Canal; however, the construction and the people who made it possible are often omitted and seldom acknowledged. The reality is that the construction of this waterway was achieved through forced migration and discriminatory practices toward people of African descent, specifically black people from the Caribbean. From the colonial period to the opening and subsequent operation of the Panama Canal by the United States, this paper works through the rich layers of Panamanian history to examine the lives of Afro-Caribbeans and their descendants in Panama. It also considers the role of the United States in Panama, exploring how the United States forged a racially complex country that made the integration of Afro-Caribbeans and their descendants difficult. After laying this historical foundation, Afro-Caribbean people and Panamanians of Afro-Caribbean descent are analyzed through Afro-Panamanian writer Carlos Guillermo 'Cubena' Wilson's novels, short stories, and poetry. This study focuses on how Cubena addresses racism, discrimination, inequality, and social justice issues concerning the Afro-Caribbeans and their descendants who traveled to Panama to construct the Canal. Content analysis methodology can yield several significant contributions, and analyzing Carlos Guillermo Wilson's literature under this framework allows us to consider his social commentary and critique of Panamanian society. It identifies the social issues and concerns of Afro-Caribbeans and people of Afro-Caribbean descent, such as inequality, corruption, racism, political oppression, and cultural identity. The methodology also allows us to explore how Cubena's literature engages with questions of cultural identity and belonging in Panamanian society. By examining themes related to race, ethnicity, language, and heritage, this research uncovers the complexities of Panamanian cultural identity and interrogates power dynamics and social hierarchies in Panamanian society. Analyzing the portrayal of different social groups, institutions, and power structures helps uncover how power is wielded, contested, and resisted; Cubena's fictional world allows us to see how it functions in Panama. Content analysis methodology also enables a critique of political systems and governance in Panama. By examining the representation of political figures, institutions, and events in Cubena's literature, we uncover his commentary on corruption, authoritarianism, governance, and the role of the United States in Panama. Content analysis further highlights how Wilson's literature amplifies the voices and experiences of marginalized individuals and communities in Panamanian society. By centering the narratives of Afro-Panamanians and other marginalized groups, this research uncovers Cubena's commitment to social justice and inclusion in his writing and helps the reader engage with historical narratives and collective memory in Panama. Overall, analyzing Carlos Guillermo 'Cubena' Wilson's literature as a platform for social commentary and critique of Panamanian society using content analysis methodology provides valuable insights into the cultural, social, and political dimensions of Afro-Panamanian life during and after the construction of the Panama Canal.

Keywords: Afro-Caribbean, Panama Canal, race, Afro-Panamanian, identity, history

Procedia PDF Downloads 27
218 Design and Test a Robust Bearing-Only Target Motion Analysis Algorithm Based on Modified Gain Extended Kalman Filter

Authors: Mohammad Tarek Al Muallim, Ozhan Duzenli, Ceyhun Ilguy

Abstract:

Passive sonar is a method for detecting acoustic signals in the ocean; it detects the acoustic signals emanating from external sources. With passive sonar, only the bearing of the target can be determined, with no information about its range. Target Motion Analysis (TMA) is the process of estimating the position and speed of a target using passive sonar information. Since bearing is the only available information, the technique is called bearing-only TMA. Many TMA techniques have been developed; however, until now there has been no fully effective method that can always track an unknown target and extract its trajectory. In this work, an effective bearing-only TMA algorithm is designed. The measured bearing angles are very noisy, and for a multi-beam sonar the measurements are quantized due to the sonar beam width. To deal with this, a modified gain extended Kalman filter algorithm is used. The algorithm is fine-tuned, and many modules are added to improve its performance. A special validation gate module is used to ensure the stability of the algorithm. Several indicators of performance and confidence level are designed and tested. A new method to detect whether the target is maneuvering is proposed. Moreover, a reactive optimal observer maneuver based on bearing measurements is proposed, which ensures convergence to the correct solution in all cases. To test the performance of the proposed TMA algorithm, a simulation is carried out with a MATLAB program. The simulator models a discrete scenario for an observer and a target, taking into consideration all the practical aspects of the problem, such as smooth speed transitions, circular turns of the ship, noisy measurements, and quantized bearing measurements from a multi-beam sonar. The tests cover a large set of scenarios. For all the tests, full tracking is achieved within 10 minutes with very little error: the range estimation error was less than 5%, the speed error less than 5%, and the heading error less than 2 degrees. The online performance estimator is mostly aligned with the real performance, and the range estimation confidence level reaches 90% when the range error is less than 10%. The experiments show that the proposed TMA algorithm is very robust and has a low estimation error; however, the convergence time of the algorithm still needs to be improved.
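As a rough illustration of the underlying estimator (a plain extended Kalman filter rather than the authors' modified-gain variant with validation gating and maneuver detection), the sketch below performs one predict-update step for a constant-velocity target observed only through a noisy bearing. The geometry, noise levels, and initial state are assumptions.

```python
import numpy as np

# Plain EKF sketch for bearings-only tracking with a constant-velocity target
# model. This is *not* the authors' modified-gain EKF; noise levels, observer
# position, and the prior state are assumed values.

dt = 1.0
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)        # constant-velocity transition
Q = np.diag([1e-3, 1e-3, 1e-4, 1e-4])            # process noise (assumed)
R = np.deg2rad(1.0) ** 2                         # bearing noise variance (assumed)

def wrap(a):
    return (a + np.pi) % (2 * np.pi) - np.pi     # keep innovation in [-pi, pi)

def ekf_step(x, P, z, obs_pos):
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Measurement model: bearing from observer to target
    dx, dy = x[0] - obs_pos[0], x[1] - obs_pos[1]
    r2 = dx * dx + dy * dy
    z_pred = np.arctan2(dy, dx)
    H = np.array([[-dy / r2, dx / r2, 0.0, 0.0]])  # Jacobian of arctan2
    # Update
    S = H @ P @ H.T + R
    K = P @ H.T / S
    x = x + (K * wrap(z - z_pred)).ravel()
    P = (np.eye(4) - K @ H) @ P
    return x, P

# One step with assumed values: prior state [x, y, vx, vy] in meters and m/s,
# a measured bearing of 30 degrees, and the observer at the origin.
x0 = np.array([5000.0, 3000.0, -5.0, 2.0])
P0 = np.diag([1e6, 1e6, 25.0, 25.0])
x1, P1 = ekf_step(x0, P0, np.deg2rad(30.0), obs_pos=(0.0, 0.0))
print(x1)
```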

Keywords: target motion analysis, Kalman filter, passive sonar, bearing-only tracking

Procedia PDF Downloads 386
217 The Extraction of Sage Essential Oil and the Improvement of Sleeping Quality for Female Menopause by Sage Essential Oil

Authors: Bei Shan Lin, Tzu Yu Huang, Ya Ping Chen, Chun Mel Lu

Abstract:

This research is divided into two parts. The first part adopts supercritical carbon dioxide fluid extraction to obtain sage essential oil (Salvia officinalis), examines the differences when the procedure is run under different pressure conditions, and probes the composition of the extracted oil. The second part examines the effect of aromatherapy with the extracted sage essential oil on sleep quality for women in menopause. The extracted sage substance was tested for DPPH radical scavenging to determine its antioxidant capacity, and its components were analyzed by gas chromatography-mass spectrometry. The two pressure conditions gave different results: at 3000 psi, the extract had an IC50 of 180.94 mg/L, indicating stronger antioxidant activity than the IC50 of 657.43 mg/L obtained at 1800 psi, and the extraction yield was 1.05%, higher than the 0.68% obtained at 1800 psi. The experimental data also show that the extract obtained at 3000 psi contains more compounds than the one obtained at 1800 psi. The main compounds common to both are cyclic ethers, flavonoids, and terpenes. Cyclic ethers and flavonoids have soothing and calming effects and can be applied to relieve cramps and ease menopausal disorders. The second part of the research applies the extracted sage essential oil in aromatherapy for women in menopause and discusses the resulting improvement in sleep quality. This research adopts a Swedish upper-back massage approach, evaluates sleep quality with the Pittsburgh Sleep Quality Index, and records changes with a heart rate variability apparatus. In the experimental group, the extracted sage essential oil was included in the aromatherapy. The heart rate variability measurements showed better results in SDNN, low-frequency, and high-frequency power than in the control group. According to the statistical analysis of the Pittsburgh Sleep Quality Index, the intervention improved sleep quality. This indicates that the extracted sage essential oil has a significant effect on increasing parasympathetic activity and is able to improve sleep quality for women in menopause.
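The DPPH result can be illustrated with a small calculation: percentage inhibition is derived from control and sample absorbances, and the IC50 is the concentration giving 50% inhibition. The sketch below uses hypothetical absorbance readings and log-linear interpolation; it is not the study's assay data or exact procedure.

```python
import numpy as np

# Sketch of DPPH inhibition and IC50 estimation by log-linear interpolation.
# The absorbance readings and concentrations below are hypothetical, not the
# study's measurements.
conc = np.array([50, 100, 200, 400, 800])          # extract concentration, mg/L
abs_control = 0.90                                  # DPPH absorbance without extract
abs_sample = np.array([0.78, 0.66, 0.47, 0.28, 0.14])

inhibition = (abs_control - abs_sample) / abs_control * 100.0  # percent inhibition

# Interpolate the concentration giving 50% inhibition on a log scale.
ic50 = np.exp(np.interp(50.0, inhibition, np.log(conc)))
print(f"Estimated IC50: {ic50:.1f} mg/L")
```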

Keywords: supercritical carbon dioxide fluid extraction, Salvia officinalis, aromatherapy, Swedish massage, Pittsburgh sleep quality index, heart rate variability, parasympathetic nerves

Procedia PDF Downloads 109
216 The Use of Optical-Radar Remotely-Sensed Data for Characterizing Geomorphic, Structural and Hydrologic Features and Modeling Groundwater Prospective Zones in Arid Zones

Authors: Mohamed Abdelkareem

Abstract:

Remote sensing data contribute to predicting prospective areas of water resources. Microwave and multispectral data are integrated here with climatic, hydrologic, and geological data. In this article, Sentinel-2, Landsat-8 Operational Land Imager (OLI), Shuttle Radar Topography Mission (SRTM), Tropical Rainfall Measuring Mission (TRMM), and Advanced Land Observing Satellite (ALOS) Phased Array Type L-band Synthetic Aperture Radar (PALSAR) data were utilized to identify the geological, hydrologic, and structural features of Wadi Asyuti, a defunct tributary of the Nile basin in the eastern Sahara. Image transformation of the Sentinel-2 and Landsat-8 data allowed the different rock units to be characterized. Integration of microwave remotely sensed data and GIS techniques provided information on the physical characteristics of catchments and rainfall zones that play a crucial role in mapping groundwater prospective zones. Fusing Landsat-8 OLI and ALOS/PALSAR data enhanced structural elements that are difficult to reveal using optical data alone. Lineament extraction and interpretation indicated that the area is clearly shaped by a NE-SW graben cut by a NW-SE trend. Such structures allowed the accumulation of thick sediments in the downstream area. Processing of recent OLI data acquired on March 15, 2014, verified the flood potential maps, offered the opportunity to extract the extent of the flooding zone of the recent flash flood event (March 9, 2014), and revealed infiltration characteristics. Several layers, including geology, slope, topography, drainage density, lineament density, soil characteristics, rainfall, and morphometric characteristics, were combined after assigning a weight to each using a GIS-based knowledge-driven approach. The results revealed that the predicted groundwater potential zones (GPZs) can be arranged into six distinctive groups according to their probability for groundwater, namely very low, low, moderate, high, very high, and excellent. Field and well data validated the delineated zones.
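The knowledge-driven combination of thematic layers amounts to a weighted overlay followed by classification. The sketch below illustrates the idea with random stand-in rasters, assumed weights, and quantile class breaks; it does not reproduce the study's actual weights or data.

```python
import numpy as np

# Minimal weighted-overlay sketch for groundwater potential mapping. The layer
# rasters, weights, and class breaks are assumptions used to illustrate the
# knowledge-driven combination, not the weights used in the study.
rng = np.random.default_rng(0)
shape = (100, 100)

# Each thematic layer is assumed to be already rescaled to a 0-1 suitability score.
layers = {
    "geology":           rng.random(shape),
    "slope":             rng.random(shape),
    "drainage_density":  rng.random(shape),
    "lineament_density": rng.random(shape),
    "rainfall":          rng.random(shape),
}
weights = {"geology": 0.30, "slope": 0.15, "drainage_density": 0.20,
           "lineament_density": 0.20, "rainfall": 0.15}

gpz_index = sum(weights[name] * raster for name, raster in layers.items())

# Classify into the six classes named in the abstract using quantile breaks.
labels = ["very low", "low", "moderate", "high", "very high", "excellent"]
classes = np.digitize(gpz_index, np.quantile(gpz_index, [1/6, 2/6, 3/6, 4/6, 5/6]))
print({labels[i]: int((classes == i).sum()) for i in range(6)})
```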

Keywords: GIS, remote sensing, groundwater, Egypt

Procedia PDF Downloads 87
215 Satellite Interferometric Investigations of Subsidence Events Associated with Groundwater Extraction in Sao Paulo, Brazil

Authors: B. Mendonça, D. Sandwell

Abstract:

The Metropolitan Region of Sao Paulo (MRSP) has suffered from serious water scarcity. Consequently, the most convenient solution has been drilling wells to extract groundwater from local aquifers. However, this requires constant vigilance to prevent over-extraction and future events that can pose a serious threat to the population, such as subsidence. Radar interferometry techniques (InSAR) allow continuous investigation of such phenomena. The data analyzed in the present study consist of 23 SAR images acquired by the ALOS-1 spacecraft between October 2007 and March 2011. Data processing was carried out with the GMTSAR software, using the InSAR technique to create pairs of interferograms covering ground displacement over different time spans. First results show a correlation between the locations of 102 wells registered in 2009 and signals of ground displacement equal to or lower than -90 millimeters (mm) in the region. The longest time-span interferogram obtained covers October 2007 to March 2010. From that interferogram, it was possible to estimate the average displacement velocity in millimeters per year (mm/y) and to identify the areas where strong signals have persisted in the MRSP. Four specific areas with subsidence signals of 28 mm/y to 40 mm/y were chosen to investigate the phenomenon: Guarulhos (Sao Paulo International Airport), the Greater Sao Paulo, Itaquera, and Sao Caetano do Sul. The signals extend over areas between 0.6 km and 1.65 km in length, all located above a sedimentary aquifer. Itaquera and Sao Caetano do Sul showed signals varying from 28 mm/y to 32 mm/y. On the other hand, the places most likely to be suffering from stronger subsidence are those in the Greater Sao Paulo and Guarulhos, right beside the International Airport of Sao Paulo, where the displacement rate ranges from 35 mm/y to 40 mm/y. Previous investigations of water use at the International Airport highlight the risks of the excessive water extraction that was being carried out through 9 deep wells. Therefore, subsidence events are likely to occur and to cause serious damage in the area. This study exposes a situation that has not been given proper attention in the city, despite its social and economic consequences. Since the data were only available until 2011, the question that remains is whether the situation still persists. A risk scenario at the International Airport of Sao Paulo can, however, be reaffirmed and requires further investigation.
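For context on how displacement rates are derived from interferograms, the sketch below converts unwrapped phase to line-of-sight displacement using the nominal ALOS-1 PALSAR L-band wavelength (about 23.6 cm) and divides by the time span. The sign convention and the sample phase values are assumptions, and GMTSAR handles these steps internally.

```python
import numpy as np

# Sketch of converting unwrapped interferometric phase to line-of-sight (LOS)
# displacement and an annual rate. The wavelength is the nominal ALOS-1 PALSAR
# L-band value; the sign convention and sample phase values are assumptions.
WAVELENGTH_M = 0.236                        # ~23.6 cm, L-band

def phase_to_los_mm(unwrapped_phase_rad):
    """LOS displacement in mm; the sign convention here is assumed."""
    return (WAVELENGTH_M / (4.0 * np.pi)) * unwrapped_phase_rad * 1000.0

phase = np.array([0.0, -2.5, -5.1])         # hypothetical unwrapped phase, radians
span_years = 2.4                            # e.g., October 2007 to March 2010

los_mm = phase_to_los_mm(phase)
velocity_mm_per_yr = los_mm / span_years
print(velocity_mm_per_yr)
```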

Keywords: ground subsidence, Interferometric Satellite Aperture Radar (InSAR), metropolitan region of Sao Paulo, water extraction

Procedia PDF Downloads 343
214 Clustering Ethno-Informatics of Naming Village in Java Island Using Data Mining

Authors: Atje Setiawan Abdullah, Budi Nurani Ruchjana, I. Gede Nyoman Mindra Jaya, Eddy Hermawan

Abstract:

Ethnoscience examines culture from a scientific perspective, which may help to understand how people develop various forms of knowledge and belief, with an initial focus on ecology and history. One of the areas studied within ethnoscience is ethno-informatics, the application of informatics to culture. In this study, the informatics method used is data mining, a process for automatically extracting knowledge from large databases in order to obtain interesting patterns. The cultural application is described by a database of village names on the island of Java obtained from the Indonesian Geospatial Information Agency (BIG) in 2014. The purpose of this study is, first, to classify the village names on the island of Java based on their word structure, including prefixes, syllables, and complete words; second, to classify the meanings of village names into specific categories and relate them to community behavioral characteristics; and third, to visualize the village names on a map in order to see the similarity of village naming in each province. In this research, we developed two theorems: an area theorem, derived from the collected intersections of village names in each province on the island of Java, and a wedge (set composition) theorem over the provinces of Java, used to examine the peculiarities of a study location. The methodology is based on the Knowledge Discovery in Databases (KDD) process of data mining, which includes preprocessing, data mining, and post-processing. The results show that the Javanese community prioritizes virtue in daily life, always works hard to achieve a more prosperous life, and values love as well as water and environmental sustainability. Village naming in adjacent provinces shows a high degree of similarity, and neighboring provinces influence each other. Cultural similarity is high among Central Java, East Java, and West Java-Banten, whereas Jakarta-Yogyakarta shows low similarity. This research characterizes the cultural character of communities as reflected in the meanings of village names on the island of Java; this character is expected to serve as a guide for people's daily behavior on the island of Java.
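As a simplified illustration of the word-structure clustering step (not the authors' exact KDD pipeline), the sketch below clusters a handful of sample Javanese-style village names by character n-gram similarity with K-means; the sample names and the number of clusters are assumptions.

```python
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer

# Sketch of clustering village names by shared prefixes/syllables using
# character n-gram features and K-means. The sample names and the number of
# clusters are illustrative, not the study's dataset or parameters.
villages = ["Sukamaju", "Sukasari", "Sukajadi", "Karanganyar",
            "Karangasem", "Banyuwangi", "Banyumas", "Sidomulyo", "Sidoarjo"]

vectorizer = TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4))
features = vectorizer.fit_transform(villages)

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(features)
for name, label in zip(villages, kmeans.labels_):
    print(label, name)
```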

Keywords: ethnoscience, ethno-informatics, data mining, clustering, Java island culture

Procedia PDF Downloads 268
213 An EEG-Based Scale for Comatose Patients' Vigilance State

Authors: Bechir Hbibi, Lamine Mili

Abstract:

Understanding the condition of comatose patients can be difficult, but it is crucial to their optimal treatment. Consequently, numerous scoring systems have been developed around the world to categorize patient states based on physiological assessments. Although validated and widely adopted by medical communities, these scores still present numerous limitations and obstacles. Even with added tests and extensions, these scoring systems have not been able to overcome certain limitations, and it appears unlikely that they will be able to do so in the future. On the other hand, physiological tests are not the only way to gain insight into comatose patients. EEG signal analysis has contributed extensively to the understanding of the human brain and human consciousness and has been used by researchers to classify different levels of disease. The use of EEG in the ICU has become an urgent matter in several cases and has been recommended by medical organizations. In this field, the EEG is used to investigate epilepsy, dementia, brain injuries, and many other neurological disorders. It has recently also been used to detect pain activity in some regions of the brain, to detect stress levels, and to evaluate sleep quality. In our recent work, the aim was to use multifractal analysis, a very successful method for handling multifractal signals and extracting features, to establish a state-of-awareness scale for comatose patients based on their electrical brain activity. The results show that this score could be computed instantaneously and could overcome many of the limitations from which the physiological scales suffer. Multifractal analysis stands out as a highly effective tool for characterizing non-stationary and self-similar signals, and it performs strongly in extracting the properties of fractal and multifractal data, including signals and images. We therefore leverage this method, along with other features derived from EEG recordings of comatose patients, to develop a scale that aims to accurately depict the vigilance state of patients in intensive care units and to address many of the limitations inherent in physiological scales such as the Glasgow Coma Scale (GCS) and the FOUR score. The results of applying version V0 of this approach to 30 patients with known GCS showed that the EEG-based score describes the states of vigilance similarly, but also distinguishes between the states of 8 sedated patients for whom the GCS could not be applied. Therefore, our approach could show promising results for patients with disabilities, patients under painkillers, and other categories where physiological scores cannot be applied.
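To indicate the kind of feature such a scale can build on, the sketch below implements a compact multifractal detrended fluctuation analysis (MFDFA) that estimates the generalized Hurst exponent h(q) of a one-dimensional signal. The synthetic signal, scales, and q values are assumptions; the authors' scoring pipeline adds further features and validation on top of this type of analysis.

```python
import numpy as np

# Compact MFDFA sketch: estimates the generalized Hurst exponent h(q) of a
# 1-D signal. The synthetic signal, scales, and q values are assumptions, not
# the authors' EEG data or full scoring pipeline.

def mfdfa_hq(signal, scales, qs, order=1):
    profile = np.cumsum(signal - np.mean(signal))      # integrated, demeaned profile
    hq = []
    for q in qs:
        log_fq = []
        for s in scales:
            n_seg = len(profile) // s
            segs = profile[:n_seg * s].reshape(n_seg, s)
            t = np.arange(s)
            # Detrend each segment with a polynomial fit; keep residual variance.
            var = []
            for seg in segs:
                coeffs = np.polyfit(t, seg, order)
                var.append(np.mean((seg - np.polyval(coeffs, t)) ** 2))
            var = np.asarray(var)
            if q == 0:
                fq = np.exp(0.5 * np.mean(np.log(var)))
            else:
                fq = np.mean(var ** (q / 2.0)) ** (1.0 / q)
            log_fq.append(np.log(fq))
        # h(q) is the slope of log F_q(s) versus log s.
        hq.append(np.polyfit(np.log(scales), log_fq, 1)[0])
    return np.array(hq)

rng = np.random.default_rng(0)
eeg_like = np.cumsum(rng.standard_normal(4096))        # stand-in for an EEG channel
scales = np.array([16, 32, 64, 128, 256])
qs = np.array([-3, -1, 0, 1, 3], dtype=float)
hq = mfdfa_hq(eeg_like, scales, qs)
print({float(q): round(float(h), 2) for q, h in zip(qs, hq)})
```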

Keywords: coma, vigilance state, EEG, multifractal analysis, feature extraction

Procedia PDF Downloads 46