Search results for: software defined storage
6844 Enhanced Dimensional Stability of Rigid PVC Foams Using Glass Fibers
Authors: Nidal H. Abu-Zahra, Murtatha M. Jamel, Parisa Khoshnoud, Subhashini Gunashekar
Abstract:
Two types of glass fibers with different lengths (1/16" and 1/32") were added to extruded rigid polyvinyl chloride (PVC) foam at different concentrations (0-20 phr) using a single-screw profile extruder to enhance its dimensional stability. PVC foam-glass fiber composites (PVC-GF) were characterized for their dimensional stability and structural, thermal, and mechanical properties. Experimental results show that the dimensional stability, heat resistance, and storage modulus were enhanced without compromising the tensile and flexural strengths of the composites. Overall, foam composites prepared with longer glass fibers exhibit better mechanical and thermal properties than those prepared with shorter glass fibers due to higher interlocking between the fibers and the foam cells, which results in better load distribution in the matrix.
Keywords: polyvinyl chloride, PVC foam, PVC composites, polymer composites, glass fiber composites, reinforced polymers
Procedia PDF Downloads 396
6843 In situ Real-Time Multivariate Analysis of Methanolysis Monitoring of Sunflower Oil Using FTIR
Authors: Pascal Mwenge, Tumisang Seodigeng
Abstract:
The combination of world population growth and the third industrial revolution has led to a high demand for fuels. On the other hand, the decline of global fossil fuel deposits and the environmental air pollution caused by these fuels have compounded the challenges the world faces due to its need for energy. Therefore, new forms of environmentally friendly and renewable fuels such as biodiesel are needed. The primary analytical techniques for monitoring methanolysis yield have been chromatography and spectroscopy; these methods have proven reliable but are demanding, costly and do not provide real-time monitoring. In this work, the in situ monitoring of biodiesel production from sunflower oil using FTIR (Fourier Transform Infrared) spectroscopy has been studied; the study was performed using an EasyMax Mettler Toledo reactor equipped with a DiComp (diamond) probe. The quantitative monitoring of methanolysis was performed by building a quantitative model with multivariate calibration using the iC Quant module from the iC IR 7.0 software. Fifteen samples of known concentration, taken in duplicate, were used for model calibration and cross-validation; the data were pre-processed using mean centering, variance scaling, square-root spectrum math and solvent subtraction. These pre-processing steps improved the performance indexes from 7.98 to 0.0096, 11.2 to 3.41, 6.32 to 2.72 and 0.9416 to 0.9999 for RMSEC, RMSECV, RMSEP and cumulative R², respectively. The R² values of 1 (training), 0.9918 (test) and 0.9946 (cross-validation) indicated the fitness of the model. The model was tested against a univariate model; small discrepancies were observed at low concentrations due to unmodelled intermediates, but the two agreed closely at concentrations above 18%. The software eliminated the complexity of the Partial Least Squares (PLS) chemometrics. It was concluded that the model obtained could be used to monitor the methanolysis of sunflower oil at industrial and laboratory scale.
Keywords: biodiesel, calibration, chemometrics, methanolysis, multivariate analysis, transesterification, FTIR
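As a rough illustration of the multivariate calibration step described above, a PLS model relating spectra to known concentrations can be built and scored with RMSEC/RMSECV. This is a minimal sketch with synthetic spectra and hypothetical settings, not the iC Quant workflow itself:

```python
# Minimal sketch of a multivariate PLS calibration for spectroscopic yield
# monitoring. The spectra here are synthetic stand-ins; the study used the
# iC Quant module of iC IR 7.0 rather than hand-written chemometrics.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
n_samples, n_wavenumbers = 30, 200                      # 15 samples in duplicate, hypothetical spectral axis
concentration = np.repeat(np.linspace(2, 30, 15), 2)    # known methyl ester concentrations (%)
spectra = np.outer(concentration, rng.random(n_wavenumbers)) \
          + rng.normal(0, 0.05, (n_samples, n_wavenumbers))

# Pre-processing: mean centering and variance scaling, as in the reported model
X = (spectra - spectra.mean(axis=0)) / spectra.std(axis=0)
y = concentration

pls = PLSRegression(n_components=4)
pls.fit(X, y)
y_cal = pls.predict(X).ravel()
y_cv = cross_val_predict(pls, X, y, cv=5).ravel()

rmsec = np.sqrt(mean_squared_error(y, y_cal))    # calibration error
rmsecv = np.sqrt(mean_squared_error(y, y_cv))    # cross-validation error
print(f"RMSEC={rmsec:.3f}  RMSECV={rmsecv:.3f}  R2={r2_score(y, y_cal):.4f}")
```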
Procedia PDF Downloads 148
6842 Treatment of Leaden Sludge of Algiers Refinery by Electrooxidation
Authors: K. Ighilahriz, M. Taleb Ahmed, R. Maachi
Abstract:
Oil industries are responsible for most cases of contamination of our ecosystem by oil and heavy metals. These pollutants are toxic and considered carcinogenic and dangerous even when they exist in trace amounts. At the Algiers refinery, the production, transportation, and refining of crude oil generate considerable waste in storage tanks; these residues result from gravitational settling. The composition of these residues is essentially a mixture of hydrocarbons and lead. In this work, we propose the application of electrooxidation to treat the leachate of the leaden sludge. The effects of pH, current density and electrolysis time were studied, and the effectiveness of the process was evaluated by measuring the chemical oxygen demand (COD). Dissolution is the best way to mobilize pollutants from the leaden sludge, so leaching was conducted before the electrochemical treatment. The process was carried out in batch mode using a graphite anode and a stainless steel cathode. The results clearly demonstrate the compatibility of the technique with the type of pollution studied; it achieved COD removal of about 80%.
Keywords: electrooxidation, leaching, leaden sludge, oil industry
Procedia PDF Downloads 228
6841 Geospatial Technologies in Support of Civic Engagement and Cultural Heritage: Lessons Learned from Three Participatory Planning Workshops for Involving Local Communities in the Development of Sustainable Tourism Practices in Latiano, Brindisi
Authors: Mark Opmeer
Abstract:
The fruitful relationship between cultural heritage and digital technology is evident. Due to the development of user-friendly software, an increasing number of heritage scholars use ICT in their research activities. As a result, the implementation of information technology for heritage planning has become a research objective in itself. During the last decades, we have witnessed a growing debate and literature about the importance of computer technologies for the fields of cultural heritage and ecotourism. Indeed, implementing digital technology in support of these domains can be very fruitful for one's research practice. However, due to the rapid development of new software, scholars may find it challenging to use these innovations in an appropriate way. As such, this contribution seeks to explore the interplay between geospatial technologies (geo-ICT), civic engagement, cultural heritage and tourism. In this article, we discuss our findings on the use of geo-ICT in support of civic participation, cultural heritage and sustainable tourism development in the southern Italian district of Brindisi. In the city of Latiano, three workshops were organized that involved local members of the community to identify and discuss interesting points of interest (POIs) which represent the cultural significance and identity of the area. During the first workshop, a so-called mappa della comunità was created on a touch table with collaborative mapping software, which allowed the participants to highlight potential destinations for tourist purposes. Furthermore, two heritage-based itineraries along a selection of identified POIs were created to make the region attractive for recreants and tourists. These heritage-based itineraries reflect the community's ideas about the cultural identity of the region. Both trails were subsequently implemented in a dedicated mobile application (app) and evaluated with the members of the community during the second workshop using a mixed-method approach. In the final workshop, the findings of the collaboration, the heritage trails and the app were evaluated with all participants. Based on our conclusions, we argue that geospatial technologies have significant potential for involving local communities in heritage planning and tourism development. The participants of the workshops found it engaging to share their ideas and knowledge using the digital map on the touch table. Secondly, the use of a mobile application as an instrument to test the heritage-based itineraries in the field was broadly considered fun and beneficial for enhancing community awareness of and participation in local heritage. The app furthermore stimulated the community's awareness of the added value of geospatial technologies for sustainable tourism development in the area. We conclude this article with a number of recommendations in order to provide a best practice for organizing heritage workshops with similar objectives.
Keywords: civic engagement, geospatial technologies, tourism development, cultural heritage
Procedia PDF Downloads 287
6840 Improving the Quality of Transport Management Services with Fuzzy Signatures
Authors: Csaba I. Hencz, István Á. Harmati
Abstract:
Nowadays, the significance of road transport is gradually increasing. All transport companies work in the same external environment, where the speed of transport is defined by traffic rules. The main objective is to accelerate the speed of service, and this depends only on the individual abilities of the managing members. These operational control units make decisions quickly (in a typically experiential and/or intuitive way). For this reason, supporting these decisions is an important task. Our goal is to create a decision support model based on fuzzy signatures that can assist the work of operational management automatically. If the model's parameters are set properly, the management of transport could be more economical and efficient.
Keywords: freight transport, decision support, information handling, fuzzy methods
Procedia PDF Downloads 259
6839 Modelling of Damage as Hinges in Segmented Tunnels
Authors: Gelacio Juárez-Luna, Daniel Enrique González-Ramírez, Enrique Tenorio-Montero
Abstract:
Frame elements coupled with spring elements are used for modelling the development of hinges in segmented tunnels; the spring elements model the rotational, transversal and axial failure. These spring elements are equipped with constitutive models to include independently the moment, shear force and axial force, respectively. These constitutive models are formulated based on damage mechanics and experimental tests reported in the literature. The mesh of the segmented tunnels was discretized in the software GID, and the nonlinear analyses were carried out in the finite element software ANSYS. These analyses provide the capacity curves of the primary and secondary lining of a segmented tunnel. Two numerical examples of segmented tunnels show the capability of the spring elements to release energy through the development of hinges. The first example is a segmental concrete lining discretized with frame elements and loaded until hinges occurred in the lining. The second example is a tunnel with primary and secondary lining, discretized with a double-ring frame model. The outer ring simulates the segmental concrete lining and the inner ring simulates the secondary cast-in-place concrete lining. Spring elements also modelled the joints between the segments in the circumferential direction and the ring joints, which connect parallel adjacent rings. The computed load vs. displacement curves are congruent with numerical and experimental results reported in the literature. It is shown that modelling a tunnel with primary and secondary lining with frame elements and springs provides reasonable results and saves computational cost compared with 2D or 3D models equipped with smeared crack models.
Keywords: damage, hinges, lining, tunnel
Procedia PDF Downloads 390
6838 Study on the Integration Schemes and Performance Comparisons of Different Integrated Solar Combined Cycle-Direct Steam Generation Systems
Authors: Liqiang Duan, Ma Jingkai, Lv Zhipeng, Haifan Cai
Abstract:
The integrated solar combined cycle (ISCC) system has a series of advantages, such as increasing the system power generation, reducing the cost of solar power generation, and lower pollutant and CO2 emissions. In this paper, parabolic trough collectors with direct steam generation (DSG) technology are considered to replace the heat load of heating surfaces in the heat recovery steam generator (HRSG) of a conventional natural gas combined cycle (NGCC) system containing a PG9351FA gas turbine and a triple-pressure HRSG with reheat. The detailed model of the NGCC system is built in the ASPEN PLUS software, and the parabolic trough collectors with DSG technology are modeled in the EBSILON software. ISCC-DSG systems with the replacement of single, two, three and four heating surfaces are studied in this paper. Results show that: (1) the ISCC-DSG systems with the replacement heat loads of HPB, HPB+LPE, HPE2+HPB+HPS and HPE1+HPE2+HPB+HPS are the best integration schemes when single, two, three and four stages of heating surfaces are partly replaced by the parabolic trough solar collectors with DSG technology. (2) Both the changes of feed water flow and the heat load of the heating surfaces in ISCC-DSG systems with the replacement of multi-stage heating surfaces are smaller than those in ISCC-DSG systems with the replacement of a single heating surface. (3) ISCC-DSG systems with the replacement of HPB+LPE heating surfaces can increase the solar power output significantly. (4) The ISCC-DSG system with the replacement of the HPB heating surface has the highest solar-thermal-to-electricity efficiency (47.45%) and solar radiation energy-to-electricity efficiency (30.37%), as well as the highest exergy efficiency of the solar field (33.61%).
Keywords: HRSG, integration scheme, parabolic trough collectors with DSG technology, solar power generation
Procedia PDF Downloads 253
6837 A Web-Based Real Property Updating System for Efficient and Sustainable Urban Development: A Case Study in Ethiopia
Authors: Eyosiyas Aga
Abstract:
The development of information and communication technology has transformed paper-based mapping and land registration processes into computerized and networked systems. The computerization and networking of a real property information system play a vital role in good governance and the sustainable development of emerging countries through cost-effective, easy and accessible service delivery for the customer. An efficient, transparent and sustainable real property system is becoming basic infrastructure for urban development and thus improves the data management system and service delivery in organizations. In Ethiopia, real property administration is paper based; as a result, it is confronted with problems of data management, illegal transactions, corruption, and poor service delivery. In order to solve these problems and to facilitate the real property market, the implementation of a web-based real property updating system is crucial. A web-based real property updating system is one of the automation (computerization) methods to facilitate data sharing and to reduce the time and cost of service delivery in a real property administration system. In addition, it is useful for the integration of data into different information systems and organizations. The system is designed by combining and customizing open source software supported by Open Geospatial Consortium standards. The Open Geospatial Consortium standards such as the Web Feature Service and the Web Map Service are the most widely used standards to support and improve web-based real property updating. These services allow the integration of data from different sources and can be used to maintain consistency of data throughout transactions. PostgreSQL and GeoServer are used to manage and connect real property data to the flex viewer and the user interface. The system is designed for both the internal updating system (municipality), which mainly updates spatial and textual information, and the external system (customer), which focuses on providing information to and interacting with the customer. This research assessed the potential of open source web applications and adopted this technology for a real property updating system in Ethiopia in a simple, cost-effective and secure way. The existing workflow for real property updating was analyzed to identify the bottlenecks, and a new workflow was designed for the system. The requirements were identified through a questionnaire and a literature review, and the system was prototyped for the study area. The research mainly aimed to integrate human resources with technology in the design of the system to reduce data inconsistency and security problems. In addition, the research reflects on the current situation of real property administration and the contribution of an effective data management system to efficient, transparent and sustainable urban development in Ethiopia.
Keywords: cadaster, real property, sustainable, transparency, web feature service, web map service
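As an illustration of how the Web Feature Service mentioned above can serve property data from GeoServer, a client request might look like the following sketch (the host, workspace and layer name "cadastre:parcels" are hypothetical placeholders, not the system's actual configuration):

```python
# Minimal sketch of a WFS GetFeature request against a GeoServer endpoint.
import requests

params = {
    "service": "WFS",
    "version": "1.1.0",
    "request": "GetFeature",
    "typeName": "cadastre:parcels",        # hypothetical real-property layer
    "outputFormat": "application/json",    # GeoServer's GeoJSON output
    "maxFeatures": 10,
}
resp = requests.get("http://localhost:8080/geoserver/wfs", params=params, timeout=30)
resp.raise_for_status()
for feature in resp.json()["features"]:
    # each GeoJSON feature carries an id, geometry and attribute properties
    print(feature["id"], feature["properties"])
```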
Procedia PDF Downloads 267
6836 A Method for Quantitative Assessment of the Dependencies between Input Signals and Output Indicators in Production Systems
Authors: Maciej Zaręba, Sławomir Lasota
Abstract:
Knowing the degree of dependency between sets of input signals and selected sets of indicators that measure a production system's effectiveness is of great importance in industry. This paper introduces the SELM method, which enables the selection of the sets of input signals that most strongly affect a selected subset of indicators measuring the effectiveness of a production system. For a defined set of output indicators, the method quantifies the impact of the input signals that are gathered in the continuous monitoring of the production system.
Keywords: manufacturing operation management, signal relationship, continuous monitoring, production systems
Procedia PDF Downloads 119
6835 Cross-Layer Design of Event-Triggered Adaptive OFDMA Resource Allocation Protocols with Application to Vehicle Clusters
Authors: Shaban Guma, Naim Bajcinca
Abstract:
We propose an event-triggered algorithm for the solution of a distributed optimization problem by means of the projected subgradient method. Thereby, we invoke an OFDMA resource allocation scheme by applying an event-triggered sensitivity analysis at the access point. The optimal assignment of the subcarriers to the involved wireless nodes is carried out by considering the sensitivity analysis of the overall objective function, as defined by the control of vehicle clusters, with respect to the information exchange between the nodes.
Keywords: consensus, cross-layer, distributed, event-triggered, multi-vehicle, protocol, resource, OFDMA, wireless
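The projected subgradient iteration underlying such a scheme can be sketched in its basic, non-event-triggered form as follows (a generic toy objective over the unit simplex, e.g. subcarrier shares per node; the weights, step sizes and objective are illustrative and not the authors' protocol):

```python
# Basic projected subgradient iteration: minimize a convex objective over the
# probability simplex (here, fractions of a shared resource per node).
import numpy as np

def project_to_simplex(v):
    """Euclidean projection of v onto {x >= 0, sum(x) = 1}."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u + (1 - css) / (np.arange(len(v)) + 1) > 0)[0][-1]
    theta = (1 - css[rho]) / (rho + 1)
    return np.maximum(v + theta, 0)

def subgradient(x, w):
    # subgradient of f(x) = sum_i w_i * |x_i - 1/len(x)|  (toy objective)
    return w * np.sign(x - 1.0 / len(x))

w = np.array([1.0, 2.0, 0.5, 1.5])    # hypothetical per-node weights
x = np.full(4, 0.25)                  # start from a uniform allocation
for k in range(1, 201):
    x = project_to_simplex(x - (0.1 / k) * subgradient(x, w))
print(x)                              # allocation fractions after 200 steps
```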
Procedia PDF Downloads 331
6834 Methodology to Affirm Driver Engagement in Dynamic Driving Task (DDT) for a Level 2 ADAS Feature
Authors: Praneeth Puvvula
Abstract:
Autonomy has become increasingly common in modern cars. There are five levels of autonomy as defined by SAE. This paper focuses on an SAE Level 2 feature which, by definition, is able to control the vehicle longitudinally and laterally at the same time. The system keeps the vehicle centred within the lane by detecting the lane boundaries while maintaining the vehicle speed. As with features from SAE Level 1 to Level 3, the primary responsibility for the dynamic driving task lies with the driver. This requires monitoring techniques to ensure the driver is always engaged, even while the feature is active. This paper focuses on these techniques, which help ensure safe usage of the feature and provide appropriate warnings to the driver.
Keywords: autonomous driving, safety, ADAS, automotive technology
Procedia PDF Downloads 89
6833 Study of Mobile Game Addiction Using Electroencephalography Data Analysis
Authors: Arsalan Ansari, Muhammad Dawood Idrees, Maria Hafeez
Abstract:
The use of mobile phones has been increasing considerably over the past decade. Currently, it is one of the main sources of communication and information. Initially, mobile phones were limited to calls and messages, but with the advent of new technology, smartphones are being used for many other purposes, including video games. Despite positive outcomes, addiction to video games on mobile phones has become a leading cause of psychological and physiological problems among many people. Several researchers have examined different aspects of behavioral addiction with the use of different scales. The objective of this study is to examine any distinction between mobile-game-addicted and non-addicted players with the use of electroencephalography (EEG), based upon psycho-physiological indicators. The mobile players were asked to play a mobile game, and EEG signals were recorded by BIOPAC equipment with AcqKnowledge as the data acquisition software. Electrodes were placed following the 10-20 system. EEG was recorded at a sampling rate of 200 samples/sec (12,000 samples/min). EEG recordings were obtained from the frontal (Fp1, Fp2), parietal (P3, P4), and occipital (O1, O2) lobes of the brain. The frontal lobe is associated with behavioral control, personality, and emotions. The parietal lobe is involved in perception, understanding logic, and arithmetic. The occipital lobe plays a role in visual tasks. For this study, a 60-second time window was chosen for analysis. Preliminary analysis of the signals was carried out with the AcqKnowledge software of BIOPAC Systems. From the survey based on the CGS manual study 2010, it was concluded that five participants out of fifteen were in the addictive category. This was used as prior information to group the addicted and non-addicted players by physiological analysis. Statistical analysis showed that, by applying a clustering analysis technique, the authors were able to categorize the addicted and non-addicted players, specifically on the theta frequency range of the occipital area.
Keywords: mobile game, addiction, psycho-physiology, EEG analysis
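A minimal sketch of this kind of analysis, estimating theta-band power in the occipital channels and clustering players into two groups, is shown below (the signals are synthetic; only the channel layout and the 200 samples/s rate follow the description above):

```python
# Estimate theta-band (4-8 Hz) power per occipital channel and cluster players.
import numpy as np
from scipy.signal import welch
from sklearn.cluster import KMeans

fs = 200                                    # sampling rate used in the study
rng = np.random.default_rng(1)
eeg = rng.normal(0, 1, (15, 2, 60 * fs))    # 15 players, O1/O2, 60 s window (synthetic)

def theta_power(signal, fs):
    # Welch power spectral density, then integrate over the 4-8 Hz band
    f, pxx = welch(signal, fs=fs, nperseg=fs * 2)
    band = (f >= 4) & (f <= 8)
    return np.sum(pxx[band]) * (f[1] - f[0])

features = np.array([[theta_power(ch, fs) for ch in player] for player in eeg])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(labels)                               # 0/1 cluster membership per player
```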
Procedia PDF Downloads 164
6832 Slope Stability Analysis and Evaluation of Road Cut Slope in Case of Goro to Abagada Road, Adama
Authors: Ezedin Geta Seid
Abstract:
Slope failures are among the common geo-environmental natural hazards in the hilly and mountainous terrain of the world causing damages to human life and destruction of infrastructures. In Ethiopia, the demand for the construction of infrastructures, especially highways and railways, has increased to connect the developmental centers. However, the failure of roadside slopes formed due to the difficulty of geographical locations is the major difficulty for this development. As a result, a comprehensive site-specific investigation of destabilizing agents and a suitable selection of slope profiles are needed during design. Hence, this study emphasized the stability analysis and performance evaluation of slope profiles (single slope, multi-slope, and benched slope). The analysis was conducted for static and dynamic loading conditions using limit equilibrium (slide software) and finite element method (Praxis software). The analysis results in selected critical sections show that the slope is marginally stable, with FS varying from 1.2 to 1.5 in static conditions, and unstable with FS below 1 in dynamic conditions. From the comparison of analysis methods, the finite element method provides more valuable information about the failure surface of a slope than limit equilibrium analysis. Performance evaluation of geometric profiles shows that geometric modification provides better and more economical slope stability. Benching provides significant stability for cut slopes (i.e., the use of 2m and 3m bench improves the factor of safety by 7.5% and 12% from a single slope profile). The method is more effective on steep slopes. Similarly, the use of a multi-slope profile improves the stability of the slope in stratified soil with varied strength. The performance is more significant when it is used in combination with benches. The study also recommends drainage control and slope reinforcement as a remedial measure for cut slopes.Keywords: slope failure, slope profile, bench slope, multi slope
Procedia PDF Downloads 32
6831 A Robust Optimization of Chassis Durability/Comfort Compromise Using Chebyshev Polynomial Chaos Expansion Method
Authors: Hanwei Gao, Louis Jezequel, Eric Cabrol, Bernard Vitry
Abstract:
The chassis system is composed of complex elements that take up all the loads from the tire-ground contact area and thus it plays an important role in numerous specifications such as durability, comfort, crash, etc. During the development of new vehicle projects in Renault, durability validation is always the main focus while deployment of comfort comes later in the project. Therefore, sometimes design choices have to be reconsidered because of the natural incompatibility between these two specifications. Besides, robustness is also an important point of concern as it is related to manufacturing costs as well as the performance after the ageing of components like shock absorbers. In this paper an approach is proposed aiming to realize a multi-objective optimization between chassis endurance and comfort while taking the random factors into consideration. The adaptive-sparse polynomial chaos expansion method (PCE) with Chebyshev polynomial series has been applied to predict responses’ uncertainty intervals of a system according to its uncertain-but-bounded parameters. The approach can be divided into three steps. First an initial design of experiments is realized to build the response surfaces which represent statistically a black-box system. Secondly within several iterations an optimum set is proposed and validated which will form a Pareto front. At the same time the robustness of each response, served as additional objectives, is calculated from the pre-defined parameter intervals and the response surfaces obtained in the first step. Finally an inverse strategy is carried out to determine the parameters’ tolerance combination with a maximally acceptable degradation of the responses in terms of manufacturing costs. A quarter car model has been tested as an example by applying the road excitations from the actual road measurements for both endurance and comfort calculations. One indicator based on the Basquin’s law is defined to compare the global chassis durability of different parameter settings. Another indicator related to comfort is obtained from the vertical acceleration of the sprung mass. An optimum set with best robustness has been finally obtained and the reference tests prove a good robustness prediction of Chebyshev PCE method. This example demonstrates the effectiveness and reliability of the approach, in particular its ability to save computational costs for a complex system.Keywords: chassis durability, Chebyshev polynomials, multi-objective optimization, polynomial chaos expansion, ride comfort, robust design
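The core idea of building a Chebyshev surrogate over an uncertain-but-bounded parameter and reading response bounds from it can be sketched in one dimension as follows (the response function and parameter bounds are invented placeholders; the actual method is adaptive-sparse and multivariate):

```python
# One-dimensional toy version: fit a Chebyshev surrogate of a response over a
# bounded uncertain parameter and use it to bound the response.
import numpy as np
from numpy.polynomial import chebyshev as C

lo, hi = 1200.0, 1800.0                      # hypothetical damper rate bounds (N*s/m)

def response(c):
    # placeholder for an expensive simulation (e.g. sprung-mass acceleration)
    return 0.9 + 1e-4 * (c - 1500.0) + 3e-7 * (c - 1500.0) ** 2

# Sample the model at Chebyshev nodes mapped into [lo, hi] and fit a degree-6 series
nodes = np.cos(np.pi * (np.arange(20) + 0.5) / 20)          # nodes in [-1, 1]
c_samples = 0.5 * (hi + lo) + 0.5 * (hi - lo) * nodes
surrogate = C.Chebyshev.fit(c_samples, response(c_samples), deg=6, domain=[lo, hi])

grid = np.linspace(lo, hi, 1001)
vals = surrogate(grid)
print("response interval: [%.4f, %.4f]" % (vals.min(), vals.max()))
```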
Procedia PDF Downloads 152
6830 Scope of Heavy Oil as a Fuel of the Future
Authors: Kiran P. Chadayamuri, Saransh Bagdi
Abstract:
The increasing imbalance between energy supply and demand has made nations and companies involved in the energy sector boost their research and find suitable solutions. With the high rates at which conventional oil and gas resources are depleting, efficient exploration and exploitation of heavy oil could just be the answer. Heavy oil may be defined as crude oil having an API gravity of less than 20°. Heavy oils are highly viscous, have low hydrogen-to-carbon ratios and are known to produce high carbon residues. They have high contents of asphaltenes, heavy metals, sulphur and nitrogen. Due to these properties, the extraction, transportation and refining of such crude oil have their share of challenges. Lack of suitable technology has hindered its production in the past, but now things are moving in a more positive direction. The aim of this paper is to study the various advantages of heavy oil, the associated limitations and its feasibility as a fuel of the future.
Keywords: energy, heavy oil, fuel, future
Procedia PDF Downloads 286
6829 Parallel 2-Opt Local Search on GPU
Authors: Wen-Bao Qiao, Jean-Charles Créput
Abstract:
To accelerate the solution of large-scale traveling salesman problems (TSP), a parallel 2-opt local search algorithm with a simple implementation based on the Graphics Processing Unit (GPU) is presented and tested in this paper. The parallel scheme is based on a technique of data decomposition that dynamically assigns multiple K processors along the integral tour to treat K edges' 2-opt local optimization simultaneously on independent sub-tours, where K can be user-defined or have a functional relationship with the input size N. We implement this algorithm with a doubly linked list on the GPU. The implementation only requires O(N) memory. We compare this parallel 2-opt local optimization against a sequential exhaustive 2-opt search along the integral tour on TSP instances from TSPLIB with more than 10,000 cities.
Keywords: parallel 2-opt, double links, large scale TSP, GPU
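For reference, the sequential 2-opt move that the GPU scheme parallelizes over K independent sub-tours can be sketched as follows (a plain CPU version on a small random instance, not the doubly-linked-list GPU implementation):

```python
# Sequential reference version of the 2-opt move evaluated in the paper.
import numpy as np

def tour_length(tour, dist):
    return sum(dist[tour[i], tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def two_opt_once(tour, dist):
    """Try every (i, j) edge pair once; apply the first improving reversal."""
    n = len(tour)
    for i in range(n - 1):
        for j in range(i + 2, n - (i == 0)):
            a, b = tour[i], tour[i + 1]
            c, d = tour[j], tour[(j + 1) % n]
            # gain of replacing edges (a,b) and (c,d) by (a,c) and (b,d)
            if dist[a, c] + dist[b, d] < dist[a, b] + dist[c, d] - 1e-12:
                tour[i + 1:j + 1] = tour[i + 1:j + 1][::-1]   # reverse the segment
                return True
    return False

rng = np.random.default_rng(2)
pts = rng.random((50, 2))
dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
tour = list(range(50))
while two_opt_once(tour, dist):   # repeat until no improving move remains
    pass
print("tour length:", tour_length(tour, dist))
```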
Procedia PDF Downloads 625
6828 A 4-Month Low-carb Nutrition Intervention Study Aimed to Demonstrate the Significance of Addressing Insulin Resistance in 2 Subjects with Type-2 Diabetes for Better Management
Authors: Shashikant Iyengar, Jasmeet Kaur, Anup Singh, Arun Kumar, Ira Sahay
Abstract:
Insulin resistance (IR) is a condition that occurs when cells in the body become less responsive to insulin, leading to higher levels of both insulin and glucose in the blood. This condition is linked to metabolic syndromes, including diabetes. It is crucial to address IR promptly after diagnosis to prevent the long-term complications associated with high insulin and high blood glucose. This four-month case study highlights the importance of treating the underlying condition to manage diabetes effectively. Insulin is essential for regulating blood sugar levels by facilitating the uptake of glucose into cells for energy or storage. In IR individuals, cells are less efficient at taking up glucose from the blood, resulting in elevated blood glucose levels. As a result of IR, beta cells produce more insulin to make up for the body's inability to use insulin effectively. This leads to high insulin levels, a condition known as hyperinsulinemia, which further impairs glucose metabolism and can contribute to various chronic diseases. In addition to regulating blood glucose, insulin has anti-catabolic effects, preventing the breakdown of molecules in the body: it inhibits glycogen breakdown in the liver, inhibits gluconeogenesis, and inhibits lipolysis. If a person is insulin-sensitive or metabolically healthy, an optimal level of insulin prevents fat cells from releasing fat and promotes the storage of glucose and fat in the body. Thus, optimal insulin levels are crucial for maintaining energy balance and play a key role in metabolic processes. During the four-month study, researchers looked at the impact of a low-carb dietary (LCD) intervention on two male individuals (A and B) who had type-2 diabetes. Although neither of these individuals was obese, both were slightly overweight and had abdominal fat deposits. Before the trial began, important markers such as fasting blood glucose (FBG), triglycerides (TG), high-density lipoprotein (HDL) cholesterol, and HbA1c were measured. These markers are essential in defining metabolic health; their individual values and variability are integral to deciphering metabolic health. The ratio of TG to HDL is used as a surrogate marker for IR. This ratio has a high correlation with the prevalence of metabolic syndrome and with IR itself. It is a convenient measure because it can be calculated from a standard lipid profile and does not require more complex tests. In this four-month trial, an improvement in insulin sensitivity was observed through the TG/HDL ratio, which, in turn, improved fasting blood glucose levels and HbA1c. For subject A, HbA1c dropped from 13 to 6.28, and for subject B, it dropped from 9.4 to 5.7. During the trial, neither of the subjects was taking any diabetic medications. The significant improvements in their health markers, such as better glucose control, along with an increase in energy levels, demonstrate that incorporating LCD interventions can effectively help manage diabetes.
Keywords: metabolic disorder, insulin resistance, type-2 diabetes, low-carb nutrition
Procedia PDF Downloads 40
6827 Care as a Situated Universal: Defining Care as a Practical Phenomenology Study
Authors: Amanda Aliende da Matta
Abstract:
This communication presents an aspect of phenomenon selection in an applied hermeneutic phenomenology study on care and vulnerability: the need to consider it as a situated universal. For that, we will first present the study and its methodology. Secondly, we will expose the need to understand phenomena as situation-defined, incorporating feminist thought. In an informatics class for 14 year olds, we explained the exercise: students have to make a 5 slide presentation about a topic of their choice. A does it on streetwear, B on Cristiano Ronaldo, C on Marvel, but J did it on Down Syndrome. Introducing it to the class, J explains the physical and cognitive differences caused by trisomy; when asked to explain it further, he says: "they are angels, teacher," and shows us a poster on his cellphone that says: if you laugh at a different child he will laugh with you because his innocence outweighs your ignorance. The anecdote shows, better than any theoretical explanation, something that some vulnerable people have; something beautiful and special but difficult to define. Let's call this something caring. The research has the main objective of accounting for the experience of caregiving in vulnerability, and it will be carried out with Applied Hermeneutic Phenomenology (AHP). The method's objective is to investigate the lived human experience in its pre-reflexive dimension to know its meaning structures. Contrary to other research methods, AHP does not produce theory about a specific context but seeks the meaning of the lived experience, in its characteristic of human experience. However, it is necessary that we understand care as defined in a concrete situation. We cannot start the research with an a priori definitive concept of care, or we would fall into the mistake of closing ourselves to only what we already know, as explained by Levinas. We incorporate, then, the notion of situated universals. Loyal to phenomenology, the definition of the phenomenon should start with an investigation of the word's etymology: the word cura, in its etymological root, means care. And care comes from the Latin word cogitātus/cōgĭto, which means "to pursue something in mind" and "to consider thoroughly." The verb cōgĭto, meanwhile, is composed of co- (altogether) and agitare (to deal with or think committedly about something, to concern oneself with) / ăgĭto (to set in motion, to move). Care, therefore, has in its origin a meditation on something, a concern about something, a verb that has a sense of action and movement. To care is to act out of concern for something/someone. This etymology, though, is not the final definition of the phenomenon, but only its skeleton. It needs to be embodied in the concrete situation to become a possible lived experience. And that means that the lived experience descriptions (LEDs) should be selected by taking into consideration how and if care was engendered in that concrete experience. Defining the phenomenon has to take into consideration situated knowledge.Keywords: applied hermeneutic phenomenology, care ethics, hermeneutics, phenomenology, situated universalism
Procedia PDF Downloads 88
6826 Numerical and Experimental Assessment of a PCM Integrated Solar Chimney
Authors: J. Carlos Frutos Dordelly, M. Coillot, M. El Mankibi, R. Enríquez Miranda, M. José Jimenez, J. Arce Landa
Abstract:
Natural ventilation systems have increasingly been the subject of research due to rising energetic consumption within the building sector and increased environmental awareness. In the last two decades, the mounting concern of greenhouse gas emissions and the need for an efficient passive ventilation system have driven the development of new alternative passive technologies such as ventilated facades, trombe walls or solar chimneys. The objective of the study is the assessment of PCM panels in an in situ solar chimney for the establishment of a numerical model. The PCM integrated solar chimney shows slight performance improvement in terms of mass flow rate and external temperature and outlet temperature difference. An increase of 11.3659 m3/h can be observed during low wind speed periods. Additionally, the surface temperature across the chimney goes beyond 45 °C and allows the activation of PCM panels.Keywords: energy storage, natural ventilation, phase changing materials, solar chimney, solar energy
Procedia PDF Downloads 368
6825 MARISTEM: A COST Action Focused on Stem Cells of Aquatic Invertebrates
Authors: Arzu Karahan, Loriano Ballarin, Baruch Rinkevich
Abstract:
Marine invertebrates, the highly diverse phyla of multicellular organisms, represent phenomena that are either not found or highly restricted in the vertebrates. These include phenomena like budding, fission, a fusion of ramets, and high regeneration power, such as the ability to create whole new organisms from either tiny parental fragment, many of which are controlled by totipotent, pluripotent, and multipotent stem cells. Thus, there is very much that can be learned from these organisms on the practical and evolutionary levels, further resembling Darwin's words, “It is not the strongest of the species that survives, nor the most intelligent, but the one most responsive to change”. The ‘stem cell’ notion highlights a cell that has the ability to continuously divide and differentiate into various progenitors and daughter cells. In vertebrates, adult stem cells are rare cells defined as lineage-restricted (multipotent at best) with tissue or organ-specific activities that are located in defined niches and further regulate the machinery of homeostasis, repair, and regeneration. They are usually categorized by their morphology, tissue of origin, plasticity, and potency. The above description not always holds when comparing the vertebrates with marine invertebrates’ stem cells that display wider ranges of plasticity and diversity at the taxonomic and the cellular levels. While marine/aquatic invertebrates stem cells (MISC) have recently raised more scientific interest, the know-how is still behind the attraction they deserve. MISC, not only are highly potent but, in many cases, are abundant (e.g., 1/3 of the entire animal cells), do not locate in permanent niches, participates in delayed-aging and whole-body regeneration phenomena, the knowledge of which can be clinically relevant. Moreover, they have massive hidden potential for the discovery of new bioactive molecules that can be used for human health (antitumor, antimicrobial) and biotechnology. The MARISTEM COST action (Stem Cells of Marine/Aquatic Invertebrates: From Basic Research to Innovative Applications) aims to connect the European fragmented MISC community. Under this scientific umbrella, the action conceptualizes the idea for adult stem cells that do not share many properties with the vertebrates’ stem cells, organizes meetings, summer schools, and workshops, stimulating young researchers, supplying technical and adviser support via short-term scientific studies, making new bridges between the MISC community and biomedical disciplines.Keywords: aquatic/marine invertebrates, adult stem cell, regeneration, cell cultures, bioactive molecules
Procedia PDF Downloads 169
6824 Fundamentals of Mobile Application Architecture
Authors: Mounir Filali
Abstract:
Companies use many innovative ways to reach their customers and stay ahead of the competition. Along with the growing demand for innovative business solutions comes a demand for new technology. The most noticeable area of demand for business innovation is the mobile application industry. Recently, companies have recognized the growing need to integrate proprietary mobile applications into their suite of services; they have realized that developing mobile apps gives them a competitive edge. As a result, many have begun to rapidly develop mobile apps to stay ahead of the competition. Mobile application development helps companies meet the needs of their customers. Mobile apps also help businesses take advantage of every potential opportunity to generate leads that convert into sales. Mobile app download statistics reflect this growth: with the recent rise in demand for business-related mobile apps, there has been a similar rise in the range of mobile app solutions being offered. Today, companies can use the traditional route of a software development team to build their own mobile applications. However, there are also many platform-ready "low-code and no-code" mobile app solutions available to choose from. These mobile app development options offer more streamlined business processes and help companies be more responsive to their customers without having to be coding experts. Companies must have a basic understanding of mobile app architecture to attract and maintain the interest of mobile app users. Mobile application architecture refers to the building blocks or structural systems and design elements that make up a mobile application. It also includes the technologies, processes, and components used during application development. All elements of the mobile application architecture form the underlying foundation of the application; developing a good mobile app architecture requires proper planning and strategic design. The technology framework or platform on the back end and the user-facing side of a mobile application are part of the application's mobile architecture. Software developers loosely refer to this set of mobile architecture systems and processes as the "technology stack."
Keywords: mobile applications, development, architecture, technology
Procedia PDF Downloads 105
6823 Relationship between Legacy of Islamic Hadith and Biodiversity
Authors: Mohsen Nouraei, Maryam Amouei
Abstract:
Islamic studies draw on both the Quran and the Hadith. Hadith is defined as a set of reports that narrate the words and behaviors of infallible persons such as the Holy Prophet (pbuh) or the Infallible Imams (as). The issue of biodiversity, which is one of the most important environmental topics, is also considered in the field of Hadith. The present paper has investigated biodiversity on the basis of a descriptive-analytical method and with a library-documentary approach. The household of the Prophet (as) referred to biodiversity, including the diversity of animals, plants, climate, etc. In addition, they emphasized the human need to preserve this diversity and not damage it. It should be noted that they expressed the rights of animals and plants with respect to their correct use by humans, so that humans can observe these rights in the conservation of diversity and the continuation of their generations.
Keywords: biodiversity, conservation of biodiversity, degradation of biodiversity, extinction of biodiversity
Procedia PDF Downloads 451
6822 Computational Chemical-Composition of Carbohydrates in the Context of Healthcare Informatics
Authors: S. Chandrasekaran, S. Nandita, M. Shivathmika, Srikrishnan Shivakumar
Abstract:
The objective of this research work is to analyze the computational chemical composition of carbohydrates in the context of healthcare informatics. The computation involves the representation of the complex molecular structure of carbohydrates using graph theory and in a deployable Chemical Markup Language (CML). The parallel molecular structures of the chemical molecules, with or without adulterants added for the sake of business profit, can be analyzed in terms of robustness and derivatization measures. Rural healthcare programs should create awareness of malnutrition to reduce the ill effects of decomposition and help consumers know the level of such energy-storage mixtures in a quantitative way. Earlier works were based on empirical and wet-lab data, which can vary from time to time and whose mining results cannot be reused. This work is carried out on the quantitative computational chemistry of carbohydrates to support a safe and secure right-to-food act and its regulations.
Keywords: carbohydrates, chemical-composition, chemical markup, robustness, food safety
Procedia PDF Downloads 374
6821 An Overview of Technology Availability to Support Remote Decentralized Clinical Trials
Authors: Simone Huber, Bianca Schnalzer, Baptiste Alcalde, Sten Hanke, Lampros Mpaltadoros, Thanos G. Stavropoulos, Spiros Nikolopoulos, Ioannis Kompatsiaris, Lina Pérez- Breva, Vallivana Rodrigo-Casares, Jaime Fons-Martínez, Jeroen de Bruin
Abstract:
Developing new medicine and health solutions and improving patient health currently rely on the successful execution of clinical trials, which generate relevant safety and efficacy data. For their success, recruitment and retention of participants are some of the most challenging aspects of protocol adherence. Main barriers include: i) lack of awareness of clinical trials; ii) long distance from the clinical site; iii) the burden on participants, including the duration and number of clinical visits and iv) high dropout rate. Most of these aspects could be addressed with a new paradigm, namely the Remote Decentralized Clinical Trials (RDCTs). Furthermore, the COVID-19 pandemic has highlighted additional advantages and challenges for RDCTs in practice, allowing participants to join trials from home and not depend on site visits, etc. Nevertheless, RDCTs should follow the process and the quality assurance of conventional clinical trials, which involve several processes. For each part of the trial, the Building Blocks, existing software and technologies were assessed through a systematic search. The technology needed to perform RDCTs is widely available and validated but is yet segmented and developed in silos, as different software solutions address different parts of the trial and at various levels. The current paper is analyzing the availability of technology to perform RDCTs, identifying gaps and providing an overview of Basic Building Blocks and functionalities that need to be covered to support the described processes.Keywords: architectures and frameworks for health informatics systems, clinical trials, information and communications technology, remote decentralized clinical trials, technology availability
Procedia PDF Downloads 218
6820 Liquid Waste Management in Cluster Development
Authors: Abheyjit Singh, Kulwant Singh
Abstract:
There is a gradual depletion of the water table in the earth's crust, and it is required to converse and reduce the scarcity of water. This is only done by rainwater harvesting, recycling of water and by judicially consumption/utilization of water and adopting unique treatment measures. Domestic waste is generated in residential areas, commercial settings, and institutions. Waste, in general, is unwanted, undesirable, and nevertheless an inevitable and inherent product of social, economic, and cultural life. In a cluster, a need-based system is formed where the project is designed for systematic analysis, collection of sewage from the cluster, treating it and then recycling it for multifarious work. The liquid waste may consist of Sanitary sewage/ Domestic waste, Industrial waste, Storm waste, or Mixed Waste. The sewage contains both suspended and dissolved particles, and the total amount of organic material is related to the strength of the sewage. The untreated domestic sanitary sewage has a BOD (Biochemical Oxygen Demand) of 200 mg/l. TSS (Total Suspended Solids) about 240 mg/l. Industrial Waste may have BOD and TSS values much higher than those of sanitary sewage. Another type of impurities of wastewater is plant nutrients, especially when there are compounds of nitrogen N phosphorus P in the sewage; raw sanitary contains approx. 35 mg/l Nitrogen and 10 mg/l of Phosphorus. Finally, the pathogen in the waste is expected to be proportional to the concentration of facial coliform bacteria. The coliform concentration in raw sanitary sewage is roughly 1 billion per liter. The system of sewage disposal technique has been universally applied to all conditions, which are the nature of soil formation, Availability of land, Quantity of Sewage to be disposed of, The degree of treatment and the relative cost of disposal technique. The adopted Thappar Model (India) has the following designed parameters consisting of a Screen Chamber, a Digestion Tank, a Skimming Tank, a Stabilization Tank, an Oxidation Pond and a Water Storage Pond. The screening Chamber is used to remove plastic and other solids, The Digestion Tank is designed as an anaerobic tank having a retention period of 8 hours, The Skimming Tank has an outlet that is kept 1 meter below the surface anaerobic condition at the bottom and also help in organic solid remover, Stabilization Tank is designed as primary settling tank, Oxidation Pond is a facultative pond having a depth of 1.5 meter, Storage Pond is designed as per the requirement. The cost of the Thappar model is Rs. 185 Lakh per 3,000 to 4,000 population, and the Area required is 1.5 Acre. The complete structure will linning as per the requirement. The annual maintenance will be Rs. 5 lakh per year. The project is useful for water conservation, silage water for irrigation, decrease of BOD and there will be no longer damage to community assets and economic loss to the farmer community by inundation. There will be a healthy and clean environment in the community.Keywords: collection, treatment, utilization, economic
Procedia PDF Downloads 76
6819 Valorization of Surveillance Data and Assessment of the Sensitivity of a Surveillance System for an Infectious Disease Using a Capture-Recapture Model
Authors: Jean-Philippe Amat, Timothée Vergne, Aymeric Hans, Bénédicte Ferry, Pascal Hendrikx, Jackie Tapprest, Barbara Dufour, Agnès Leblond
Abstract:
The surveillance of infectious diseases is necessary to describe their occurrence and to help the planning, implementation and evaluation of risk mitigation activities. However, the exact number of detected cases may remain unknown when surveillance is based on serological tests, because identifying seroconversion may be difficult. Moreover, incomplete detection of cases or outbreaks is a recurrent issue in the field of disease surveillance. This study addresses these two issues. Using a viral animal disease as an example (equine viral arteritis), the goals were to establish suitable rules for identifying seroconversion in order to estimate the number of cases and outbreaks detected by a surveillance system in France between 2006 and 2013, and to assess the sensitivity of this system by estimating the total number of outbreaks that occurred during this period (including unreported outbreaks) using a capture-recapture model. Data from horses which exhibited at least one positive result in serology using the viral neutralization test between 2006 and 2013 were used for analysis (n=1,645). Data consisted of the annual antibody titers and the location of the subjects (towns). A consensus among multidisciplinary experts (specialists in the disease and its laboratory diagnosis, epidemiologists) was reached to consider seroconversion as a change in antibody titer from negative to at least 32 or as a three-fold or greater increase. The number of seroconversions was counted for each town and modeled using a unilist zero-truncated binomial (ZTB) capture-recapture model with the R software. The binomial denominator was the number of horses tested in each infected town. Using the defined rules, 239 cases located in 177 towns (outbreaks) were identified from 2006 to 2013. Subsequently, the sensitivity of the surveillance system was estimated as the ratio of the number of detected outbreaks to the total number of outbreaks that occurred (including unreported outbreaks) estimated using the ZTB model. The total number of outbreaks was estimated at 215 (95% credible interval CrI95%: 195-249) and the surveillance sensitivity at 82% (CrI95%: 71-91). The rules proposed for identifying seroconversion may serve future research. Such rules, adjusted to the local environment, could conceivably be applied in other countries with surveillance programs dedicated to this disease. More generally, defining ad hoc algorithms for interpreting antibody titers could be useful for other human and animal diseases and zoonoses when there is a lack of accurate information in the literature about the serological response in naturally infected subjects. This study shows how capture-recapture methods may help to estimate the sensitivity of an imperfect surveillance system and to valorize surveillance data. The sensitivity of the surveillance system for equine viral arteritis is relatively high and supports its relevance for preventing the disease from spreading.
Keywords: Bayesian inference, capture-recapture, epidemiology, equine viral arteritis, infectious disease, seroconversion, surveillance
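A simplified version of the unilist zero-truncated binomial estimator can be sketched as follows, assuming a constant seroconversion-detection probability and using illustrative counts rather than the study's data (the published model is Bayesian and richer than this frequentist toy):

```python
# Simplified zero-truncated binomial capture-recapture sketch: k[i] detected
# seroconversions among n[i] horses tested in each *detected* outbreak town;
# towns with k = 0 are never observed.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import binom

n = np.array([12, 5, 20, 8, 15, 9, 30, 6])     # horses tested per detected town (illustrative)
k = np.array([2, 1, 3, 1, 2, 1, 4, 1])         # seroconversions found there

def neg_loglik(p):
    # zero-truncated binomial log-likelihood (condition on at least one detection)
    return -np.sum(binom.logpmf(k, n, p) - np.log1p(-(1 - p) ** n))

p_hat = minimize_scalar(neg_loglik, bounds=(1e-6, 1 - 1e-6), method="bounded").x
# Horvitz-Thompson-style estimate of the total number of outbreak towns,
# including those in which no seroconversion was detected
total_hat = np.sum(1.0 / (1.0 - (1.0 - p_hat) ** n))
sensitivity = len(n) / total_hat
print(f"p={p_hat:.3f}  estimated outbreaks={total_hat:.1f}  sensitivity={sensitivity:.2f}")
```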
Procedia PDF Downloads 298
6818 Radiographic Evaluation of Odontogenic Keratocyst: A 14 Years Retrospective Study
Authors: Nor Hidayah Reduwan, Jira Chindasombatjaroen, Suchaya Pornprasersuk-Damrongsri, Sopee Pomsawat
Abstract:
INTRODUCTION: Odontogenic keratocyst (OKC) remain as a controversial pathologic entity under the scrutiny of many researchers and maxillofacial surgeons alike. The high recurrence rate and relatively aggressive nature of this lesion demand a meticulous analysis of the radiographic characteristic of OKC leading to the formulation of an accurate diagnosis. OBJECTIVE: This study aims to determine the radiographic characteristic of odontogenic keratocyst (OKC) using conventional radiographs and cone beam computed tomography (CBCT) images. MATERIALS AND METHODS: Patients histopathologically diagnosed as OKC from 2003 to 2016 by Oral and Maxillofacial Pathology Department were retrospectively reviewed. Radiographs of these cases from the archives of the Department of Oral and Maxillofacial Radiology, Faculty of Dentistry Mahidol University were retrieved. Assessment of the location, shape, border, cortication, locularity, the relationship of lesion to embedded tooth, displacement of adjacent tooth, root resorption and bony expansion of the lesion were conducted. RESULTS: Radiographs of 91 patients (44 males, 47 females) with the mean age of 31 years old (10 to 84 years) were analyzed. Among all patients, 5 cases were syndromic patients. Hence, a total of 103 OKCs were studied. The most common location was at the ramus of mandible (32%) followed by posterior maxilla (29%). Most cases presented as a well-defined unilocular radiolucency with smooth and corticated border. The lesion was in associated with embedded tooth in 48 lesions (47%). Eighty five percent of embedded tooth are impacted 3rd molar. Thirty-seven percentage of embedded tooth were entirely encapsulated in the lesion. The lesion attached to the embedded tooth at the cementoenamel junction (CEJ) in 40% and extended to part of root in 23% of cases. Teeth displacement and root resorption were found in 29% and 6% of cases, respectively. Bony expansion in bucco-lingual dimension was seen in 63% of cases. CONCLUSION: OKCs were predominant in the posterior region of the mandible with radiographic features of a well-defined, unilocular radiolucency with smooth and corticated margin. The lesions might relate to an embedded tooth by surrounding an entire tooth, attached to the CEJ level or extending to part of root. Bony expansion could be found but teeth displacement and root resorption were not common. These features might help in giving the differential diagnosis.Keywords: cone beam computed tomography, imaging dentistry, odontogenic keratocyst, radiographic features
Procedia PDF Downloads 128
6817 Effect of Different Model Drugs on the Properties of Model Membranes from Fishes
Authors: M. Kumpugdee-Vollrath, T. G. D. Phu, M. Helmis
Abstract:
A suitable model membrane for studying the pharmacological effect of pharmaceutical products is the human stratum corneum, because this layer is the outermost layer of human skin and an important barrier to be passed through. Other model membranes that have also been used include skins from pig, mouse, reptile or fish. We are interested in fish skins in this project. The advantage of fish skins is that they can be obtained from the supermarket or fish shop. However, the fish skins should be freshly prepared and used directly without storage. In order to understand the effect of different model drugs, e.g. lidocaine HCl, resveratrol, paracetamol, ibuprofen and acetylsalicylic acid, on the properties of model membranes from various types of fishes, e.g. trout, salmon, cod and plaice, permeation tests were performed and differential scanning calorimetry was applied.
Keywords: fish skin, model membrane, permeation, DSC, lidocaine HCl, resveratrol, paracetamol, ibuprofen, acetyl salicylic acid
Procedia PDF Downloads 470
6816 Multi-Label Approach to Facilitate Test Automation Based on Historical Data
Authors: Warda Khan, Remo Lachmann, Adarsh S. Garakahally
Abstract:
The increasing complexity of software and its applicability in a wide range of industries, e.g., automotive, call for enhanced quality assurance techniques. Test automation is one option to tackle the prevailing challenges by supporting test engineers with fast, parallel, and repetitive test executions. A high degree of test automation allows for a shift from mundane (manual) testing tasks to a more analytical assessment of the software under test. However, a high initial investment of test resources is required to establish test automation, which, in most cases, conflicts with the time constraints provided for the quality assurance of complex software systems. Hence, computer-aided creation of automated test cases is crucial to increase the benefit of test automation. This paper proposes the application of machine learning for the generation of automated test cases. It is based on supervised learning to analyze test specifications and existing test implementations. The analysis facilitates the identification of patterns between test steps and their implementation with test automation components. For the test case generation, this approach exploits historical data of test automation projects. The identified patterns are the foundation to predict the implementation of unknown test case specifications. Based on this support, a test engineer solely has to review and parameterize the test automation components instead of writing them manually, resulting in a significant time reduction for establishing test automation. Compared to other generation approaches, this ML-based solution can handle different writing styles, authors, application domains, and even languages. Furthermore, test automation tools require expert knowledge by means of programming skills, whereas this approach only requires historical data to generate test cases. The proposed solution is evaluated using various multi-label evaluation criteria (EC) and two small-sized real-world systems. The most prominent EC is 'Subset Accuracy'. The promising results show an accuracy of at least 86% for test cases where a 1:1 relationship (multi-class) between test step specification and test automation component exists. For complex multi-label problems, i.e., where one test step can be implemented by several components, the prediction accuracy is still at 60%. This is better than current state-of-the-art results. The prediction quality is expected to increase for larger systems with corresponding historical data. Consequently, this technique facilitates reducing the time for establishing test automation and is independent of the application domain and project. As a work in progress, the next steps are to investigate incremental and active learning as additions to increase the usability of this approach, e.g., in case labelled historical data is scarce.
Keywords: machine learning, multi-class, multi-label, supervised learning, test automation
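A minimal sketch of such a multi-label set-up, mapping test step text to sets of automation components and scoring with subset accuracy, could look as follows (the test steps and component names are invented placeholders, not the paper's data):

```python
# Multi-label text classification: test step specification -> set of components.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.multiclass import OneVsRestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MultiLabelBinarizer
from sklearn.metrics import accuracy_score

steps = [
    "set ignition to ON and wait 2 s",
    "send CAN message door_lock and check response",
    "read diagnostic trouble codes",
    "set ignition to OFF",
    "send CAN message window_up and check response",
    "read diagnostic trouble codes after ignition ON",
]
labels = [
    ["PowerControl"], ["CanSender", "ResponseChecker"], ["DiagReader"],
    ["PowerControl"], ["CanSender", "ResponseChecker"], ["DiagReader", "PowerControl"],
]

mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(labels)                       # binary indicator matrix
clf = make_pipeline(TfidfVectorizer(),
                    OneVsRestClassifier(LogisticRegression(max_iter=1000)))
clf.fit(steps, Y)

pred = clf.predict(["send CAN message door_unlock and check response"])
print(mlb.inverse_transform(pred))                  # predicted component set
# Subset accuracy = exact-match rate per test step (here on the training data)
print("subset accuracy:", accuracy_score(Y, clf.predict(steps)))
```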
Procedia PDF Downloads 132
6815 Radioactivity Assessment of Sediments in Negombo Lagoon Sri Lanka
Authors: H. M. N. L. Handagiripathira
Abstract:
The distributions of naturally occurring and anthropogenic radioactive materials were determined in surface sediments taken at 27 different locations along the bank of Negombo Lagoon in Sri Lanka. Hydrographic parameters of lagoon water and the grain size analyses of the sediment samples were also carried out for this study. The conductivity of the adjacent water was varied from 13.6 mS/cm to 55.4 mS/cm near to the southern end and the northern end of the lagoon, respectively, and equally salinity levels varied from 7.2 psu to 32.1 psu. The average pH in the water was 7.6 and average water temperature was 28.7 °C. The grain size analysis emphasized the mass fractions of the samples as sand (60.9%), fine sand (30.6%) and fine silt+clay (1.3%) in the sampling locations. The surface sediment samples of wet weight, 1 kg each from upper 5-10 cm layer, were oven dried at 105 °C for 24 hours to get a constant weight, homogenized and sieved through a 2 mm sieve (IAEA technical series no. 295). The radioactivity concentrations were determined using gamma spectrometry technique. Ultra Low Background Broad Energy High Purity Ge Detector, BEGe (Model BE5030, Canberra) was used for radioactivity measurement with Canberra Industries' Laboratory Source-less Calibration Software (LabSOCS) mathematical efficiency calibration approach and Geometry composer software. The mean activity concentration was found to be 24 ± 4, 67 ± 9, 181 ± 10, 59 ± 8, 3.5 ± 0.4 and 0.47 ± 0.08 Bq/kg for 238U, 232Th, 40K, 210Pb, 235U and 137Cs respectively. The mean absorbed dose rate in air, radium equivalent activity, external hazard index, annual gonadal dose equivalent and annual effective dose equivalent were 60.8 nGy/h, 137.3 Bq/kg, 0.4, 425.3 mSv/year and 74.6 mSv/year, respectively. The results of this study will provide baseline information on the natural and artificial radioactive isotopes and environmental pollution associated with information on radiological risk.Keywords: gamma spectrometry, lagoon, radioactivity, sediments
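Two of the reported indices can be re-derived approximately from the mean activity concentrations using the commonly cited coefficients; whether the study used exactly these coefficients (and 238U rather than 226Ra as the radium term) is an assumption here, and small differences from the reported values are expected since the paper averages per-sample indices:

```python
# Back-of-the-envelope recomputation of two radiological indices from the
# mean activity concentrations quoted above.
c_ra, c_th, c_k = 24.0, 67.0, 181.0          # Bq/kg: 238U (as Ra proxy), 232Th, 40K

# Radium equivalent activity (Bq/kg), standard weighting
ra_eq = c_ra + 1.43 * c_th + 0.077 * c_k
# Absorbed gamma dose rate in air (nGy/h), UNSCEAR-style coefficients
dose_rate = 0.462 * c_ra + 0.604 * c_th + 0.0417 * c_k

print(f"Ra_eq  ~ {ra_eq:.1f} Bq/kg")         # reported mean: 137.3 Bq/kg
print(f"D      ~ {dose_rate:.1f} nGy/h")     # reported mean: 60.8 nGy/h
```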
Procedia PDF Downloads 139