Search results for: non-equilibrium Green’s function
1389 Depositional Environment and Diagenetic Alterations, Influences of Facies and Fine Kaolinite Formation Migration on Sandstone Reservoir Quality, Sarir Formation, Sirt
Authors: Faraj M. Elkhatri, Hana Ellafi
Abstract:
In recent years, there has been a growing recognition of the potential of marine-based functional foods and combination therapies in promoting a healthy lifestyle and exploring their effectiveness in preventing or treating diseases. The combination of marine bioactive compounds or extracts offers synergistic or enhancement effects through various mechanisms, including multi-target actions, improved bioavailability, enhanced bioactivity, and mitigation of potential adverse effects. Both the green-lipped mussel (GLM) and fucoidan derived from brown seaweed are rich in bioactivities. These two, mussel and fucoidan, have not been previously formulated together. This study aims to combine GLM oil from Perna canaliculus with low molecular weight fucoidan (LMWF) extracted from Undaria pinnatifida to investigate the unique mixture’s anti-inflammatory and antioxidant properties. The cytotoxicity of individual compounds and combinations was assessed using the MTT assay in THP-1 and RAW264.7 cell lines. The anti-inflammatory activity of mussel-fucoidan was evaluated by treating LPS-stimulated human monocyte and macrophage (THP-1) cells. Subsequently, the inflammatory cytokines released into the supernatant of these cell lines were quantified via ELISA. Antioxidant activity was determined using the free radical scavenging (DPPH) assay. The DPPH assay demonstrated that the radical scavenging activity of the combinations, particularly at concentrations exceeding 1 mg/ml, showed a significantly higher percentage of inhibition compared to the individual components. This suggests an enhancement effect when the two compounds are combined, leading to increased antioxidant activity. In terms of immunomodulatory activity, the individual compounds exhibited distinct behaviors. GLM oil displayed a higher ability to suppress the cytokine TNF-α compared to LMWF. Interestingly, the LMWF fraction, when used individually, did not demonstrate TNF-α suppression. However, when combined with GLM, the TNF-α suppression (anti-inflammatory) activity of the combination was better than that of GLM or LMWF alone. This observation underscores the potential for enhancement interactions between the two components in terms of anti-inflammatory properties. This study revealed that each individual compound, LMWF and GLM, possesses unique and notable bioactivity. The combination of these two individual compounds results in an enhancement effect, where the bioactivity of each is enhanced, creating a superior combination. This suggests that the combination of LMWF and GLM has the potential to offer a more potent and multifaceted therapeutic effect, particularly in the context of antioxidant and anti-inflammatory activities. These findings hold promise for the development of novel therapeutic interventions or supplements that harness these enhancement effects.
Keywords: formation damage, porosity losses, pore throat, quartz cement
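Where the abstract quantifies antioxidant activity as a percentage of inhibition in the DPPH assay, the usual calculation is the one sketched below (Python; the absorbance readings are hypothetical, not values from the study):

```python
def dpph_inhibition(a_control, a_sample):
    """DPPH radical scavenging activity expressed as % inhibition,
    in the usual form: (A_control - A_sample) / A_control * 100."""
    return (a_control - a_sample) / a_control * 100.0

# hypothetical absorbance readings at 517 nm (not values from the study)
print(f"{dpph_inhibition(a_control=0.85, a_sample=0.32):.1f} % inhibition")
```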
Procedia PDF Downloads 56
1388 Simulation-Based Validation of Safe Human-Robot-Collaboration
Authors: Titanilla Komenda
Abstract:
Human-machine collaboration defines a direct interaction between humans and machines to fulfil specific tasks. Those so-called collaborative machines are used without fencing and interact with humans in predefined workspaces. Even though human-machine collaboration enables a flexible adaptation to variable degrees of freedom, industrial applications are rarely found. The reasons for this are not a lack of technical progress but rather limitations in the planning processes that ensure safety for operators. Until now, humans and machines were mainly considered separately in the planning process, focusing on ergonomics and system performance, respectively. Within human-machine collaboration, those aspects must not be seen in isolation from each other but rather need to be analysed in interaction. Furthermore, a simulation model is needed that can validate the system performance and ensure the safety of the operator at any given time. Following on from this, a holistic simulation model is presented, enabling a simulative representation of collaborative tasks – including both humans and machines. The presented model does not only include a geometry and a motion model of interacting humans and machines but also a numerical behaviour model of humans as well as a Boolean probabilistic sensor model. With this, error scenarios can be simulated by validating system behaviour in unplanned situations. As these models can be defined on the basis of Failure Mode and Effects Analysis as well as probabilities of errors, the implementation in a collaborative model is discussed and evaluated regarding limitations and simulation times. The functionality of the model is shown on industrial applications by comparing simulation results with video data. The analysis shows the impact of considering human factors in the planning process in contrast to only meeting system performance. In this sense, an optimisation function is presented that meets the trade-off between human and machine factors and aids in a successful and safe realisation of collaborative scenarios.
Keywords: human-machine-system, human-robot-collaboration, safety, simulation
Procedia PDF Downloads 361
1387 Design and Fabrication of Stiffness Reduced Metallic Locking Compression Plates through Topology Optimization and Additive Manufacturing
Authors: Abdulsalam A. Al-Tamimi, Chris Peach, Paulo Rui Fernandes, Paulo J. Bartolo
Abstract:
Bone fixation implants currently used to treat traumatically fractured bones and to promote fracture healing are built with biocompatible metallic materials such as stainless steel, cobalt chromium and titanium and its alloys (e.g., CoCrMo and Ti6Al4V). The noticeable stiffness mismatch between current metallic implants and the host bone is associated with negative outcomes such as stress shielding, which causes bone loss and implant loosening, leading to deficient fracture treatment. This paper, part of a major research program to design the next generation of bone fixation implants, describes the combined use of three-dimensional (3D) topology optimization (TO) and additive manufacturing powder bed technology (Electron Beam Melting) to redesign and fabricate plates based on the current standard one (i.e., the locking compression plate). Topology optimization is applied with an objective function that maximizes stiffness, constrained by volume reductions (i.e., 25-75%), in order to obtain optimized implant designs with a reduced stress shielding phenomenon under different boundary conditions (i.e., tension, bending, torsion and combined loads). The stiffness of the original and optimised plates is assessed through a finite-element study. The TO results showed an actual reduction in stiffness for most of the plates due to the critical values of volume reduction. Additionally, the optimized plates fabricated using powder bed techniques proved that the integration of TO and additive manufacturing provides the capability of producing stiffness-reduced plates with acceptable tolerances.
Keywords: additive manufacturing, locking compression plate, finite element, topology optimization
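For clarity, the stiffness-maximization problem described above is commonly written in density-based (SIMP-type) form as below; the paper does not state its exact formulation, so this is the standard statement, with the volume fraction f corresponding to the 25-75% reductions mentioned:

\[
\min_{\boldsymbol{\rho}}\; c(\boldsymbol{\rho}) = \mathbf{U}^{T}\mathbf{K}(\boldsymbol{\rho})\mathbf{U}
\quad\text{s.t.}\quad
\mathbf{K}(\boldsymbol{\rho})\mathbf{U} = \mathbf{F},\qquad
\frac{V(\boldsymbol{\rho})}{V_{0}} \le f,\qquad
0 < \rho_{\min} \le \rho_{e} \le 1,
\]

where minimizing the compliance c is equivalent to maximizing the stiffness of the plate under the given load case.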
Procedia PDF Downloads 197
1386 Polymorphisms of the UM Genotype of CYP2C19*17 in Thais Taking Medical Cannabis
Authors: Athicha Cherdpunt, Patompong Satapornpong
Abstract:
Medical cannabis is made up of components known as cannabinoids, the two principal ingredients being Δ9-tetrahydrocannabinol (THC) and cannabidiol (CBD). Interestingly, cannabinoids can be used in many treatments, such as chemotherapy-related nausea and vomiting, cachexia, anorexia nervosa, spinal cord injury and disease, epilepsy, pain, and many others. However, the adverse drug reactions (ADRs) of THC can cause sedation, anxiety, dizziness, appetite stimulation and impairments in driving and cognitive function. Furthermore, genetic polymorphisms of CYP2C9, CYP2C19 and CYP3A4 influence THC metabolism and might be a cause of ADRs. Particularly, the CYP2C19*17 allele increases gene transcription and therefore results in an ultra-rapid metabolizer (UM) phenotype. The aim of this study is to investigate the frequency of CYP2C19*17 alleles in Thai patients who have been treated with medical cannabis. We prospectively enrolled 60 Thai patients who were treated with medical cannabis, with clinical data from the College of Pharmacy, Rangsit University. DNA of each patient was isolated from EDTA blood using the Genomic DNA Mini Kit. CYP2C19*17 genotyping was conducted using the real-time PCR ViiA7 (ABI, Foster City, CA, USA). In the group of 30 patients with medical cannabis-induced ADRs, 20 (67%) were female and 10 (33%) were male, with an age range of 30-69 years. On the other hand, the 30 patients without medical cannabis-induced ADRs (control group) consisted of 17 (57%) females and 13 (43%) males. The most common ADRs of medical cannabis treatment in the case group were dry mouth and dry throat (77%), tachycardia (70%), nausea (30%) and arrhythmia (10%). Accordingly, approximately 93% of the case group carried CYP2C19*1/*1 (normal metabolizers), while 7% of the patients in this group carried CYP2C19*1/*17 (ultra-rapid metabolizers). Meanwhile, we found 90% of CYP2C19*1/*1 and 10% of CYP2C19*1/*17 in the control group. In this study, we identified the frequency of the CYP2C19*17 allele in a Thai population, which will support pharmacogenetic biomarkers for screening and avoiding ADRs of medical cannabis treatment.
Keywords: CYP2C19, allele frequency, ultra rapid metabolizer, medical cannabis
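As a small illustration of how the reported genotype counts translate into allele frequencies, the sketch below (Python) assumes the 93%/7% split in the 30-patient case group corresponds to 28 *1/*1 and 2 *1/*17 carriers; that split is inferred from the percentages, not taken from the raw data:

```python
from collections import Counter

def allele_frequencies(genotypes):
    """Allele counting for a biallelic marker; each genotype is a
    ('*1', '*17')-style tuple, i.e. two alleles per subject."""
    counts = Counter(allele for genotype in genotypes for allele in genotype)
    total = sum(counts.values())
    return {allele: n / total for allele, n in counts.items()}

# case group as reported: ~93% *1/*1 and ~7% *1/*17 of 30 subjects (28 + 2)
case_group = [("*1", "*1")] * 28 + [("*1", "*17")] * 2
print(allele_frequencies(case_group))   # e.g. {'*1': 0.967, '*17': 0.033}
```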
Procedia PDF Downloads 109
1385 Context and Culture in EFL Learners' and Native Speakers' Discourses
Authors: Emad A. S. Abu-Ayyash
Abstract:
Cohesive devices, the linguistic tools that are usually employed to hold the different parts of a text together, have been the focus of a significant number of discourse analysis studies. These linguistic tools have grabbed the attention of researchers since the inception of the first and most comprehensive model of cohesion in 1976. However, it was noticed that some cohesive devices (e.g., endophoric reference, conjunctions, ellipsis, substitution, and lexical ties) – being thought of as more popular than others (e.g., exophoric reference) – were over-researched. The present paper explores the usage of two cohesive devices that have been almost entirely absent from discourse analysis studies. These cohesive devices are exophoric and homophoric references, the linguistic items that can be interpreted in terms of the physical and cultural contexts of discourse. The significance of the current paper, therefore, stems from the fact that it attempts to fill a gap in the research conducted so far on cohesive devices. This study provides an explanation of the concepts of the cohesive devices that have been employed in a plethora of research on cohesion and elucidates the relevant context-related concepts. The paper also identifies the gap in cohesive devices research. Exophora and homophora, the least visited cohesive devices in previous studies, were qualitatively and quantitatively explored in six opinion articles, four produced by eight postgraduate English as a Foreign Language (EFL) students in a university in the United Arab Emirates and two by professional NS writers in the Independent and the Guardian. The six pieces were about the United Kingdom Independence Party (UKIP) leader’s call to ban the burqa in the UK and were analysed vis-à-vis the employment and function of homophora and exophora. The study found that both EFL students and native speakers employed exophora and homophora considerably in their writing to serve a variety of functions, including building assumptions, supporting main ideas, and involving the readers, among others.
Keywords: cohesive devices, context, culture, exophoric reference, homophoric reference
Procedia PDF Downloads 123
1384 Optimization of Biomass Production and Lipid Formation from Chlorococcum sp. Cultivation on Dairy and Paper-Pulp Wastewater
Authors: Emmanuel C. Ngerem
Abstract:
The ever-increasing depletion of the dominant global form of energy (fossil fuels) calls for the development of sustainable and green alternative energy sources such as bioethanol, biohydrogen, and biodiesel. The production of the major biofuels relies on biomass feedstocks that are mainly derived from edible food crops and some inedible plants. One suitable feedstock with great potential as a raw material for biofuel production is microalgal biomass. Despite the tremendous attributes of microalgae as a source of biofuel, their cultivation requires huge volumes of freshwater, thus posing a serious threat to commercial-scale production and utilization of algal biomass. In this study, a multi-media wastewater mixture for microalgae growth was formulated and optimized. Moreover, the obtained microalgae biomass was pre-treated for reducing sugar recovery and was compared with previous studies on microalgae biomass pre-treatment. The formulated and optimized mixed wastewater media for biomass and lipid accumulation were established using a simplex lattice mixture design. Based on the superposition approach of the potential results, numerical optimization was conducted, followed by the analysis of biomass concentration and lipid accumulation. Coefficients of regression (R²) of 0.91 and 0.98 were obtained for the biomass concentration and lipid accumulation models, respectively. The developed optimization model predicted optimal biomass concentration and lipid accumulation of 1.17 g/L and 0.39 g/g, respectively. It suggested a 64.69% dairy wastewater (DWW) and 35.31% paper and pulp wastewater (PWW) mixture for biomass concentration, and 34.21% DWW and 65.79% PWW for lipid accumulation. Experimental validation generated 0.94 g/L and 0.39 g/g of biomass concentration and lipid accumulation, respectively. The obtained microalgae biomass was pre-treated, enzymatically hydrolysed, and subsequently assessed for reducing sugars. The optimization of microwave pre-treatment of Chlorococcum sp. was achieved using response surface methodology (RSM). Microwave power (100 – 700 W), pre-treatment time (1 – 7 min), and acid-liquid ratio (1 – 5%) were selected as independent variables for RSM optimization. The optimum conditions were achieved at a microwave power, pre-treatment time, and acid-liquid ratio of 700 W, 7 min, and 32.33:1, respectively. These conditions provided the highest amount of reducing sugars at 10.73 g/L. Process optimization predicted reducing sugar yields of 11.14 g/L for microwave-assisted pre-treatment with 2.52% HCl for 4.06 min at 700 watts. Experimental validation yielded reducing sugars of 15.67 g/L. These findings demonstrate that dairy wastewater and paper and pulp wastewater, which could otherwise pose a serious environmental nuisance, can be blended to form a suitable microalgae growth medium, consolidating the potency of microalgae as a viable feedstock for fermentable sugars. Also, the outcome of this study supports the microalgal wastewater biorefinery concept, where wastewater remediation is coupled with bioenergy production.
Keywords: wastewater cultivation, mixture design, lipid, biomass, nutrient removal, microwave, Chlorococcum, raceway pond, fermentable sugar, modelling, optimization
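To illustrate the kind of model a two-component simplex lattice mixture design produces, the sketch below fits the Scheffé canonical polynomial y = b1·x1 + b2·x2 + b12·x1·x2 to hypothetical biomass responses; the data points and fitted coefficients are illustrative only, not the study's model:

```python
import numpy as np

# x1 = DWW fraction, x2 = PWW fraction = 1 - x1; made-up biomass responses in g/L
x1 = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
x2 = 1.0 - x1
biomass = np.array([0.62, 0.95, 1.10, 1.15, 0.80])

# Scheffe canonical polynomial for a binary mixture (no intercept term)
X = np.column_stack([x1, x2, x1 * x2])
(b1, b2, b12), *_ = np.linalg.lstsq(X, biomass, rcond=None)
print("fitted model: y = %.3f*x1 + %.3f*x2 + %.3f*x1*x2" % (b1, b2, b12))

# locate the predicted optimum DWW fraction on a fine grid
grid = np.linspace(0.0, 1.0, 1001)
pred = b1 * grid + b2 * (1.0 - grid) + b12 * grid * (1.0 - grid)
print("optimum DWW fraction ~ %.2f, predicted biomass ~ %.2f g/L"
      % (grid[np.argmax(pred)], pred.max()))
```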
Procedia PDF Downloads 40
1383 Eco-Literacy and Pedagogical Praxis in the Multidisciplinary University Greenhouse toward the Food Security Strengthening
Authors: Citlali Aguilera Lira, David Lynch Steinicke, Andrea León García
Abstract:
One of the challenges that higher education faces is finding how to approach sustainability in a way that is inclusive of students across all the different academic areas, and how to move sustainable development from the abstract field to the operational field. This research draws on ecoliteracy and pedagogical praxis as tools for rebuilding the teaching processes inside universities. The purpose is to determine and describe the factors involved in the process of learning, particularly in the Greenhouse-School Siembra UV. In the Greenhouse-School Siembra UV of the University of Veracruz, vegetables, medicinal plants and small cornfields are cultivated using eco-technologies such as hydroponics, Wickingbed and Hugelkultur, whose main purpose is the saving of space, labor and natural resources, as well as functioning as agricultural production alternatives in urban and periurban zones. The sample was formed with students from different academic areas who are actively involved in the greenhouse, as well as institutes from the University of Veracruz and governmental and non-governmental departments. This project comes from a pedagogical praxis approach, from filling the needs that the different professional profiles of the university students have, all with the purpose of generating a pragmatic dialogue with sustainability. It also comes from the necessity to understand the factors that intervene in the students’ praxis. In this manner, the students are the fundamental unit in the sphere of sustainability. As a result, it is observed that those University of Veracruz students who are involved in the Greenhouse-School Siembra UV have enriched, at different levels, their sense of urban and periurban agriculture because of the diverse academic approaches they have and the interaction between them. It is concluded that the eco-technologies act as fundamental tools for ecoliteracy in society, where nutritional and food security are strengthened from a sustainable development approach.
Keywords: farming eco-technologies, food security, multidisciplinary, pedagogical praxis
Procedia PDF Downloads 317
1382 Passive Vibration Isolation Analysis and Optimization for Mechanical Systems
Authors: Ozan Yavuz Baytemir, Ender Cigeroglu, Gokhan Osman Ozgen
Abstract:
Vibration is an important issue in the design of various components of aerospace, marine and vehicular applications. In order not to lose the components’ function and operational performance, vibration isolation design, involving the selection of optimum isolator properties and the isolator positioning process, appears to be a critical study. Recognizing the growing need for vibration isolation system design, this paper presents two types of software capable of implementing modal analysis, response analysis for both random and harmonic types of excitation, static deflection analysis, and Monte Carlo simulations, in addition to parameter and location optimization for different types of isolation problem scenarios. A review of the literature shows no study developing a software-based tool capable of implementing all those analysis, simulation and optimization studies in one platform simultaneously. In this paper, the theoretical system model is generated for a 6-DOF rigid body. The vibration isolation system of any mechanical structure can be optimized using a hybrid method involving both global search and gradient-based methods. Defining the optimization design variables, different types of optimization scenarios are listed in detail. Being aware of the need for a user-friendly vibration isolation problem solver, two types of graphical user interfaces (GUIs) are prepared and verified using a commercial finite element analysis program, Ansys Workbench 14.0. Using the analysis and optimization capabilities of those GUIs, a real application used in an air platform is also presented as a case study at the end of the paper.
Keywords: hybrid optimization, Monte Carlo simulation, multi-degree-of-freedom system, parameter optimization, location optimization, passive vibration isolation analysis
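A minimal sketch of the hybrid global-plus-gradient optimization idea is given below, applied to a 1-DOF isolator as a stand-in for the 6-DOF rigid-body model; the mass, bounds, frequencies and objective weighting are assumptions for illustration, not values from the paper:

```python
import numpy as np
from scipy.optimize import differential_evolution, minimize

def transmissibility(f_hz, k, c, m=10.0):
    """Absolute transmissibility of a 1-DOF isolator with mass m [kg],
    stiffness k [N/m] and viscous damping c [N*s/m]."""
    wn = np.sqrt(k / m)
    zeta = c / (2.0 * np.sqrt(k * m))
    r = 2.0 * np.pi * f_hz / wn
    return np.sqrt((1 + (2 * zeta * r) ** 2) / ((1 - r ** 2) ** 2 + (2 * zeta * r) ** 2))

def objective(x, f_op=50.0):
    """Trade-off between isolation at an operating frequency and the resonance peak."""
    k, c = x
    peak = transmissibility(np.linspace(0.1, 200.0, 2000), k, c).max()
    return transmissibility(f_op, k, c) + 0.1 * peak

bounds = [(1e3, 1e6), (1.0, 5e3)]                      # assumed ranges for k and c
coarse = differential_evolution(objective, bounds, seed=1)              # global search
refined = minimize(objective, coarse.x, bounds=bounds, method="L-BFGS-B")  # gradient refinement
print("optimal (k, c):", refined.x, "objective:", refined.fun)
```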
Procedia PDF Downloads 565
1381 Control of a Quadcopter Using Genetic Algorithm Methods
Authors: Mostafa Mjahed
Abstract:
This paper concerns the control of a nonlinear system using two different methods, reference model and genetic algorithm. The quadcopter is a nonlinear unstable system, which is a part of aerial robots. It is constituted by four rotors placed at the ends of a cross. The center of this cross is occupied by the control circuit. Its motions are governed by six degrees of freedom: three rotations around 3 axes (roll, pitch and yaw) and the three spatial translations. The control of such a system is complex because of the nonlinearity of its dynamic representation and the number of parameters it involves. Numerous studies have been developed to model and stabilize such systems. The classical PID and LQ correction methods are widely used. While the latter have the advantage of being simple because they are linear, they have the drawback of requiring a linear model for synthesis. This also implies complex control laws, because these must be extended over the whole flight domain of the quadcopter. Note that, while classical design methods are widely used to control aeronautical systems, artificial intelligence methods such as the genetic algorithm technique receive little attention. In this paper, we suggest comparing two PID design methods. Firstly, the parameters of the PID are calculated according to the reference model. In a second phase, these parameters are established using genetic algorithms. By reference model, we mean that the corrected system behaves according to a reference system imposed by some specifications: settling time, zero overshoot, etc. Inspired by Darwin's theory of natural evolution advocating survival of the fittest, John Holland developed this evolutionary algorithm. The genetic algorithm (GA) possesses three basic operators: selection, crossover and mutation. We start iterations with an initial population. Each member of this population is evaluated through a fitness function. Our purpose is to correct the behavior of the quadcopter around three axes (roll, pitch and yaw) with 3 PD controllers. For the altitude, we adopt a PID controller.
Keywords: quadcopter, genetic algorithm, PID, fitness, model, control, nonlinear system
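A compact sketch of GA-based gain tuning in the spirit described above: a toy double-integrator axis model under PD control with an ITAE-type fitness. The plant, gain bounds and GA settings are assumptions for illustration, not the authors' quadcopter model or their tuning setup:

```python
import numpy as np

def itae_cost(gains, dt=0.005, t_end=2.0):
    """Integrate a toy axis model y'' = u under PD control for a unit step
    reference and return an ITAE-type cost (hypothetical plant)."""
    kp, kd = gains
    y, yd, cost, t = 0.0, 0.0, 0.0, 0.0
    while t < t_end:
        e = 1.0 - y                       # step reference error
        u = kp * e - kd * yd              # PD control law
        yd += u * dt                      # double-integrator dynamics
        y += yd * dt
        cost += t * abs(e) * dt           # ITAE criterion
        t += dt
    return cost

def genetic_algorithm(fitness, bounds, pop=30, gens=40, pc=0.8, pm=0.1, seed=0):
    """Minimal GA with tournament selection, arithmetic crossover and Gaussian mutation."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    x = rng.uniform(lo, hi, size=(pop, len(bounds)))
    for _ in range(gens):
        f = np.array([fitness(ind) for ind in x])
        idx = np.array([min(rng.choice(pop, 2), key=lambda i: f[i]) for _ in range(pop)])
        parents = x[idx]                              # tournament selection
        mates = parents[rng.permutation(pop)]
        alpha = rng.random((pop, 1))
        children = np.where(rng.random((pop, 1)) < pc,
                            alpha * parents + (1 - alpha) * mates, parents)
        mutate = rng.random(children.shape) < pm      # Gaussian mutation
        children = np.clip(children + mutate * rng.normal(0, 0.1 * (hi - lo), children.shape),
                           lo, hi)
        x = children
    f = np.array([fitness(ind) for ind in x])
    return x[np.argmin(f)]

best = genetic_algorithm(itae_cost, bounds=[(0.0, 50.0), (0.0, 20.0)])
print("GA-tuned PD gains (kp, kd):", best)
```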
Procedia PDF Downloads 431
1380 Influence of Distribution of Body Fat on Cholesterol Non-HDL and Its Effect on Kidney Filtration
Authors: Magdalena B. Kaziuk, Waldemar Kosiba
Abstract:
Background: In the XXI century, we have to deal with the epidemic of obesity, which is an important risk factor for cardiovascular and kidney diseases. Lipoproteins are directly involved in the atherosclerotic process. Non-high-density lipoprotein (non-HDL) cholesterol came into use following widespread recognition of its superiority over LDL as a measurement of vascular event risk. Non-HDL includes the residual risk which persists in patients after the recommended LDL level has been achieved. Materials and Methods: The study covered 111 patients (52 females, 59 males, age 51.91±14 years), hospitalized in an internal medicine department. Body composition was assessed using the bioimpedance method and anthropometric measurements. Physical activity data were collected during the interview. The nutritional status and the obesity type were determined with the Waist to Height Ratio and the Waist to Hip Ratio. Kidney function was evaluated by calculating the estimated glomerular filtration rate (eGFR) using the MDRD formula. Non-HDL was calculated as the difference between total and HDL cholesterol concentrations. Results: 10% of patients were found to be underweight, 23.9% had correct body weight, 15.08% were overweight, while the remaining 51.02% were obese. People with the android shape have higher non-HDL cholesterol than those with the gynoid shape (p=0.003). The higher the non-HDL, the lower the eGFR of the studied subjects (p < 0.001). A significant correlation was found between high non-HDL and incorrect dietary habits in patients avoiding eating vegetables and fruits and having low physical activity (p < 0.005). Conclusions: The android type of figure raises the residual risk of heart disease associated with higher levels of non-HDL. Increasing physical activity in these patients reduces the level of non-HDL. Non-HDL seems to be the best predictor among all cholesterol measures for cardiovascular events and worsening eGFR.
Keywords: obesity, non-HDL cholesterol, glomerular filtration rate, lifestyle
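The two derived quantities used in the abstract are simple to reproduce; the sketch below (Python) shows non-HDL as total minus HDL cholesterol and the 4-variable MDRD eGFR. The abstract names the MDRD formula but not which version, so the common IDMS-traceable form is assumed, and the input values are hypothetical:

```python
def non_hdl(total_chol, hdl):
    """Non-HDL cholesterol = total cholesterol - HDL cholesterol (same units)."""
    return total_chol - hdl

def egfr_mdrd(scr_mg_dl, age, female, black=False):
    """4-variable MDRD study equation (mL/min/1.73 m^2), IDMS-traceable form assumed."""
    egfr = 175.0 * scr_mg_dl ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

print(non_hdl(total_chol=220, hdl=45))                       # 175 mg/dL
print(round(egfr_mdrd(scr_mg_dl=1.1, age=52, female=True), 1))
```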
Procedia PDF Downloads 373
1379 Creep Behaviour of Asphalt Modified by Waste Polystyrene and Its Hybrids with Crumb Rubber and Low-Density Polyethylene
Authors: Soheil Heydari, Ailar Hajimohammadi, Nasser Khalili
Abstract:
Polystyrene, made from a monomer called styrene, is a rigid and easy-to-mould polymer that is widely used for many applications, from foam packaging to disposable containers. Considering that the degradation of waste polystyrene takes up to 500 years, there is an urgent need for a sustainable application for waste polystyrene. This study evaluates the application of waste polystyrene as an asphalt modifier. The inclusion of waste plastics in asphalt is practised by either the dry process or the wet process. In the dry process, plastics are added straight into the asphalt mixture, and in the wet process, they are mixed and digested into bitumen. In this article, polystyrene was used as an asphalt modifier in a dry process. However, the mixing process was precisely designed to make sure that the polymer was melted and modified in the binder. It was expected that, due to the rigidity of polystyrene, it would have positive effects on the permanent deformation of the asphalt mixture. Therefore, different mixtures were manufactured with different contents of polystyrene, Marshall specimens were manufactured, and dynamic creep tests were conducted to evaluate the permanent deformation of the modification. This is a common repeated-loading test conducted at different stress levels and temperatures. Loading cycles are applied to the AC specimen until failure occurs; with the amount of deformation constantly recorded, the cumulative permanent strain is determined and reported as a function of the number of cycles. Also, to the best of our knowledge, hybrid mixes of polystyrene with crumb rubber and low-density polyethylene were made and compared with the polystyrene-modified mixture. The test results of this study showed that the hybrid mix of polystyrene and low-density polyethylene has the highest resistance against permanent deformation. However, the polystyrene-modified mixture outperformed the hybrid mix of polystyrene and crumb rubber, and both demonstrated far lower permanent deformation than the unmodified specimen.
Keywords: permanent deformation, waste plastics, polystyrene, hybrid plastics, hybrid mix, hybrid modification, dry process
Procedia PDF Downloads 105
1378 Evaluation of Possible Application of Cold Energy in Liquefied Natural Gas Complexes
Authors: А. I. Dovgyalo, S. O. Nekrasova, D. V. Sarmin, A. A. Shimanov, D. A. Uglanov
Abstract:
Usually, liquefied natural gas (LNG) gasification is performed using atmospheric heat. In order to produce liquefied gas, a sufficient amount of energy has to be consumed (about 1 kW∙h for 1 kg of LNG). This study offers a number of solutions allowing the cold energy of LNG to be used. This paper evaluates the application of turbines installed behind the evaporator in an LNG complex; due to their work, additional energy can be obtained and then converted into electricity. At an LNG consumption of G=1000 kg/h, an expansion work capacity of about 10 kW can be reached. Herewith, an open Rankine cycle is realized, where a low-capacity cryo-pump (about 500 W) performs its normal function, providing the cycle pressure. Additionally, the application of a Stirling engine within the LNG complex is discussed, which also gives a possibility to realize cold energy. Considering the fact that the efficiency coefficient of a Stirling engine reaches 50%, an LNG consumption of G=1000 kg/h may result in a capacity of about 142 kW from such a thermal machine. The capacity of the pump required to compensate for pressure losses when LNG passes through the hydraulic channel will be about 500 W. Apart from the above-mentioned converters, it can be proposed to use thermoelectric generating packages (TGP), which are widely used now. At present, the modern thermoelectric generator line provides electric capacity with a coefficient of efficiency of up to 15%. In the proposed complex, it is suggested to install the thermoelectric generator on the evaporator surface in such a way that the cold end is in contact with the evaporator’s surface and the hot one with the atmosphere. At an LNG consumption of G=1000 kg/h and the specified coefficient of efficiency, the capacity of the heat flow Qh will be about 32 kW. The derivable net electric power will be P=4.2 kW, and the number of packages will amount to about 104 pieces. The calculations carried out demonstrate the promise of research in this field of propulsion plant development, as well as allowing the energy-saving potential to be realized with the use of liquefied natural gas and other cryogenic technologies.
Keywords: cold energy, gasification, liquefied natural gas, electricity
Procedia PDF Downloads 273
1377 Regeneration Study on the Athens City Center: Transformation of the Historical Triangle to “Low Pollution and Restricted Vehicle Traffic Zone”
Authors: Chondrogianni Dimitra, Yorgos J. Stephanedes
Abstract:
The impact of the economic crisis, coupled with the aging of the city's old core, is reflected in central Athens. Public and private users, residents, employees, and visitors desire the quality upgrading of abandoned buildings and public spaces through environmental upgrading and sustainable mobility, and promotion of the international metropolitan character of the city. In the study, a strategy for reshaping the character and function of the historic Athenian triangle is proposed, aiming at its economic, environmental, and social sustainable development through feasible, meaningful, and non-landscaping solutions of low cost and high positive impact. Sustainable mobility is the main principle in re-planning the study area, and transforming it into a “Low Pollution and Limited Vehicle Traffic Zone” is the main strategy. The proposed measures include the development of pedestrian mobility networks by expanding the pedestrian roads and limited-traffic routes, of bicycle networks based on the approved Metropolitan Bicycle Route of Athens, of public transportation networks with new lines of electric mini-buses, and of new regulations for vehicle mobility in the historic triangle. In addition, complementary actions are proposed regarding the provision of Wi-Fi on fixed-track media, development of applications that facilitate combined travel and provide real-time data, integration of micromobility (roller skates, Segway, Hoverboard) and its enhancement as a flexible means of personal mobility, and development of car-sharing, ride-sharing and dynamic carpooling initiatives.
Keywords: regeneration plans, sustainable mobility, environmental upgrading, athens historical triangle
Procedia PDF Downloads 167
1376 Spironolactone in Psoriatic Arthritis: Safety, Efficacy and Effect on Disease Activity
Authors: Ashit Syngle, Inderjit Verma, Pawan Krishan
Abstract:
Therapeutic approaches used previously relied on disease-modifying antirheumatic drugs (DMARDs) that had only partial clinical benefit and were associated with significant toxicity. Spironolactone, an oral aldosterone antagonist, suppresses inflammatory mediators. We evaluated the clinical efficacy of spironolactone compared with placebo in patients with active psoriatic arthritis despite treatment with prior traditional DMARDs. In the 24-week, placebo-controlled study, patients (n=31) were randomized to placebo or spironolactone (2 mg/kg/day). Patients on background concurrent DMARDs continued stable doses (methotrexate, leflunomide, and/or sulfasalazine). Primary outcome measures were the disease activity measures, i.e. the 28-joint disease activity score (DAS28) and disease activity in psoriatic arthritis (DAPSA), at week 24. The key secondary endpoint was change from baseline in the Health Assessment Questionnaire–Disability Index (HAQ-DI) at week 24. Additional efficacy outcome measures at week 24 included improvements in the markers of inflammation (ESR and CRP) and the pro-inflammatory cytokines TNF-α, IL-6 and IL-1. At week 24, spironolactone significantly reduced the disease activity measures DAS28 (p<0.001) and DAPSA (p=0.001) compared with placebo. Significant improvements in the key secondary measure HAQ-DI (disability index) were evident with spironolactone (p=0.02) versus placebo. After week 24, there was a significant reduction in the pro-inflammatory cytokine levels of TNF-α and IL-6 (p<0.01) compared with the placebo group. However, there was no significant improvement in IL-1 in either the treatment or the placebo group. There were minor side effects which did not mandate stopping spironolactone. No change in any biochemical profile was noted after spironolactone treatment. Spironolactone was effective in the treatment of PsA, improving disease activity and physical function and suppressing the levels of pro-inflammatory cytokines. Spironolactone demonstrated an acceptable safety profile and was well tolerated.
Keywords: spironolactone, inflammation, inflammatory cytokine, psoriatic arthritis
Procedia PDF Downloads 337
1375 Efficient Compact Micro Dielectric Barrier Discharge (DBD) Plasma Reactor for Ozone Generation for Industrial Application in Liquid and Gas Phase Systems
Authors: D. Kuvshinov, A. Siswanto, J. Lozano-Parada, W. Zimmerman
Abstract:
Ozone is well known as a powerful oxidant with a fast reaction rate. Ozone-based processes produce no by-products, as non-reacted ozone returns back to the original oxygen molecule. Therefore, the application of ozone is widely accepted as one of the main directions for sustainable and clean technology development. A number of technologies require ozone to be delivered to specific points of a production network or reactor construction. Due to space constraints, high reactivity and the short lifetime of ozone, the use of ozone generators, even of a bench-top scale, is practically limited. This requires the development of a mini/micro-scale ozone generator which can be directly incorporated into production units. Our report presents a feasibility study of a new micro-scale reactor for ozone generation (MROG). Data on MROG calibration and indigo decomposition at different operation conditions are presented. At selected operation conditions, with a residence time of 0.25 s, the process of ozone generation is not limited by reaction rate, and the amount of ozone produced is a function of the power applied. It was shown that the MROG is capable of producing ozone at voltage levels starting from 3.5 kV, with an ozone concentration of 5.28E-6 mol/L at 5 kV. This is in line with data presented in a numerical investigation of the MROG. It was shown that, in comparison to a conventional ozone generator, the MROG has lower power consumption at low voltages and atmospheric pressure. The MROG construction makes it applicable for submerged and dry systems. With a robust compact design, the MROG can be used as an incorporated unit for production lines of high complexity.
Keywords: dielectric barrier discharge (DBD), micro reactor, ozone, plasma
Procedia PDF Downloads 338
1374 Devulcanization of Waste Rubber Using Thermomechanical Method Combined with Supercritical CO₂
Authors: L. Asaro, M. Gratton, S. Seghar, N. Poirot, N. Ait Hocine
Abstract:
Rubber waste disposal is an environmental problem. In particular, much research is centered on the management of discarded tires. In spite of all the different ways of handling used tires, the most common is to deposit them in a landfill, creating a stock of tires. These stocks can cause fire danger and provide a habitat for rodents, mosquitoes and other pests, causing health hazards and environmental problems. Because of the three-dimensional structure of rubbers and their specific composition that includes several additives, their recycling is a current technological challenge. The technique which can break down the crosslink bonds in the rubber is called devulcanization. Strictly, devulcanization can be defined as a process where poly-, di-, and mono-sulfidic bonds, formed during vulcanization, are totally or partially broken. In recent years, supercritical carbon dioxide (scCO₂) has been proposed as a green devulcanization atmosphere. This is because it is chemically inactive, nontoxic, nonflammable and inexpensive. Its critical point can be easily reached (31.1 °C and 7.38 MPa), and residual scCO₂ in the devulcanized rubber can be easily and rapidly removed by releasing the pressure. In this study, thermomechanical devulcanization of ground tire rubber (GTR) was performed in a twin screw extruder under diverse operating conditions. Supercritical CO₂ was added in different quantities to promote the devulcanization. Temperature, screw speed and quantity of CO₂ were the parameters that were varied during the process. The devulcanized rubber was characterized by its devulcanization percent and crosslink density by swelling in toluene. Infrared spectroscopy (FTIR) and gel permeation chromatography (GPC) were also performed, and the results were related to the Mooney viscosity. The results showed that the crosslink density decreases as the extruder temperature and speed increase, and, as expected, the soluble fraction increases with both parameters. The Mooney viscosity of the devulcanized rubber decreases as the extruder temperature increases. The reached values were in good correlation (R=0.96) with the soluble fraction. In order to analyze whether the devulcanization was caused by main-chain or crosslink scission, Horikx's theory was used. The results showed that all tests fall on the curve that corresponds to sulfur bond scission, which indicates that the devulcanization happened successfully without degradation of the rubber. In the spectra obtained by FTIR, it was observed that none of the characteristic peaks of the GTR were modified by the different devulcanization conditions. This was expected because, due to the low sulfur content (~1.4 phr) and the multiphasic composition of the GTR, it is very difficult to evaluate the devulcanization by this technique. The lowest crosslink density was reached with 1 cm³/min of CO₂, and the power consumed in that process was also near the minimum. These results encourage us to carry out further analyses to better understand the effect of the different conditions on the devulcanization process. The analysis is currently being extended to monophasic rubbers such as ethylene propylene diene monomer rubber (EPDM) and natural rubber (NR).
Keywords: devulcanization, recycling, rubber, waste
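Crosslink density from swelling in toluene is usually obtained with the Flory-Rehner equation; a minimal sketch is given below, assuming textbook-style values of the polymer-solvent interaction parameter and the molar volume of toluene rather than the ones actually used by the authors, and hypothetical rubber volume fractions:

```python
import numpy as np

def crosslink_density(v_r, chi=0.39, v_s=106.3):
    """Flory-Rehner estimate of crosslink density (mol/cm^3) from the rubber
    volume fraction v_r in the swollen gel.  chi (polymer-solvent interaction
    parameter) and v_s (molar volume of toluene, cm^3/mol) are assumed values."""
    return -(np.log(1.0 - v_r) + v_r + chi * v_r ** 2) / (
        v_s * (v_r ** (1.0 / 3.0) - v_r / 2.0))

# example: a devulcanized sample swelling to v_r = 0.25 vs. the raw GTR at 0.35
for v_r in (0.35, 0.25):
    print(f"v_r = {v_r:.2f} -> nu = {crosslink_density(v_r):.2e} mol/cm^3")
```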
Procedia PDF Downloads 385
1373 Analytical Slope Stability Analysis Based on the Statistical Characterization of Soil Shear Strength
Authors: Bernardo C. P. Albuquerque, Darym J. F. Campos
Abstract:
Increasing our ability to solve complex engineering problems is directly related to the processing capacity of computers. By means of such equipment, one is able to run numerical algorithms quickly and accurately. Besides the increasing interest in numerical simulations, probabilistic approaches are also of great importance. In this way, statistical tools have shown their relevance to the modelling of practical engineering problems. In general, statistical approaches to such problems consider that the random variables involved follow a normal distribution. This assumption tends to provide incorrect results when skew data are present, since normal distributions are symmetric about their means. Thus, in order to visualize and quantify this aspect, 9 statistical distributions (symmetric and skew) have been considered to model a hypothetical slope stability problem. The data modeled are the friction angle of a superficial soil in Brasilia, Brazil. Despite its apparent universality, the normal distribution did not qualify as the best fit. In the present effort, data obtained in consolidated-drained triaxial tests and saturated direct shear tests have been modeled and used to analytically derive the probability density function (PDF) of the safety factor of a hypothetical slope based on the Mohr-Coulomb rupture criterion. Therefore, based on this analysis, it is possible to explicitly derive the failure probability considering the friction angle as a random variable. Furthermore, it is possible to compare the stability analysis when the friction angle is modelled as a Dagum distribution (the distribution that presented the best fit to the histogram) and as a normal distribution. This comparison leads to relevant differences when analyzed in light of risk management.
Keywords: statistical slope stability analysis, skew distributions, probability of failure, functions of random variables
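As a numerical counterpart to the analytical derivation described above, the short Monte Carlo sketch below estimates P(FS < 1) for a cohesionless infinite-slope idealization with the friction angle as the random variable; the slope angle and the normal model for the friction angle are illustrative assumptions, not the Brasilia data set or the Dagum fit used in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
beta_deg = 30.0                                      # assumed slope inclination
phi_deg = rng.normal(loc=34.0, scale=3.0, size=n)    # illustrative normal model of phi

# infinite-slope, cohesionless factor of safety: FS = tan(phi) / tan(beta)
fs = np.tan(np.radians(phi_deg)) / np.tan(np.radians(beta_deg))
pf = np.mean(fs < 1.0)                               # Monte Carlo failure probability
print(f"mean FS = {fs.mean():.2f}, P(FS < 1) = {pf:.4f}")
```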
Procedia PDF Downloads 338
1372 Anaphora and Cataphora on the Selected State of the City Addresses of the Mayor of Dapitan
Authors: Mark Herman Sumagang Potoy
Abstract:
A State of the City Address (SOCA) is a speech, modelled after the State of the Nation Address, given not as mandated by law but usually as a matter of practice or tradition, delivered before the chief executive’s constituents. Through this, the general public is made to know the performance of the local government unit and its agenda for the coming year. Therefore, it is imperative for SOCAs to clearly convey their message and carry out the myriad functions of enlightening their readers, which could be achieved through the proper use of reference. Anaphora and cataphora are the two major types of reference; the former refers back to something that has already been mentioned, while the latter points forward to something which is yet to be said. This paper seeks to identify the types of reference employed in the SOCAs from 2014 to 2016 of Hon. Rosalina Garcia Jalosjos, Mayor of Dapitan City, and look into how the references contribute to the clarity of the message of the text. The qualitative method of research is used in this study through an in-depth analysis of the corpus. As soon as the copies of the SOCAs were secured from the Office of the City Mayor, they were analyzed using a documentary technique, categorizing the types of reference as to anaphora and cataphora, counting each of these types and describing the implications of the dominant types used in the addresses. After a thorough analysis, it is found that the two reference types, namely anaphora and cataphora, are both employed in the three SOCAs, the former being used more frequently than the latter, accounting for 80% and 20% of actual usage, respectively. Moreover, the use of anaphora and cataphora in the three addresses helps in conveying the message clearly because they primarily serve as aids to avoid the repetition of the same element in the text, especially when there was no need to emphasize a point. Finally, it is recommended that writers of State of the City Addresses should have a vast knowledge of how reference should be used and the functions references take in a text, since this is a vital tool to clearly transmit a message. Moreover, English teachers should explicitly teach the proper usage of anaphora and cataphora, as instruments to develop cohesion in written discourse, to enable students to write not only with sense but also with fluidity in tying utterances together.
Keywords: anaphora, cataphora, reference, State of the City Address
Procedia PDF Downloads 192
1371 Development of Fluorescence Resonance Energy Transfer-Based Nanosensor for Measurement of Sialic Acid in vivo
Authors: Ruphi Naz, Altaf Ahmad, Mohammad Anis
Abstract:
Sialic acid (5-acetylneuraminic acid, Neu5Ac) is a common sugar found as a terminal residue on glycoconjugates in many animals. The human brain and central nervous system contain the highest concentration of sialic acid (as N-acetylneuraminic acid), where these acids play an important role in neural transmission and ganglioside structure in synaptogenesis. Due to its important biological function, sialic acid is attracting increasing attention. To understand metabolic networks, fluxes and regulation, it is essential to be able to determine the cellular and subcellular levels of metabolites. Genetically encoded fluorescence resonance energy transfer (FRET) sensors represent a promising technology for measuring metabolite levels and corresponding rate changes in live cells. Taking this into account, we developed a genetically encoded FRET-based nanosensor to analyse the sialic acid level in living cells. The sialic acid periplasmic binding protein (SiaP) from Haemophilus influenzae was taken and ligated between the FRET pair, the cyan fluorescent protein (eCFP) and Venus. The chimeric sensor protein was expressed in E. coli BL21 (DE3) and purified by affinity chromatography. Conformational changes in the binding protein were clearly confirmed by changes in FRET efficiency, so any change in the concentration of sialic acid is associated with a change in the FRET ratio. This sensor is very specific to sialic acid and was found to be stable over a range of pH values. The nanosensor successfully reported the intracellular level of sialic acid in bacterial cells. The data suggest that the nanosensor may be a versatile tool for studying the in vivo dynamics of sialic acid levels non-invasively in living cells.
Keywords: nanosensor, FRET, Haemophilus influenzae, metabolic networks
Procedia PDF Downloads 132
1370 Developing a DNN Model for the Production of Biogas From a Hybrid BO-TPE System in an Anaerobic Wastewater Treatment Plant
Authors: Hadjer Sadoune, Liza Lamini, Scherazade Krim, Amel Djouadi, Rachida Rihani
Abstract:
Deep neural networks are highly regarded for their accuracy in predicting intricate fermentation processes. Their ability to learn from large amounts of data through artificial intelligence makes them particularly effective models. The primary obstacle in improving the performance of these models is carefully choosing suitable hyperparameters, including the neural network architecture (number of hidden layers and hidden units), activation function, optimizer, learning rate, and other relevant factors. This study predicts biogas production from real wastewater treatment plant data using a sophisticated approach: hybrid Bayesian optimization with a tree-structured Parzen estimator (BO-TPE) for an optimised deep neural network (DNN) model. The plant utilizes an upflow anaerobic sludge blanket (UASB) digester that treats industrial wastewater from soft drinks and breweries. The digester has a working volume of 1574 m³ and a total volume of 1914 m³. Its internal diameter and height were 19 and 7.14 m, respectively. The data preprocessing was conducted with meticulous attention to preserving data quality while avoiding data reduction. Three normalization techniques were applied to the pre-processed data (MinMaxScaler, RobustScaler and StandardScaler) and compared with the non-normalized data. The RobustScaler approach showed strong predictive ability for estimating the volume of biogas produced. The highest predicted biogas volume was 2236.105 Nm³/d, with coefficient of determination (R²), mean absolute error (MAE), and root mean square error (RMSE) values of 0.712, 164.610, and 223.429, respectively.
Keywords: anaerobic digestion, biogas production, deep neural network, hybrid bo-tpe, hyperparameters tuning
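A minimal sketch of the TPE-driven hyperparameter search and the reported metrics, using Optuna (whose default sampler is a TPE) and scikit-learn with random stand-in data; the libraries, search ranges and network choices here are assumptions for illustration and not necessarily the authors' implementation:

```python
import numpy as np
import optuna
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import RobustScaler

# X: daily plant measurements, y: biogas volume (Nm3/d); random stand-ins here
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))
y = X @ rng.normal(size=8) + rng.normal(scale=0.5, size=500)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

scaler = RobustScaler().fit(X_tr)                 # the scaling reported as best in the study
X_tr_s, X_te_s = scaler.transform(X_tr), scaler.transform(X_te)

def objective(trial):
    # hyperparameters searched by Optuna's TPE sampler (its default sampler)
    layers = tuple(trial.suggest_int(f"units_l{i}", 8, 128)
                   for i in range(trial.suggest_int("n_layers", 1, 3)))
    model = MLPRegressor(hidden_layer_sizes=layers,
                         activation=trial.suggest_categorical("activation", ["relu", "tanh"]),
                         learning_rate_init=trial.suggest_float("lr", 1e-4, 1e-1, log=True),
                         max_iter=500, random_state=0)
    model.fit(X_tr_s, y_tr)
    return mean_squared_error(y_te, model.predict(X_te_s))

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=30)

best = MLPRegressor(
    hidden_layer_sizes=tuple(study.best_params[f"units_l{i}"]
                             for i in range(study.best_params["n_layers"])),
    activation=study.best_params["activation"],
    learning_rate_init=study.best_params["lr"], max_iter=500, random_state=0)
best.fit(X_tr_s, y_tr)
pred = best.predict(X_te_s)
print("R2 = %.3f, MAE = %.3f, RMSE = %.3f" % (
    r2_score(y_te, pred), mean_absolute_error(y_te, pred),
    np.sqrt(mean_squared_error(y_te, pred))))
```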
Procedia PDF Downloads 38
1369 Modified Weibull Approach for Bridge Deterioration Modelling
Authors: Niroshan K. Walgama Wellalage, Tieling Zhang, Richard Dwight
Abstract:
State-based Markov deterioration models (SMDM) sometimes fail to find accurate transition probability matrix (TPM) values and hence lead to invalid future condition predictions or incorrect average deterioration rates, mainly due to drawbacks of existing nonlinear optimization-based algorithms and/or the subjective function types used for regression analysis. Furthermore, a set of separate functions for each condition state with age cannot be directly derived by using a Markov model for a given bridge element group, which, however, is of interest to industrial partners. This paper presents a new approach for generating homogeneous SMDM model output, namely the Modified Weibull approach, which consists of a set of appropriate functions to describe the percentage condition prediction of bridge elements in each state. These functions are combined with a Bayesian approach and a Metropolis-Hastings algorithm (MHA) based Markov chain Monte Carlo (MCMC) simulation technique for quantifying the uncertainty in model parameter estimates. In this study, factors contributing to rail bridge deterioration were identified. The inspection data for 1,000 Australian railway bridges over 15 years were reviewed and filtered accordingly, based on real operational experience. A network-level deterioration model for a typical bridge element group was developed using the proposed Modified Weibull approach. The condition state predictions obtained from this method were validated using statistical hypothesis tests with a test data set. Results show that the proposed model is able not only to predict the conditions at network level accurately but also to capture the model uncertainties with a given confidence interval.
Keywords: bridge deterioration modelling, Modified Weibull approach, MCMC, Metropolis-Hastings algorithm, Bayesian approach, Markov deterioration models
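To make the Bayesian/MCMC step concrete, the sketch below runs a random-walk Metropolis-Hastings sampler over the two parameters of a Weibull-type state-survival function fitted to synthetic inspection outcomes; the survival form, flat priors and data are illustrative assumptions, not the authors' Modified Weibull functions or their bridge data:

```python
import numpy as np

def log_posterior(theta, ages, in_state):
    """Hypothetical log-posterior: P(element still in state 1 at age t) =
    exp(-(t/eta)**beta), Bernoulli likelihood, flat priors on eta, beta > 0."""
    eta, beta = theta
    if eta <= 0 or beta <= 0:
        return -np.inf
    p = np.clip(np.exp(-(ages / eta) ** beta), 1e-12, 1 - 1e-12)
    return np.sum(in_state * np.log(p) + (1 - in_state) * np.log(1 - p))

def metropolis_hastings(logp, x0, n_iter=20000, step=(1.0, 0.1), seed=1):
    """Random-walk Metropolis-Hastings with a Gaussian (symmetric) proposal."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, float)
    lp = logp(x)
    chain = np.empty((n_iter, x.size))
    for i in range(n_iter):
        prop = x + rng.normal(0.0, step, size=x.size)
        lp_prop = logp(prop)
        if np.log(rng.random()) < lp_prop - lp:     # accept/reject
            x, lp = prop, lp_prop
        chain[i] = x
    return chain

# synthetic inspections: element ages and whether each is still in state 1
rng = np.random.default_rng(2)
ages = rng.uniform(0, 60, 300)
in_state = (rng.random(300) < np.exp(-(ages / 35.0) ** 2.0)).astype(float)
chain = metropolis_hastings(lambda th: log_posterior(th, ages, in_state), x0=[30.0, 1.5])
print("posterior mean (eta, beta):", chain[5000:].mean(axis=0))   # after burn-in
```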
Procedia PDF Downloads 727
1368 Combustion Characteristic of Propane/Acetylene Fuel Blends Pool Fire
Authors: Yubo Bi, Xiao Chen, Shouxiang Lu
Abstract:
A kind of gas-fueled burner, named the Burning Rate Emulator, was recently proposed for the purpose of emulating condensed fuels. The gaseous fuel can be a pure combustible fuel gas or a blend of gaseous fuels or inert gas. However, this concept was proposed without a detailed study of the combustion characteristics of fuel blends. In this study, two kinds of common gaseous fuels were selected, propane and acetylene, to provide the combustion heat as well as a large amount of smoke, which widely exists in liquid and solid fuel burning processes. A set of experiments was carried out using a gas-fueled burner with a diameter of 8 cm. The total volume flow rate of propane and acetylene was kept at 3 liters per minute. The volume fraction of propane varied from 0% to 100% at intervals of 10%. It is found that the flame height increases with propane volume fraction, which may be caused by the increase of heat release rate, as the energy density of propane is larger than that of acetylene. The dimensionless flame height is correlated against the dimensionless heat release rate, which shows a power function relationship. The radiation fraction of the flame does not show a monotonic relationship with propane volume fraction. With the increase of propane volume fraction from 0% to 100%, the value of the radiation fraction increases first, reaches a maximum value of around 0.46 at a propane volume fraction of 10%, and then decreases continuously to a value of 0.25 at a propane volume fraction of 100%. The flame radiation is related to the soot in the flame. The trend of the radiation fraction reflects that there may be a synergistic effect on soot formation between propane and acetylene, which can be inferred from the significantly high radiation fraction at a propane volume fraction of 10%. This work provides data on the combustion of gaseous fuel blend pool fires and also gives a reference for the design of the Burning Rate Emulator.
Keywords: Burning Rate Emulator, fuel blends pool fire, flame height, radiation fraction
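For reference, the dimensionless heat release rate and the power-law flame-height correlation mentioned above are usually written as below; the standard Heskestad-type forms are shown because the abstract does not report the fitted coefficients a and n:

\[
Q^{*} = \frac{\dot{Q}}{\rho_{\infty}\, c_{p}\, T_{\infty}\, \sqrt{g D}\, D^{2}},
\qquad
\frac{L_{f}}{D} = a\,(Q^{*})^{n},
\qquad
L_{f} = 0.235\, \dot{Q}^{2/5} - 1.02\, D,
\]

with the heat release rate \(\dot{Q}\) in kW and \(D\), \(L_{f}\) in m in the last (Heskestad) expression.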
Procedia PDF Downloads 228
1367 Petrology, Geochemistry and Formation Conditions of Metaophiolites of the Loki Crystalline Massif (the Caucasus)
Authors: Irakli Gamkrelidze, David Shengelia, Tamara Tsutsunava, Giorgi Chichinadze, Giorgi Beridze, Ketevan Tedliashvili, Tamara Tsamalashvili
Abstract:
The Loki crystalline massif crops out in the Caucasian region and in the geological retrospective represents the northern marginal part of the Baiburt-Sevanian terrane (island arc), bordering the Paleotethys oceanic basin in the north. The pre-Alpine basement of the massif is built up of a Lower-Middle Paleozoic metamorphic complex (metasedimentary and metabasite rocks), Upper Devonian quartz-diorites and Late Variscan granites. Earlier, the metamorphic complex was considered an indivisible set including suites with different degrees of metamorphism. Systematic geological, petrological and geochemical investigations of the massif's rocks suggest a different conception of the composition, structure and formation conditions of the massif. In particular, there are two main rock types in the Loki massif: the oldest autochthonous series of gneissic quartz-diorites and the granites cutting them. The massif is flanked on its western side by a volcano-sedimentary sequence metamorphosed to a low-T facies. Petrological, metamorphic and structural differences in this sequence prove the existence of a number of discrete units (overthrust sheets). One of them, the metabasic sheet, represents a fragment of an ophiolite complex. It comprises transitional types of the second and third layers of the paleooceanic crust: the upper noncumulate part of the gabbro component of the third layer and the overlying lowest part of the parallel diabase dykes of the second layer. The ophiolites are represented by metagabbros, metagabbro-diabases, metadiabases and amphibolite schists. According to the content of petrogenic components and additive elements in the metabasites, it is stated that the protolith of the metabasites belongs to the petrochemical type of the tholeiitic series of basalts. The parental magma of the metaophiolites is of E-MORB composition, and by petrochemical parameters it is very close to the composition of intraplate basalts. The dykes of hypabyssal leucocratic siliceous and intermediate magmatic rocks associated with the metaophiolite sheet form a separate complex. They are granitoids with an extremely low content of CaO, and quartz-diorite porphyries. According to various petrochemical parameters, these rocks have mixed characteristics. Their formation took place in spreading conditions or in areas of plume manifestation, most likely of island arc type. The metamorphism degree of the metaophiolites corresponds to a very low stage of the greenschist facies. The rocks of the metaophiolite complex were obducted from the Paleotethys Ocean. Geological and paleomagnetic data show that the primary location of the ocean is supposed to be to the north of the Loki crystalline massif.
Keywords: the Caucasus, crystalline massif, ophiolites, tectonic sheet
Procedia PDF Downloads 274
1366 Narrative Constructs and Environmental Engagement: A Textual Analysis of Climate Fiction’s Role in Shaping Sustainability Consciousness
Authors: Dean J. Hill
Abstract:
This paper undertakes an in-depth textual analysis of the cli-fi genre, examining how writing in the genre contributes to expressing and facilitating the articulation of environmental consciousness through narrative form. The paper begins by situating cli-fi within the literary continuum of ecological narratives and identifying the unique textual characteristics and thematic preoccupations of this area. The paper unfolds how cli-fi transforms the esoteric nature of climate science into credible narrative forms by drawing on language use, metaphorical constructs, and narrative framing. It also examines how descriptive and figurative language in the description of nature and disaster makes climate change vivid and emotionally resonant. The work also points out the dialogic nature of cli-fi, whereby characters and narrators experience inner disputes regarding the ethical dilemma of environmental destruction, thus demanding that readers challenge and re-evaluate their standpoints on sustainability and ecological responsibility. The paper proceeds by analysing the feature of narrative voice and its role in eliciting empathy, as well as reader involvement with the ecological material. In looking at how different narratorial perspectives contribute to the emotional and cognitive reaction of the reader to the text, this study demonstrates the profound power of perspective in developing intimacy with the dominating concerns. Finally, the emotional arc of cli-fi narratives, running its course over themes of loss, hope, and resilience, is analysed in relation to how these elements function to marshal public feeling and discourse into action around climate change. Therefore, we can say that the complexity of cli-fi texts not only shows the hard edge of the reality of climate change but also influences public perception and behaviour toward a more sustainable future.
Keywords: cli-fi genre, ecological narratives, emotional arc, narrative voice, public perception
Procedia PDF Downloads 31
1365 Using Convolutional Neural Networks to Distinguish Different Sign Language Alphanumerics
Authors: Stephen L. Green, Alexander N. Gorban, Ivan Y. Tyukin
Abstract:
Within the past decade, the use of Convolutional Neural Networks (CNNs) to create deep learning systems capable of translating sign language into text has been a breakthrough in breaking down the communication barrier for deaf and mute people. Conventional research on this subject has been concerned with training a network to recognize the fingerspelling gestures of a given language and produce their corresponding alphanumerics. One problem with the current technology is that images are scarce, with little variation in the gestures presented to the recognition program and a skew towards single skin tones and hand sizes, which makes a percentage of the population’s fingerspelling harder to detect. In addition, current gesture detection programs are trained on only one fingerspelling language, despite there being one hundred and forty-two known variants so far. All of this limits the traditional exploitation of current technologies such as CNNs, owing to their large number of required parameters. This work presents a technology that aims to resolve this issue by combining a pretrained legacy AI system for a generic object recognition task with a corrector method to uptrain the legacy network. This is a computationally efficient procedure that does not require large volumes of data, even when covering a broad range of sign languages such as American Sign Language, British Sign Language and Chinese Sign Language (Pinyin). Implementing recent results on measure concentration, namely the stochastic separation theorem, the AI system is modelled as an operator mapping an input from the set of images u ∈ U to an output in the set of predicted class labels q ∈ Q, identifying the alphanumeric that q represents and the language it comes from. These inputs and outputs, along with internal variables z ∈ Z, represent the system’s current state, which implies a mapping that assigns an element x ∈ ℝⁿ to the triple (u, z, q). As all xi are i.i.d. vectors drawn from a product measure distribution, over a period of time the AI generates a large set of measurements xi, called S, that are grouped into two categories: the correct predictions M and the incorrect predictions Y. Once the network has made its predictions, a corrector can be applied by centering S and Y through subtraction of their means. The data are then regularized by applying the Kaiser rule to the resulting eigenmatrix and whitened before being split into pairwise, positively correlated clusters. Each of these clusters produces a unique hyperplane, and if any element x falls outside the region bounded by these hyperplanes, it is reported as an error. The result is a self-correcting recognition process that can identify fingerspelling from a variety of sign languages and successfully report both the corresponding alphanumeric and the language the gesture originates from, which no other neural network has been able to replicate. Keywords: convolutional neural networks, deep learning, shallow correctors, sign language
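To make the corrector pipeline concrete, the following is a minimal sketch assuming a single Fisher-style separating hyperplane rather than the clustered, pairwise-correlated hyperplanes described in the abstract; the function names (build_corrector, flag_errors), the midpoint threshold, and the synthetic data are illustrative assumptions, not the authors' implementation.

```python
# Sketch of a "shallow corrector": centre the network's representations,
# reduce with PCA (Kaiser rule: keep components with eigenvalue > 1 after
# standardisation), whiten, then fit one linear hyperplane that flags
# likely errors. Illustrative only; not the paper's exact procedure.
import numpy as np

def build_corrector(S: np.ndarray, Y: np.ndarray):
    """S: features of correct predictions, Y: features of known errors (rows = samples)."""
    X = np.vstack([S, Y])
    mean = X.mean(axis=0)                       # centring step
    std = (X - mean).std(axis=0) + 1e-12
    Z = (X - mean) / std                        # standardise before PCA
    eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))
    keep = eigvals > 1.0                        # Kaiser rule
    if not keep.any():                          # fallback: keep the largest component
        keep = eigvals >= eigvals.max()
    W = eigvecs[:, keep] / np.sqrt(eigvals[keep])   # whitening in the kept subspace
    project = lambda A: ((A - mean) / std) @ W
    s, y = project(S), project(Y)
    w = y.mean(axis=0) - s.mean(axis=0)         # Fisher-style direction: errors vs correct
    w /= np.linalg.norm(w) + 1e-12
    theta = 0.5 * (y.mean(axis=0) @ w + s.mean(axis=0) @ w)  # midpoint threshold
    return project, w, theta

def flag_errors(X_new, project, w, theta):
    """Report samples whose projection falls on the error side of the hyperplane."""
    return project(X_new) @ w > theta

# Usage with stand-in Gaussian features (hypothetical, for illustration only)
rng = np.random.default_rng(0)
S = rng.normal(0.0, 1.0, size=(200, 32))        # correct-prediction features
Y = rng.normal(1.5, 1.0, size=(40, 32))         # error features
project, w, theta = build_corrector(S, Y)
print(flag_errors(rng.normal(1.5, 1.0, (5, 32)), project, w, theta))
```

In use, samples flagged by such a corrector would be routed back for re-labelling or used to uptrain the legacy network, which is what makes the procedure data-efficient.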
Procedia PDF Downloads 100
1364 Bahrain Experience in Supporting Small and Medium Enterprises by the Utilization of E-Government
Authors: Najla Alhkalaf
Abstract:
The focus of this study is to answer the following question: How do e-government services in Bahrain support the productivity of SMEs? The study examines the current role of e-government in enhancing SME productivity in Bahrain by analysing the efficiency of e-government through its facilitators and barriers, as seen from the perspectives of different stakeholders. It aims to identify and develop best-practice guidelines with the end goal of creating a standardised channel of communication between e-government and SMEs that fulfils the requirements of SME owners and thus achieves the prime objective of e-government. E-government services for SMEs have been offered in Bahrain since 2005. However, the current services lack the mechanisms SMEs need to take full advantage of them because of lagging communication between service provider and end user. E-government employees believe that a lack of awareness and trust is the main stumbling block, whereas SME owners believe that the content and efficiency of the e-services are insufficient. A questionnaire was created based on a pilot study that highlighted the main indicators of e-government efficiency and SME productivity, as well as on previous studies on this subject; this provided the quantitative data. Interviews were also conducted with SME owners and government employees from both case studies, forming the qualitative data for this study. The findings show that the service provider and the service receiver largely agree on the existence of most of the technical and administrative barriers. However, the data reflect a level of dissatisfaction on the SME side that contradicts the perceived level of satisfaction among government employees. The data therefore support the argument that a communication gap exists between stakeholders. To this effect, this research helps build channels of communication between stakeholders and informs a plan for unlocking the potential of e-government applications. The conclusions of this study will help devise an optimised e-government strategy for Bahrain. Keywords: e-government, SME, e-services, G2B, government employees' perspective, entrepreneurs' perspective, enterprise
Procedia PDF Downloads 231
1363 Business Intelligent to a Decision Support Tool for Green Entrepreneurship: Meso and Macro Regions
Authors: Anishur Rahman, Maria Areias, Diogo Simões, Ana Figeuiredo, Filipa Figueiredo, João Nunes
Abstract:
The circular economy (CE) has gained increased awareness among academics, businesses, and decision-makers, as it stimulates resource circularity in production and consumption systems. A large body of work has explored the principles of CE, but scant attention has been paid to analysing how CE is evaluated, agreed upon, and enforced using economic metabolism data and a business intelligence framework. Economic metabolism involves the ongoing exchange of materials and energy within and across socio-economic systems and requires the assessment of vast amounts of data to provide quantitative analysis for effective resource management. To address this gap, the present work focuses on regional flows in a pilot region of Portugal. The study aims to promote eco-innovation and sustainability in the regions of the Intermunicipal Communities Região de Coimbra, Viseu Dão Lafões and Beiras e Serra da Estrela, using these data to find precise synergies in terms of material flows and to give companies a competitive advantage in the form of valuable waste destinations, access to new resources and new markets, cost reduction and risk-sharing benefits. In our work, emphasis is placed on applying artificial intelligence (AI) and, more specifically, on implementing state-of-the-art deep learning algorithms, contributing to the construction of a business intelligence approach. With the emergence of new approaches generally grouped under the headings of AI and machine learning (ML), the methods for statistical analysis of complex and uncertain production systems are facing significant changes. Various definitions of AI and its differences from traditional statistics are therefore presented; ML is introduced to identify its place in data science, and the production problems for which AI and ML are suited are identified alongside topics such as big data analytics. A lifecycle-based approach is then taken to analyse the use of different methods in each phase, in order to identify the most useful technologies and the unifying attributes of AI in manufacturing. Most macroeconomic metabolism models are directed at the contexts of large metropolises, neglecting rural territories; within this project, a dynamic decision support model coupled with artificial intelligence tools and information platforms is therefore being developed, focused on the reality of these transition zones between the rural and the urban. A real decision support tool is thus under development, which will surpass the scientific developments carried out to date and will overcome limitations related to the availability and reliability of data. Keywords: circular economy, artificial intelligence, economic metabolisms, machine learning
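As a simplified illustration of what "finding synergies in terms of material flows" can mean in practice, the sketch below matches one company's waste streams to another's input needs. The company names, materials, and quantities are hypothetical, and the matching rule is a deliberately minimal assumption, not the project's decision support model.

```python
# Illustrative industrial-symbiosis matching from regional material-flow
# records: pair producers of a waste material with consumers of that same
# material and report the tonnage that could actually circulate.
from collections import defaultdict

# hypothetical records: (company, material, tonnes per year)
waste_flows = [("DairyCo", "whey", 120.0), ("SawmillA", "wood chips", 300.0)]
input_needs = [("BioplantB", "whey", 100.0), ("BoardFactory", "wood chips", 250.0)]

def find_synergies(waste_flows, input_needs):
    """Pair waste producers with consumers of the same material."""
    by_material = defaultdict(list)
    for company, material, qty in waste_flows:
        by_material[material].append((company, qty))
    synergies = []
    for consumer, material, demand in input_needs:
        for producer, supply in by_material.get(material, []):
            matched = min(supply, demand)   # tonnage that can circulate
            synergies.append((producer, consumer, material, matched))
    return synergies

for producer, consumer, material, tonnes in find_synergies(waste_flows, input_needs):
    print(f"{producer} -> {consumer}: {tonnes} t/yr of {material}")
```

A full decision support tool would of course layer optimisation, transport costs, and data-quality checks on top of such a matching step.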
Procedia PDF Downloads 72
1362 Young Female’s Heart Was Bitten by Unknown Ghost (Isolated Cardiac Sarcoidosis): A Case Report
Authors: Heru Al Amin
Abstract:
Sarcoidosis is a granulomatous inflammatory disorder of unclear etiology that can affect multiple organ systems. Isolated cardiac sarcoidosis is a very rare condition that causes lethal arrhythmia and heart failure, and a definite diagnosis remains challenging; multimodality imaging plays a pivotal role in diagnosing this entity. Case summary: In this report, we discuss the case of a 50-year-old woman who presented with recurrent palpitations, dizziness, vertigo and presyncope. Electrocardiography revealed variable heart blocks, including first-degree AV block, second-degree AV block, high-degree AV block, complete AV block and trifascicular block, and at times supraventricular arrhythmia. Twenty-four-hour Holter monitoring showed atrial bigeminy, first-degree AV block and trifascicular block. Transthoracic echocardiography showed thinning of the basal anteroseptal and inferior septum with LV dilatation and reduced global longitudinal strain. A dual-chamber pacemaker was implanted. CT coronary angiography showed no coronary artery disease. Cardiac magnetic resonance revealed basal anteroseptal and inferior septal thinning with focal edema and late gadolinium enhancement (LGE) suggestive of sarcoidosis. Computed tomography of the chest showed no lymphadenopathy or pulmonary infiltration. Whole-body 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) was also performed. Steroids were started, and the patient was followed up. Conclusion: This case highlights the challenges in identifying and managing isolated cardiac sarcoidosis in a young patient with recurrent syncope and variable heart block. Early, and even late, initiation of steroids can improve arrhythmia as well as left ventricular function. Keywords: cardiac sarcoidosis, conduction abnormality, syncope, cardiac MRI
Procedia PDF Downloads 91
1361 Legal Allocation of Risks: A Computational Analysis of Force Majeure Clauses
Authors: Farshad Ghodoosi
Abstract:
This article analyzes the effect of supervening events on contracts. Contracts serve an important function: the allocation of risks. Despite this importance, the case law and the doctrine are messy and inconsistent. This article takes a fresh look at excuse doctrines (i.e., force majeure, impracticability, impossibility, and frustration), with a focus on force majeure clauses, and makes the following contributions. First, it furnishes a new conceptual and theoretical framework for excuse doctrines; by distilling the decisions, it shows that excuse doctrines rest on the triangle of control, foreseeability, and contract language. Second, it analyzes force majeure clauses used by S&P 500 companies to understand the stickiness and similarity of such clauses and the events they cover. Third, using computational and statistical tools, it analyzes US cases since 1810 to assess the weight given to the triangle of control, foreseeability, and contract language. It shows that the control factor plays an important role in force majeure analysis, while contractual interpretation is the least important factor. The article concludes that it is the standard for control (whether the supervening event is beyond the control of the party) that determines the outcome of cases in the force majeure context, not necessarily the contractual language. This has important implications for COVID-19-related contractual cases: unlike the prevailing narrative that the language of the force majeure clause is determinative, this article shows that the primary focus of the inquiry will be on whether the effects of COVID-19 have been beyond the control of the promisee. Normatively, the article suggests that the trifactor of control, foreseeability, and contractual language is not effective for the allocation of legal risks in times of crisis. It puts forward a novel approach to force majeure clauses whereby courts should instead focus on the degree to which parties have relied on (expected) performance, particularly during the time of crisis. Keywords: contractual risks, force majeure clauses, foreseeability, control, contractual language, computational analysis
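As an illustration of the kind of computational analysis described, the following is a minimal sketch assuming cases have been hand-coded into three binary factors and an outcome label; the dataset and coding scheme are invented for illustration and are not the article's corpus or method.

```python
# Hedged sketch: estimate the relative weight of control, foreseeability,
# and contract language from coded case outcomes via logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression

# each row: [event beyond promisor's control, event unforeseeable,
#            clause language covers event]; label: 1 = excuse granted
X = np.array([
    [1, 1, 1], [1, 0, 1], [1, 1, 0], [1, 0, 0],
    [0, 1, 1], [0, 0, 1], [0, 1, 0], [0, 0, 0],
    [1, 1, 1], [0, 0, 1],
])
y = np.array([1, 1, 1, 1, 0, 0, 0, 0, 1, 0])   # hypothetical outcomes

model = LogisticRegression().fit(X, y)
for name, coef in zip(["control", "foreseeability", "contract language"],
                      model.coef_[0]):
    print(f"{name}: {coef:+.2f}")   # larger coefficient = heavier weight
```

On data coded this way, a dominant coefficient on the control indicator would mirror the article's finding that control, rather than contractual language, drives outcomes.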
Procedia PDF Downloads 149
1360 The Soundscape of Contemporary Buddhist Music in Taiwan: Tzu Chi Vesak Ceremony
Authors: Sylvia Huang
Abstract:
Contemporary Buddhist music has emerged in new forms of large-scale public Buddhist ritual ceremonies that may involve up to 10,000 participants at a time. Since 2007, the Buddha’s Birthday ceremony (Sanskrit: Vesak) organised by the Buddhist Tzu Chi Foundation has been held in major cities in Taiwan and at many affiliated Tzu Chi offices around the world. Analysis of this modern and technologically dependent ceremony sheds new light on the significance of music in contemporary Buddhist ritual, and on the recently enhanced and increasingly intimate connections between music and Buddhism. Through ten years of ethnographic research (2007-2017), this study explores how the form of contemporary Buddhist music relates to the role of music in participants’ experience of the ritual and the way in which they construct meaning. The theoretical approach draws on both ethnomusicology and the Buddhist teachings (Dharma). As a soundscape is defined as the entire sonic energy produced by a landscape, the concept of soundscape is used to examine the contemporary ritual music of the Tzu Chi Vesak ceremony. The analysis opens new territory in exploring how the study of Buddhist music can benefit from incorporating Buddhist philosophy within the methodological approach. The main findings are: 1) music becomes a method for Buddhist understanding, shown through a focus on how the ceremonial programme is followed through music, and 2) participants engage with each other and entrain with the music in the Vesak ceremony. As Buddhist sounding, such as scripture reading, liturgical chanting, and ceremonial singing, is a sonic, epistemological way of knowing the conditions in which Buddhism is practised, experienced, and transmitted, the research concludes by showing that studies of Buddhist music have the potential to reveal distinctively Buddhist concepts, meanings, and values. Certain principles of Buddhist philosophy are adopted within the ethnomusicological analysis to further enhance understanding of the crucial function of music within such a ritual context. Finally, the contemporary Buddhist musical performance in the ceremony is presented as a means of direct access to spiritual experience in Buddhism. Keywords: Buddhist music, Taiwan, soundscape, Vesak ceremony
Procedia PDF Downloads 137