Search results for: basic science curriculum
999 Experimental Research on the Effect of Activating Temperature on Combustion and NOx Emission Characteristics of Pulverized Coal in a Novel Purification-Combustion Reaction System
Authors: Ziqu Ouyang, Kun Su
Abstract:
A novel efficient and clean coal combustion system, the purification-combustion system, was designed by the Institute of Engineering Thermal Physics, Chinese Academy of Sciences, in 2022. The purification section is composed of a mesothermal activating unit and a hyperthermal reductive unit, and the combustion section consists of a mild combustion unit. In the purification-combustion system, deep in-situ removal of coal-N can be realized by matching the temperature and atmosphere in each unit, and thus NOx emission is controlled effectively. To establish methods for realizing efficient and clean coal combustion, this study investigated the effect of the activating temperature (822 °C, 858 °C, 933 °C, and 991 °C), the key factor affecting system operation, on the combustion and NOx emission characteristics of pulverized coal in a 30 kW purification-combustion test bench. The results showed that the activating temperature significantly affected the combustion and NOx emission characteristics. As the activating temperature increased, the temperature in the mild combustion unit increased first and then decreased, and the temperature change in the lower part was much larger than that in the upper part. Moreover, the main combustion region was always located at the top of the unit under different activating temperatures, and the combustion intensity weakened gradually along the unit. Increasing the activating temperature excessively could destroy the reductive atmosphere early in the upper part of the unit, which was not conducive to the full removal of coal-N in the reductive coal char. As the activating temperature increased, the combustion efficiency increased first and then decreased, while the NOx emission decreased first and then increased, illustrating that properly increasing the activating temperature promoted efficient and clean coal combustion, but only up to a limit.
In this study, the optimal activating temperature was 858 °C. Hence, this research illustrated that properly increasing the activating temperature could reconcile improving the combustion efficiency with reducing the NOx emission, thus ensuring clean and efficient coal combustion.
Keywords: activating temperature, combustion characteristics, NOx emission, purification-combustion system
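The reported trade-off (efficiency peaking while NOx bottoms out near 858 °C) can be illustrated with a small selection sketch. The figures and the weighting below are hypothetical placeholders for illustration only, not the study's measurements:

```python
# Hypothetical (efficiency %, NOx mg/m3) pairs per activating temperature;
# shapes mimic the reported trends (efficiency up-then-down, NOx down-then-up).
candidates = {
    822: (90.0, 220.0),
    858: (96.0, 150.0),
    933: (94.0, 180.0),
    991: (91.0, 260.0),
}

def score(eff_nox, nox_weight=0.05):
    """Simple weighted trade-off: reward efficiency, penalize NOx emission."""
    eff, nox = eff_nox
    return eff - nox_weight * nox

# Pick the activating temperature with the best efficiency/NOx balance.
best_temp = max(candidates, key=lambda t: score(candidates[t]))
```

With these placeholder numbers the trade-off, like the study, favours 858 °C.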
Procedia PDF Downloads 89
998 Systematic Review of Associations between Interoception, Vagal Tone, and Emotional Regulation
Authors: Darren Edwards, Thomas Pinna
Abstract:
Background: Interoception and heart rate variability have been found to predict outcomes of mental health and well-being. However, these have usually been investigated independently of one another. Objectives: This review aimed to explore the associations of interoception and heart rate variability (HRV) with emotion regulation (ER) and ER strategies within the existing literature, utilizing systematic review methodology. Methods: The process of article retrieval and selection followed the preferred reporting items for systematic reviews and meta-analyses (PRISMA) guidelines. The PsycINFO, Web of Science, PubMed, CINAHL, and MEDLINE databases were searched for published papers. Preliminary inclusion and exclusion criteria were specified following the patient, intervention, comparison, and outcome (PICO) framework, whilst the checklist for critical appraisal and data extraction for systematic reviews of prediction modelling studies (CHARMS) framework was used to help formulate the research question and to critically assess the identified full-length articles for bias. Results: 237 studies were identified after initial database searches. Of these, eight studies were included in the final selection. Six studies explored the associations between HRV and ER, whilst three investigated the associations between interoception and ER (one of which was also included in the HRV selection). Overall, the results suggest that greater HRV and interoception are associated with better ER. Specifically, high parasympathetic activity largely predicted the use of adaptive ER strategies such as reappraisal, and better acceptance of emotions. High interoception, instead, was predictive of effective down-regulation of negative emotions and handling of social uncertainty, although there was no association with any specific ER strategy.
Conclusions: Awareness of one’s own bodily feelings and vagal activation seem to be of central importance for the effective regulation of emotional responses.
Keywords: emotional regulation, vagal tone, interoception, chronic conditions, health and well-being, psychological flexibility
Procedia PDF Downloads 112
997 Cognitive Behaviour Drama: A Research-Based Intervention Model to Improve Social Thinking in High-Functioning Children with Autism
Authors: Haris Karnezi, Kevin Tierney
Abstract:
Cognitive Behaviour Drama is a research-based intervention model that brings together the science of psychology and the art form of drama to create an unobtrusive and engaging approach, giving children on the higher end of the autism spectrum the motivation to explore the rules of social interaction and to develop competencies associated with communicative success. The method involves engaging the participants in exciting fictional scenarios and encouraging them to seek solutions to a number of problems, leading them to an understanding of causal relationships and of how a different course of action may lead to a different outcome. The sessions are structured to offer the participants opportunities to practise target behaviours and to understand the functions they serve. The study involved six separate interventions and employed both single-case and group designs. Overall, 8 children aged 6 to 13 years and diagnosed with ASD participated in the study. Outcomes were measured using theory of mind tests, executive functioning tests, behavioural observations, and pre- and post-intervention standardised social competence questionnaires for parents and teachers. Collectively, the results indicated positive changes in the self-esteem and behaviour of all eight participants. In particular, improvements in the ability to solve theory of mind tasks were noted in the younger group, and qualitative improvements in social communication, in terms of verbal (content) and non-verbal expression (body posture, vocal expression, fluency, eye contact, reduction of ritualistic mannerisms), were noted in the older group. The need for reliable impact measures to assess the effectiveness of the model in generating global changes in the participants’ behaviour outside the therapeutic context was identified.
Keywords: autism, drama, intervention, social skills
Procedia PDF Downloads 159
996 Mentoring of Health Professionals to Ensure Better Child-Birth and Newborn Care in Bihar, India: An Intervention Study
Authors: Aboli Gore, Aritra Das, Sunil Sonthalia, Tanmay Mahapatra, Sridhar Srikantiah, Hemant Shah
Abstract:
AMANAT is an initiative, taken in collaboration with the Government of Bihar, aimed at improving the quality of maternal and neonatal care services at Bihar’s public health facilities – those offering either Basic Emergency Obstetric and Neonatal Care (BEmONC) or Comprehensive Emergency Obstetric and Neonatal Care (CEmONC) services. The effectiveness of the program is evaluated by conducting cross-sectional assessments at the concerned facilities prior to (baseline) and following completion of (endline) the intervention. The Direct Observation of Delivery (DOD) methodology is employed for carrying out the baseline and endline assessments, through which key obstetric and neonatal care practices among the health care providers (especially the nurses) are assessed quantitatively by specially trained nursing professionals. Assessment of vitals prior to delivery improved during all three phases of BEmONC and all four phases of CEmONC training, with statistically significant improvement noted in: i) pulse measurement in BEmONC phases 2 (9% to 68%), 3 (4% to 57%) and 4 (14% to 59%) and CEmONC phases 2 (7% to 72%) and 3 (0% to 64%); ii) blood pressure measurement in BEmONC phases 2 (27% to 84%), 3 (21% to 76%) and 4 (36% to 71%) and CEmONC phases 2 (23% to 76%) and 3 (2% to 70%); iii) fetal heart rate measurement in BEmONC phases 2 (10% to 72%), 3 (11% to 77%) and 4 (13% to 64%) and CEmONC phases 1 (24% to 38%), 2 (14% to 82%) and 3 (1% to 73%); and iv) abdominal examination in BEmONC phases 2 (14% to 59%), 3 (3% to 59%) and 4 (6% to 56%) and CEmONC phases 1 (0% to 24%), 2 (7% to 62%) and 3 (0% to 62%). Regarding infection control, the wearing of aprons, masks and caps by the delivery conductors improved significantly in all BEmONC phases. Similarly, the practice of handwashing improved in all BEmONC and CEmONC phases. Even on disaggregation, handwashing showed significant improvement in all phases except CEmONC phase 4.
Not only did positive practices related to handwashing improve, but negative practices such as turning off the tap with bare hands also declined significantly in the aforementioned phases. A significant decline was also noted in negative maternal care practices such as the application of fundal pressure to hasten the delivery process and the administration of oxytocin prior to delivery. One of the notable achievements of AMANAT is an improvement in the active management of the third stage of labor (AMTSL). Overall AMTSL (including administration of oxytocin or other uterotonics in the proper dose, route and time, along with controlled cord traction and uterine massage) improved in all phases of BEmONC and CEmONC mentoring. Another key area of improvement, across phases, was the proper cutting/clamping of the umbilical cord. AMANAT mentoring also led to improvement in important immediate newborn care practices such as initiation of skin-to-skin care and timely initiation of breastfeeding. The next phase of the mentoring program seeks to institutionalize mentoring across the state, which could potentially perpetuate improvement with minimal external intervention.
Keywords: capacity building, nurse-mentoring, quality of care, pregnancy, newborn care
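As an illustration of how the significance of such baseline-to-endline jumps could be checked, the sketch below runs a two-proportion z-test on one of the reported improvements (blood pressure measurement, BEmONC phase 2, 27% to 84%). The sample size of 100 observed deliveries per round is an assumption; the abstract does not report denominators.

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Pooled two-proportion z-test; returns the z statistic and a
    two-sided p-value computed from the normal CDF via math.erf."""
    x1, x2 = p1 * n1, p2 * n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p_value = 1 - math.erf(abs(z) / math.sqrt(2))  # = 2 * (1 - Phi(|z|))
    return z, p_value

# Blood pressure measurement, BEmONC phase 2: 27% -> 84%
# (n = 100 deliveries per assessment round is an assumed figure).
z, p = two_proportion_z(0.27, 100, 0.84, 100)
```

With these assumed denominators the improvement is far beyond the conventional 1.96 cut-off, consistent with the abstract's "statistically significant" claim.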
Procedia PDF Downloads 161
995 A Comprehensive Key Performance Indicators Dashboard for Emergency Medical Services
Authors: Giada Feletti, Daniela Tedesco, Paolo Trucco
Abstract:
The present study aims to develop a dashboard of Key Performance Indicators (KPIs) to enhance information and predictive capabilities in Emergency Medical Services (EMS) systems, supporting both the operational and strategic decisions of different actors. The research methodology began with a review of the technical-scientific literature concerning the indicators currently used for the performance measurement of EMS systems. From this literature analysis, it emerged that current studies focus on two distinct perspectives: the ambulance service, a fundamental component of pre-hospital health treatment, and patient care in the Emergency Department (ED). The perspective proposed by this study is an integrated view of the ambulance service process and the ED process, both essential to ensuring high quality of care and patient safety. Thus, the proposal covers the entire healthcare service process and, as such, allows for considering the interconnection between the two EMS processes, the pre-hospital and hospital ones, connected by the assignment of the patient to a specific ED. In this way, it is possible to optimize the entire patient management. Therefore, attention is paid to dependencies between decisions that current EMS management models tend to neglect or underestimate. In particular, the integration of the two processes makes it possible to evaluate the advantage of an ED selection decision made with visibility of EDs’ saturation status, therefore considering the distance, the available resources and the expected waiting times. Starting from a critical review of the KPIs proposed in the extant literature, the dashboard was designed: the high number of analyzed KPIs was reduced by first eliminating those not in line with the aim of the study and then those supporting a similar functionality.
The KPIs finally selected were tested on a realistic dataset, which led us to exclude additional indicators due to the unavailability of the data required for their computation. The final dashboard, which was discussed and validated by experts in the field, includes a variety of KPIs able to support operational and planning decisions, early warning, and citizens’ real-time awareness of ED accessibility. By associating each KPI with the EMS phase it refers to, it was also possible to design a well-balanced dashboard covering both the efficiency and the effectiveness of the entire EMS process. Indeed, only the initial phases, related to the interconnection between the ambulance service and patient care, are covered by traditional KPIs, compared to the subsequent phases taking place in the hospital ED. This could be taken into consideration for potential future development of the dashboard. Moreover, the research could proceed by building a multi-layer dashboard composed of a first level with a minimal set of KPIs to measure the basic performance of the EMS system at an aggregate level, and further levels with KPIs that can bring additional and more detailed information.
Keywords: dashboard, decision support, emergency medical services, key performance indicators
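A phase-tagged KPI registry of the kind described can be sketched as follows. The indicator names, record fields, and sample data are hypothetical illustrations; the paper's validated KPI set is not reproduced here.

```python
from dataclasses import dataclass
from statistics import median
from typing import Callable

@dataclass
class KPI:
    name: str
    phase: str              # "pre-hospital" or "hospital"
    kind: str               # "efficiency" or "effectiveness"
    compute: Callable       # function from incident records to a value

# Hypothetical incident records spanning both EMS phases.
incidents = [
    {"response_min": 8,  "ed_wait_min": 35, "survived": True},
    {"response_min": 14, "ed_wait_min": 50, "survived": True},
    {"response_min": 6,  "ed_wait_min": 20, "survived": False},
]

dashboard = [
    KPI("median response time", "pre-hospital", "efficiency",
        lambda rows: median(r["response_min"] for r in rows)),
    KPI("median ED waiting time", "hospital", "efficiency",
        lambda rows: median(r["ed_wait_min"] for r in rows)),
    KPI("survival rate", "hospital", "effectiveness",
        lambda rows: sum(r["survived"] for r in rows) / len(rows)),
]

# One pass produces the aggregate-level (first-layer) report.
report = {k.name: k.compute(incidents) for k in dashboard}
```

Tagging each KPI with its phase and kind is what allows the balance check the paper describes (efficiency vs. effectiveness, pre-hospital vs. hospital coverage).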
Procedia PDF Downloads 112
994 An AI-Generated Semantic Communication Platform in an HCI Course
Authors: Yi Yang, Jiasong Sun
Abstract:
Almost every aspect of our daily lives is now intertwined with some degree of human-computer interaction (HCI). HCI courses draw on knowledge from disciplines as diverse as computer science, psychology, design principles, anthropology, and more. Our HCI course, named the Media and Cognition course, is constantly updated to reflect state-of-the-art technological advancements such as virtual reality, augmented reality, and artificial intelligence-based interaction. For more than a decade, the course has used an interest-based approach to teaching, in which students proactively propose research-based questions and collaborate with teachers, using course knowledge to explore potential solutions. Semantic communication plays a key role in facilitating understanding and interaction between users and computer systems, ultimately enhancing system usability and user experience. The advancements in AI-generation technology, which has gained significant attention from both academia and industry in recent years, are exemplified by language models like GPT-3 that generate human-like dialogues from given prompts. The latest version of our Human-Computer Interaction course implements a semantic communication platform based on AI-generation techniques. The purpose of this semantic communication is twofold: to extract and transmit task-specific information while ensuring efficient end-to-end communication with minimal latency. The platform evaluates the retainability of signal sources and converts low-retainability visual signals into textual prompts. These data are transmitted through AI-generation techniques and reconstructed at the receiving end; visual signals with high retainability, on the other hand, are compressed and transmitted according to their respective regions.
The platform and associated research are a testament to our students' growing ability to independently investigate state-of-the-art technologies.
Keywords: human-computer interaction, media and cognition course, semantic communication, retainability, prompts
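The retainability-based routing described above might be sketched as below. The threshold value and the `caption_model` / `compress_regions` helpers are hypothetical stand-ins for the platform's AI-generation components, not its actual interfaces.

```python
RETAINABILITY_THRESHOLD = 0.5  # assumed cut-off, not from the paper

def caption_model(signal):
    # Placeholder: a real system would call an image-to-text model here.
    return f"text prompt for {signal['name']}"

def compress_regions(signal):
    # Placeholder: region-wise compression of the visual signal.
    return {"name": signal["name"], "payload": "compressed-regions"}

def transmit(signal):
    """Route a visual signal: low retainability -> textual prompt,
    high retainability -> region-wise compression."""
    if signal["retainability"] < RETAINABILITY_THRESHOLD:
        return {"mode": "prompt", "data": caption_model(signal)}
    return {"mode": "compressed", "data": compress_regions(signal)}

low = transmit({"name": "frame-1", "retainability": 0.2})
high = transmit({"name": "frame-2", "retainability": 0.9})
```

The branch mirrors the paper's twofold goal: task-specific extraction for signals that caption well, and conventional compression where visual detail must be retained.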
Procedia PDF Downloads 115
993 Narrative Constructs and Environmental Engagement: A Textual Analysis of Climate Fiction’s Role in Shaping Sustainability Consciousness
Authors: Dean J. Hill
Abstract:
This paper undertakes an in-depth textual analysis of the cli-fi genre, examining how writing in the genre expresses and facilitates the articulation of environmental consciousness through narrative form. The paper begins by situating cli-fi within the literary continuum of ecological narratives and identifying the unique textual characteristics and thematic preoccupations of the genre. It then unfolds how cli-fi transforms the esoteric nature of climate science into credible narrative forms by drawing on language use, metaphorical constructs, and narrative framing, and examines how descriptive and figurative language in depictions of nature and disaster makes climate change vivid and emotionally resonant. The work also points out the dialogic nature of cli-fi, whereby characters and narrators experience inner disputes over the ethical dilemmas of environmental destruction, prompting readers to challenge and re-evaluate their standpoints on sustainability and ecological responsibility. The paper proceeds to analyse narrative voice and its role in eliciting empathy and reader involvement with the ecological material. By looking at how different narratorial perspectives shape the reader's emotional and cognitive reaction to the text, this study demonstrates the power of perspective in developing intimacy with the narrative's central concerns. Finally, the emotional arc of cli-fi narratives, running its course over themes of loss, hope, and resilience, is analysed in relation to how these elements marshal public feeling and discourse into action around climate change.
The textual complexity of cli-fi, therefore, not only conveys the hard edge of climate-change reality but also influences public perception and behaviour toward a more sustainable future.
Keywords: cli-fi genre, ecological narratives, emotional arc, narrative voice, public perception
Procedia PDF Downloads 31
992 Rheological Properties of Polymer Systems in Magnetic Field
Authors: T. S. Soliman, A. G. Galyas, E. V. Rusinova, S. A. Vshivkov
Abstract:
Liquid crystals, combining the properties of a liquid and an anisotropic crystalline substance, play an important role in science and engineering. Molecules of cellulose and its derivatives have a rigid helical conformation, stabilized by intramolecular hydrogen bonds. Therefore, the macromolecules of these polymers are capable of ordering upon dissolution and form liquid crystals of the cholesteric type. Phase diagrams of solutions of some cellulose derivatives are known. However, little is known about the effect of a magnetic field on the viscosity of polymer solutions. The systems hydroxypropyl cellulose (HPC)–ethanol, HPC–ethylene glycol, HPC–DMAA, HPC–DMF, ethyl cellulose (EC)–ethanol, and EC–DMF were studied in the presence and absence of a magnetic field. The solution viscosity was determined on a Rheotest RN 4.1 rheometer. The effect of a magnetic field on the solution properties was studied with the use of two magnets, which induce magnetic field lines directed perpendicular and parallel, respectively, to the rotational axis of the rotor. Application of the magnetic field is shown to be accompanied by increased assembly of macromolecules, as is evident from a gain in the radii of the light-scattering particles. In the presence of a magnetic field, the long macromolecular chains are oriented in parallel with the field lines. Such an orientation is associated with the molecular diamagnetic anisotropy of the macromolecules. As a result, supramolecular particles are formed, especially in the vicinity of the liquid crystalline phase transition. The magnetic field leads to an increase in the viscosity of the solutions. The results were used to plot the concentration dependence of η/η0, where η and η0 are the viscosities of solutions in the presence and absence of a magnetic field, respectively.
In this case, the values of viscosity corresponding to low shear rates were chosen because the concentration dependence of viscosity at low shear rates is typical for anisotropic systems. In the investigated composition range, the values of η/η0 are described by a curve with a maximum.
Keywords: rheology, liquid crystals, magnetic field, cellulose ethers
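A minimal sketch of building that η/η0 curve and locating its maximum, assuming hypothetical low-shear-rate viscosity readings (the measured HPC-solution values are not reproduced here):

```python
# Hypothetical low-shear-rate viscosities with and without the magnetic field.
concentrations = [5, 10, 20, 30, 40, 50]          # polymer concentration, wt %
eta_field      = [1.2, 1.8, 3.5, 6.0, 5.1, 4.0]   # Pa*s, field applied
eta_no_field   = [1.1, 1.5, 2.5, 3.8, 4.2, 3.9]   # Pa*s, no field

# Relative viscosity eta/eta0 at each concentration.
ratio = [ef / e0 for ef, e0 in zip(eta_field, eta_no_field)]

# Concentration at which the field effect is strongest (curve maximum).
c_max = concentrations[ratio.index(max(ratio))]
```

A maximum in η/η0 near the liquid-crystalline transition region is what the abstract describes; here the placeholder data peak at 30 wt %.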
Procedia PDF Downloads 348
991 Cybersecurity for Digital Twins in the Built Environment: Research Landscape, Industry Attitudes and Future Direction
Authors: Kaznah Alshammari, Thomas Beach, Yacine Rezgui
Abstract:
Technological advances in the construction sector are helping to make smart cities a reality by means of cyber-physical systems (CPS). CPS integrate information and the physical world through the use of information communication technologies (ICT). An increasingly common goal in the built environment is to integrate building information models (BIM) with the Internet of Things (IoT) and sensor technologies using CPS. Future advances could see the adoption of digital twins, creating new opportunities for CPS using monitoring, simulation, and optimisation technologies. However, researchers often fail to fully consider the security implications. To date, it is not widely possible to assimilate BIM data and cybersecurity concepts, and security has thus far been overlooked. This paper reviews the empirical literature concerning IoT applications in the built environment and discusses real-world applications of the IoT intended to enhance construction practices and people’s lives and to bolster cybersecurity. Specifically, this research addresses two research questions: (a) how suitable are the current IoT and CPS security stacks for addressing the cybersecurity threats facing digital twins in the context of smart buildings and districts? and (b) what are the current obstacles to tackling cybersecurity threats to built environment CPS? To answer these questions, this paper reviews the current state-of-the-art research concerning digital twins in the built environment, the IoT, BIM, urban cities, and cybersecurity. The findings of this study confirmed the importance of using digital twins in both the IoT and BIM. Also, eight reference zones across Europe have gained special recognition for their contributions to the advancement of IoT science.
Therefore, this paper evaluates the use of digital twins in CPS to arrive at recommendations for expanding BIM specifications to facilitate IoT compliance, bolster cybersecurity, and integrate digital twin and city standards in the smart cities of the future.
Keywords: BIM, cybersecurity, digital twins, IoT, urban cities
Procedia PDF Downloads 169
990 Implications of Social Rights Adjudication on the Separation of Powers Doctrine: Colombian Case
Authors: Mariam Begadze
Abstract:
Separation of powers (SOP) has often been the most frequently raised objection against the judicial enforcement of socio-economic rights. Although much has been written to refute such objections, it has very rarely been assessed what effect the current practice of social rights adjudication has had on the construction of the SOP doctrine in specific jurisdictions. Colombia is an appropriate case study for this question. The notion of collaborative SOP in the 1991 Constitution has affected the court’s conception of its role. On the other hand, trends in the jurisprudence have further shaped the collaborative notion of SOP. Other institutional characteristics of Colombian constitutional law have played their part as well. The tutela action, a particularly flexible and fast judicial remedy for individuals, has placed the judiciary in a more confrontational relation vis-à-vis the political branches. Later interventions through abstract review of austerity measures further contributed to that development. Logically, the court’s activism in this sphere has attracted attacks from the political branches, which have turned out to be unsuccessful precisely because of the court’s outreach to the middle class, whose direct reliance on the court has turned into its direct democratic legitimacy. Only later have the structural judgments attempted to revive the collaborative notion behind the SOP doctrine. However, the court-supervised monitoring of implementation has itself manifested fluctuations in the mode of collaboration, moving toward more managerial supervision recently. This is not surprising considering the highly dysfunctional political system in Colombia, where distrust seems to be the default starting point in the interaction of the branches.
The paper aims to answer the question of what the appropriate judicial tools are to realize the collaborative notion of SOP in a context where the court has to strike a balance between a strong executive and a weak and largely dysfunctional legislative branch. If the recurrent abuse lies in the indifference and inaction of the legislative branch in engaging seriously with political issues, what tools does the court have to activate the political process? The answer to this question lies partly in the court’s other strand of jurisprudence, in which it combines substantive objections with procedural ones concerning the operation of the legislative branch. The primary example is the decision on value-added tax on basic goods, in which the court invalidated the law based on the absence of sufficient deliberation in Congress on the bill’s implications for the equity and progressiveness of the entire taxing system. The decision led to congressional rejection of an identical bill based on the arguments put forward by the court. The case is perhaps the best illustration of the collaborative notion of SOP, in which the court refrains from categorical pronouncements while doing its part to activate the political process. This also legitimizes the court’s activism based on its role in countering the most perilous abuse in the Colombian context – the failure of the political system to engage seriously with serious political questions.
Keywords: Colombian constitutional court, judicial review, separation of powers, social rights
Procedia PDF Downloads 104
989 Development of Functional Cosmetic Materials from Demilitarized Zone-Inhabiting Plants
Authors: Younmin Shin, Jin Kyu Kim, Mirim Jin, Jeong June Choi
Abstract:
The Demilitarized Zone (DMZ) is a peace region located along the border between South and North Korea to avoid accidental armed conflict. Because human access to the area has been prohibited for more than 60 years, the DMZ is one of the cleanest areas of South Korea, preserving wildlife in its natural state. In this study, we evaluated the biological efficacies of plants (SS, PC, and AR) inhabiting the DMZ for the development of functional cosmetics. First, we tested the cytotoxicity of the plant extracts in keratinocytes and melanocytes, which are major cell components of skin. By 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) assay with these cell lines, we determined the safe concentrations of the extracts for the efficacy tests. Next, we assessed the anti-wrinkle function of SS by demonstrating, via real-time PCR, that SS treatment decreased the expression of matrix metalloproteinase-1 (MMP-1) in UV-irradiated keratinocytes. The suppressive effect of SS was greatly potentiated by combination with the other DMZ-inhabiting plants, PC and AR. The expression of tyrosinase, one of the main enzymes producing melanin in melanocytes, was also down-regulated by the DMZ-inhabiting SS extract. Wound healing activity was also investigated by an in vitro test with the HaCaT cell line, a human keratinocyte cell line. All the natural materials extracted from DMZ-inhabiting plants accelerated the recovery of the cells. These results suggest that the DMZ is a treasure island of functional plants and that DMZ-inhabiting natural products warrant development into functional cosmetic materials. This study was carried out with the support of the R&D Program for Forest Science Technology (Project No. 2017027A00-1819-BA01) provided by the Korea Forest Service (Korea Forestry Promotion Institute).
Keywords: anti-wrinkle, Demilitarized Zone, functional cosmetics, whitening
Procedia PDF Downloads 144
988 Production of Bacillus Lipopeptides for Biocontrol of Postharvest Crops
Authors: Vivek Rangarajan, Kim G. Klarke
Abstract:
With overpopulation threatening the world’s ability to feed itself, food production and protection have become major issues, especially in developing countries. Almost one-third of the food produced for human consumption, around 1.3 billion tonnes, is either wasted or lost annually. Postharvest decay in particular constitutes a major cause of crop loss, with about 20% of fruits and vegetables produced lost during postharvest storage, mainly due to fungal disease. Some of the major phytopathogenic fungi affecting postharvest fruit crops in South Africa include Aspergillus, Botrytis, Penicillium, Alternaria and Sclerotinia spp. To date, control of fungal phytopathogens has primarily been dependent on synthetic chemical fungicides, but these chemicals pose a significant threat to the environment, mainly due to their xenobiotic properties and tendency to generate resistance in the phytopathogens. Here, an environmentally benign alternative approach to controlling postharvest fungal phytopathogens in perishable fruit crops is presented, namely the application of a bio-fungicide in the form of lipopeptide molecules. Lipopeptides are biosurfactants produced by Bacillus spp. which have been established as green, nontoxic and biodegradable molecules with antimicrobial properties. However, since Bacillus spp. are capable of producing a large number of lipopeptide homologues with differing efficacies against distinct target organisms, the lipopeptide production conditions and strategy are critical to producing the maximum lipopeptide concentration with homologue ratios to specification for optimum bio-fungicide efficacy. Process conditions, and their impact on Bacillus lipopeptide production, were evaluated in fully instrumented laboratory-scale bioreactors under well-regulated, controlled and defined environments. Factors such as oxygen availability and trace element and nitrate concentrations had profound influences on lipopeptide yield, productivity and selectivity.
Lipopeptide yield and homologue selectivity were enhanced in cultures where the oxygen in the sparge gas was increased from 21 to 30 mole%. The addition of trace elements, particularly Fe2+, increased the total concentration of lipopeptides, and a nitrate concentration equivalent to 8 g/L ammonium nitrate resulted in optimum lipopeptide yield and homologue selectivity. Efficacy studies of the culture supernatant containing the crude lipopeptide mixture were conducted using phytopathogens isolated from fruit in the field and identified using genetic sequencing. The supernatant exhibited antifungal activity against all the test isolates, namely Lewia, Botrytis, Penicillium, Alternaria and Sclerotinia spp., even in this crude form, confirming the product’s efficacy in controlling the main diseases. Future studies will be directed towards purification of the lipopeptide product and enhancement of efficacy.
Keywords: antifungal efficacy, biocontrol, lipopeptide production, perishable crops
Procedia PDF Downloads 404
987 Wood as a Climate Buffer in a Supermarket
Authors: Kristine Nore, Alexander Severnisen, Petter Arnestad, Dimitris Kraniotis, Roy Rossebø
Abstract:
Natural materials like wood absorb and release moisture, and thus wood can buffer the indoor climate. When used wisely, this buffer potential can counteract the influence of the outdoor climate on the building. The mass of moisture engaged in the buffer is defined as the potential hygrothermal mass, which can serve as energy storage in a building. It works like a natural heat pump, where the moisture damps the diurnal changes. In Norway, the climate-buffering ability of wood has been tested in several buildings with extensive use of wood, including supermarkets. This paper defines the potential of hygrothermal mass in a supermarket building, including the chosen ventilation strategy and how the climate impact of the building is reduced. The building is located above the Arctic Circle, 50 m from the coastline, in Valnesfjord. It was built in 2015 and has a shopping area, including toilet and entrance, of 975 m². The climate of the area is polar according to the Köppen classification, but the supermarket still needs cooling on hot summer days. To contribute to the total energy balance, the wood needs dynamic influence to activate its hygrothermal mass. Drying and moistening of the wood are energy intensive, and this energy potential can be exploited: examples are using solar heat for drying instead of heating the indoor air, and using raw air with high enthalpy to let dry wooden surfaces absorb moisture and release latent heat. Weather forecasts are used to define the need for future cooling or heating, so the potential energy buffering of the wood can be optimized with intelligent ventilation control. The ventilation control in Valnesfjord includes the weather forecast and historical data: a five-day forecast and a two-day history, to prevent adjustments to minor weather changes. The ventilation control has three regimes. During summer, moisture is retained so that subsequent drying damps the solar heat gains.
In winter, moist air is let into the shopping area to contribute to the heating. When the temperature is lowered during the night, the moisture absorbed in the wood slows down the cooling. The ventilation system is shut down during the closing hours of the supermarket in this period. During autumn and spring, a regime of either storing the moisture or drying out according to the weather prognoses is applied. To ensure indoor climate quality, measurements of CO₂ and VOC overrule the low-energy control if needed. Verified simulations of the Valnesfjord building will form a basic model for investigating wood as a climate-regulating material in other climates as well. Future knowledge on the hygrothermal mass potential of materials is promising: by including the time-dependent buffer capacity of materials, building operators can achieve optimal efficiency of their ventilation systems. The use of wood as a climate-regulating material, through its potential hygrothermal mass and connected to weather prognoses, may provide up to 25% energy savings related to heating, cooling, and ventilation of a building.
Keywords: climate buffer, energy, hygrothermal mass, ventilation, wood, weather forecast
Procedia PDF Downloads 215
986 Development of Hybrid Materials Combining Biomass as Fique Fibers with Metal-Organic Frameworks, and Their Potential as Mercury Adsorbents
Authors: Karen G. Bastidas Gomez, Hugo R. Zea Ramirez, Manuel F. Ribeiro Pereira, Cesar A. Sierra Avila, Juan A. Clavijo Morales
Abstract:
The contamination of water sources with heavy metals such as mercury is a serious environmental problem with a high impact on the environment and human health. In countries such as Colombia, mercury contamination due to mining has reached levels much higher than the world average. This work proposes the use of fique fibers as an adsorbent for mercury removal. The material was evaluated under five different conditions: raw, pretreated by organosolv, functionalized by TEMPO oxidation, functionalized fiber plus MOF-199, and functionalized fiber plus MOF-199-SH. All the materials were characterized using FTIR, SEM, EDX, XRD, and TGA. Mercury removal was carried out at room pressure and temperature and at pH = 7 for all material presentations, followed by atomic absorption spectroscopy. The high cellulose content of fique is the main particularity of this lignocellulosic biomass, since the degree of oxidation depends on the number of surface hydroxyl groups capable of oxidizing into carboxylic acids, a functional group that increases ion exchange with mercury in solution. It was also expected that impregnation with the MOF would increase mercury removal; however, the functionalized fique alone achieved the greatest removal, 81.33%, against 44% for the fique with MOF-199 and 72% for the fique with MOF-199-SH. The pretreated and raw fibers showed 74% and 56%, respectively, which indicates that fique does not require considerable modification of its structure to achieve good performance. Even so, the functionalized fiber increases the removal percentage considerably compared to the pretreated fique, which suggests that functionalization is a feasible procedure for improving the removal percentage. 
In addition, this procedure follows a green approach, since the reagents involved have low environmental impact and the contribution to the remediation of natural resources is high.
Keywords: biomass, nanotechnology, materials science, wastewater treatment
Procedia PDF Downloads 117
985 [Keynote] Implementation of Quality Control Procedures in Radiotherapy CT Simulator
Authors: B. Petrović, L. Rutonjski, M. Baucal, M. Teodorović, O. Čudić, B. Basarić
Abstract:
Purpose/Objective: Radiotherapy treatment planning requires the use of a CT simulator in order to acquire CT images. The overall performance of the CT simulator determines the quality of the radiotherapy treatment plan and, in the end, the outcome of treatment for every single patient. Therefore, international recommendations strongly advise setting up quality control procedures for every machine involved in the radiotherapy treatment planning process, including the CT scanner/simulator. The overall process requires a number of tests, applied on a daily, weekly, monthly or yearly basis, depending on the feature tested. Materials/Methods: Two phantoms were used: a dedicated CIRS 062QA phantom and the QA phantom supplied with the CT simulator. The examined CT simulator was a Siemens Somatom Definition AS Open, dedicated to radiation therapy treatment planning. The CT simulator has built-in software which enables fast and simple evaluation of CT QA parameters using the phantom provided with the simulator. On the other hand, the recommendations contain additional tests, which were done with the CIRS phantom. Also, legislation on ionizing radiation protection requires CT testing at defined intervals. Taking into account the requirements of the law, the built-in tests of the CT simulator, and the international recommendations, the institutional QC programme for the CT simulator was defined and implemented. Results: The CT simulator parameters evaluated in the study were the following: CT number accuracy, field uniformity, the complete CT-to-ED conversion curve, spatial and contrast resolution, image noise, slice thickness, and patient table stability. The following limits were established and implemented: CT number accuracy within +/- 5 HU of the value at commissioning; field uniformity within +/- 10 HU in selected ROIs; the complete CT-to-ED curve for each tube voltage must comply with the curve obtained at commissioning, with deviations of not more than 5%.
Spatial and contrast resolution tests must comply with the tests obtained at commissioning; otherwise, the machine requires service. The result of the image noise test must fall within 20% of the base value. Slice thickness must meet the manufacturer's specifications, and patient table stability with longitudinal transfer of the loaded table must not show more than 2 mm of vertical deviation. Conclusion: The implemented QA tests gave an overall basic understanding of the CT simulator's functionality and its clinical effectiveness in radiation treatment planning. The legal requirement on the clinic is to set up its own QA programme with minimum testing, but it remains the user's decision whether additional testing, as recommended by international organizations, will be implemented so as to improve the overall quality of the radiation treatment planning procedure, since the CT image quality used for treatment planning influences the delineation of the tumor, the calculation accuracy of the treatment planning system, and finally the delivery of radiation treatment to the patient.
Keywords: CT simulator, radiotherapy, quality control, QA programme
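The tolerance limits listed in the abstract lend themselves to a simple automated check; the following is a minimal sketch, where the limit values come from the abstract but the function name, field names, and sample values are purely illustrative:

```python
def check_ct_qa(baseline, measured):
    """Compare CT simulator QA measurements against baseline (commissioning)
    values, using the tolerance limits stated in the abstract.
    Returns a dict of pass/fail flags, one per test."""
    results = {}
    # CT number accuracy: within +/- 5 HU of the commissioning value
    results["ct_number"] = abs(measured["ct_number"] - baseline["ct_number"]) <= 5
    # Field uniformity: within +/- 10 HU in the selected ROIs
    results["uniformity"] = all(abs(m - b) <= 10
                                for m, b in zip(measured["roi_hu"], baseline["roi_hu"]))
    # CT-to-ED curve: each point within 5% of the commissioning curve
    results["ct_to_ed"] = all(abs(m - b) <= 0.05 * abs(b)
                              for m, b in zip(measured["ed_curve"], baseline["ed_curve"]))
    # Image noise: within 20% of the base value
    results["noise"] = abs(measured["noise"] - baseline["noise"]) <= 0.20 * baseline["noise"]
    # Table stability: vertical deviation of the loaded table no more than 2 mm
    results["table_mm"] = measured["table_deviation_mm"] <= 2.0
    return results

# illustrative numbers only, not measured data
baseline = {"ct_number": 0.0, "roi_hu": [0.0, 1.0, -1.0],
            "ed_curve": [1.0, 1.2, 1.5], "noise": 10.0}
measured = {"ct_number": 3.0, "roi_hu": [4.0, -3.0, 2.0],
            "ed_curve": [1.02, 1.25, 1.47], "noise": 11.5,
            "table_deviation_mm": 1.0}
flags = check_ct_qa(baseline, measured)
```

A daily QA script along these lines would flag any failing test for review before clinical use.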
Procedia PDF Downloads 532
984 Cable De-Commissioning of Legacy Accelerators at CERN
Authors: Adya Uluwita, Fernando Pedrosa, Georgi Georgiev, Christian Bernard, Raoul Masterson
Abstract:
CERN is an international organisation funded by 23 member states that provides the particle physics community with excellence in particle accelerators and other related facilities. Founded in 1954, CERN operates a wide range of accelerators that allow groundbreaking science to be conducted. Accelerators bring particles to high levels of energy and make them collide with each other or with fixed targets, creating specific conditions that are of high interest to physicists. A chain of accelerators is used to ramp up the energy of particles and eventually inject them into the largest and most recent one: the Large Hadron Collider (LHC). Among this chain of machines is, for instance, the Proton Synchrotron, which started up in 1959 and is still in operation. These machines, called "injectors", keep evolving over time, as does the related infrastructure. Massive decommissioning of obsolete cables started at CERN in 2015 in the frame of the so-called "injectors de-cabling project phase 1". Its goal was to replace aging cables and remove unused ones, freeing space for new cables necessary for upgrades and consolidation campaigns. To proceed with the de-cabling, a project coordination team was assembled. The start of this project led to the investigation of legacy cables throughout the organisation; identifying cables stacked over half a century proved to be arduous. Phase 1 of the injectors de-cabling ran for three years and was completed successfully after overcoming some difficulties. Phase 2, started three years later, focused on improving safety and structure with the introduction of a quality assurance procedure. This paper discusses the implementation of this quality assurance procedure throughout phase 2 of the project and the transition between the two phases. Several hundred kilometres of cable were removed in the injector complex at CERN from 2015 to 2023.
Keywords: CERN, de-cabling, injectors, quality assurance procedure
Procedia PDF Downloads 92
983 Study of Irritant and Anti-inflammatory Activity of Snuhi/Zaqqum (Euphorbia neriifolia) with Special Reference to Holy Quran and Ayurveda
Authors: Mohammed Khalil Ur Rahman, Pradnya Chigle, Bushra Farhen
Abstract:
Indian mythology holds that the Vedas are eternal treatises. The Vedas are categorized into four divisions, viz. Rigveda, Yajurveda, Samaveda and Atharvaveda. These spiritual classics not only deal with rituals and customs but also contain many references related to health. Of the four, the Atharvaveda deals with the most principles pertaining to the health sciences; therefore, it is said that the science and art of Ayurveda developed from the Atharvaveda. Ayurveda deals with many medicinal plants, either in single therapeutic use or in combination. One such medicinal plant is Snuhi (Euphorbia neriifolia Linn.), which finds extensive importance, along with Haridra and Apamargakshar, in the preparation of Ksharsutra, which in turn is used for the treatment of fistula-in-ano. It is interesting to note that this plant, Snuhi, is also referred to in the Holy Quran as the Tree of Zaqqum, advocated as the food of the sinners as a part of torment. The reference in Surat Ad-Dukhan (44:43-46) is as follows: "Verily, the tree of Zaqqum will be the food of the sinners. Like boiling oil, it will boil in the bellies, like the boiling of scalding water." The above verse implies that the plant Snuhi/Zaqqum, due to its irritant property, acts as a drastic purgative, but at the same time it also possesses anti-inflammatory properties that relieve the irritation. These properties of Zaqqum have been unfolded in modern research, which states that diterpene polycyclic esters are responsible for its toxic and irritant nature, whereas triterpenes are responsible for its anti-inflammatory property. The present work reviews the concept in the Quran of the latex of the Tree of Zaqqum in terms of its phytochemistry and its therapeutic use in Ksharsutra, pertaining to its irritant and anti-inflammatory properties.
Keywords: ayurveda, Quran, zaqqum, ksharsutra, latex, piles, inflammation
Procedia PDF Downloads 353
982 The Effects of Training Load on Some Selected Fitness Variables in the Case of U-17 Female Volleyball Project Players, Central Ethiopia
Authors: Behailu Shigute Habtemariam
Abstract:
The aim of the study was to examine the effects of training load on selected fitness performance variables of U-17 female volleyball project players in the central Ethiopia region. Methods: A quasi-experimental design was used. Twenty-three volleyball players from the two projects served as subjects. The data collected through fitness performance assessments were analyzed and interpreted both manually and by computer in order to compare the physical fitness variables and the changes observed among participants. The collected data were analyzed with the Statistical Package for the Social Sciences (SPSS V 20). An independent t-test was used to show the mean differences between the groups, and a paired t-test was used to compare the mean differences between pre- and post-training within each group. The level of significance was set at alpha = 0.05. Results: The results are displayed using tables and figures. A significant difference was found between the pre-test value (19.7 cm) and the post-test value (37.5 cm) of the Durame city project on the flexibility test (MD = 17.8 cm, P = 0.00). Similarly, there was a significant difference between the pre-test value (18 cm) and the post-test value (26.3 cm) of the Hosana city project on the flexibility test (MD = 8.3 cm, P = 0.00). Conclusion: According to the results of the present study, there were significant improvements from pre- to post-test in the Durame City and Hosana City projects on the agility, flexibility, power, and speed fitness tests.
On the other hand, no significant difference was found between the two projects before beginning the exercise; however, a significant difference was found after 12 weeks of training.
Keywords: overload, performance, training, volleyball
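The within-group pre/post comparison described above rests on the paired t-test. The following is a minimal sketch of the statistic; the flexibility scores below are made up for illustration and are not the study's data:

```python
import math

def paired_t(pre, post):
    """Paired t-statistic for pre- vs post-training scores of the same subjects."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)  # sample variance of the differences
    return mean_d / math.sqrt(var_d / n)

# illustrative flexibility scores in cm (not the study's data)
pre = [19, 20, 21, 19, 20]
post = [37, 38, 36, 38, 37]
t = paired_t(pre, post)
# significant at alpha = 0.05 if t exceeds the critical value for df = n - 1
```

With n = 5 subjects, the result would be compared against the two-tailed critical t for 4 degrees of freedom (2.776 at alpha = 0.05).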
Procedia PDF Downloads 96
981 Relationship between Air Pollution and Atopic Dermatitis Using Meta-Analysis
Authors: Chaebong Kim, Yongmin Cho, Minkyung Han, Mooyoung Kim, KooSang Kim
Abstract:
Background: Air pollution, aggravated by global warming, has a considerable influence on respiratory disease and atopic dermatitis (AD). Existing studies are based on the hypothesis of a correlation between air pollutants and AD, and their results have been analyzed from various points of view. Objectives: This study aimed to integrate the relevant research on air pollutants and AD and to perform a systematic literature review and meta-analysis to provide a basis for air pollutant control. Methods: Research materials were collected from original articles published in English academic journals in medicine, nursing and health science from August 1 to 31, 2016. We collected the materials from the PubMed, Medline, Embase and Cochrane Central databases following PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses), based on the Cochrane systematic review manual, and performed the evaluation and analysis of the selected materials. Risk of bias was assessed using RevMan ver. 5.2, and meta-analyses were performed using STATA. Results: The prevalence of infantile atopic dermatitis was 1.05 times higher in groups exposed to air pollution, and exposure to NO2 (1.08, 95% CI: 1.02–1.14), O3 (1.09, 95% CI: 1.04–1.15) and SO2 (1.07, 95% CI: 1.02–1.12) in the pollutant subgroups was significantly associated with infantile atopic dermatitis. The prevalence of infantile atopic dermatitis was 1.03 times higher in groups exposed to PM2.5, but this result was not statistically significant. Conclusion: Health effects of environmental pollution have raised people's interest in environmental diseases. Air pollutants were associated with AD in this study, but the selected literature was based on non-RCT (randomized controlled trial) studies; therefore, the study methods were limited with respect to control, matching, and correction for confounding variables.
For a clear conclusion, it is necessary to develop appropriate tools for the object of study and a clear standard for measuring air pollutants.
Keywords: air pollution, atopic dermatitis, children, meta-analysis
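The effect estimates quoted above are the kind of inputs a fixed-effect, inverse-variance meta-analysis pools. The following sketch pools the three subgroup odds ratios from the abstract purely for illustration (a real meta-analysis pools independent studies, not overlapping subgroup estimates, and the function here is not the study's code):

```python
import math

def pool_fixed_effect(estimates):
    """Fixed-effect inverse-variance pooling of odds ratios given as
    (OR, lower 95% CI, upper 95% CI) tuples. The SE of log(OR) is
    recovered from the CI width: SE = (ln(U) - ln(L)) / (2 * 1.96)."""
    num = den = 0.0
    for or_, lo, hi in estimates:
        log_or = math.log(or_)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2          # weight = inverse variance of log(OR)
        num += w * log_or
        den += w
    return math.exp(num / den)     # pooled OR back on the original scale

# the NO2, O3 and SO2 estimates quoted in the abstract
pooled = pool_fixed_effect([(1.08, 1.02, 1.14),
                            (1.09, 1.04, 1.15),
                            (1.07, 1.02, 1.12)])
```

Precision-weighting on the log scale is the standard construction behind the forest plots RevMan and STATA produce.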
Procedia PDF Downloads 257
980 The Influence of Market Attractiveness and Core Competence on Value Creation Strategy and Competitive Advantage and Its Implication on Business Performance
Authors: Firsan Nova
Abstract:
The average Indonesian watches 5.5 hours of TV a day. With a population of 242 million people and a free-to-air (FTA) TV penetration rate of 56%, that equates to 745 million hours of television watched each day. With such potential, it is no wonder that many companies are now attempting to enter the pay TV market. The research firm Media Partners Asia has forecast that the number of Indonesian pay-television subscribers will climb from 2.4 million in 2012 to 8.7 million by 2020, with penetration scaling up from 7 percent to 21 percent. Key drivers of market growth, the study says, include macro trends built around higher disposable income and a rising middle class, with leading players continuing to invest significantly in sales, distribution and content; new entrants, in the meantime, will boost overall prospects. This study aims to examine and analyze the effect of market attractiveness and core competence on value creation and competitive advantage, and their impact on business performance, in the pay TV industry in Indonesia. The study uses a strategic management science approach with the census method, in which all members of the population serve as the sample. A verification method is used to examine the relationships between variables. The unit of analysis is all Indonesian pay TV business units, totaling 19 business units; the unit of observation is the directors and managers of each business unit. Hypothesis testing is performed using Partial Least Squares (PLS). The study concludes that market attractiveness affects business performance through value creation and competitive advantage. Appropriate value creation comes from the company's ability to optimize its core competence and exploit market attractiveness. Value creation affects competitive advantage.
Competitive advantage can be determined by the company's ability to create value for customers, and competitive advantage has an impact on business performance.
Keywords: market attractiveness, core competence, value creation, competitive advantage, business performance
Procedia PDF Downloads 348
979 Accidental U.S. Taxpayers Residing Abroad: Choosing between U.S. Citizenship or Keeping Their Local Investment Accounts
Authors: Marco Sewald
Abstract:
Due to the current enforcement of extraterritorial U.S. legislation, up to 9 million U.S. (dual) citizens residing abroad are subject to U.S. double and surcharge taxation and at risk of losing access to otherwise basic financial services and investment opportunities abroad. The United States is the only OECD country that taxes non-resident citizens, lawful permanent residents and other non-resident aliens on their worldwide income, based on U.S. domestic tax law. To enforce these policies, the U.S. has implemented 'saving clauses' in all tax treaties and introduced several compliance provisions, including the Foreign Account Tax Compliance Act (FATCA), Qualified Intermediary agreements (QI) and Intergovernmental Agreements (IGA), which require Foreign Financial Institutions (FFIs) to implement these provisions in foreign jurisdictions. This policy creates systematic cases of double and surcharge taxation. The increased enforcement of compliance rules creates additional reporting burdens for U.S. persons abroad and for FFIs accepting such U.S. persons as customers; FFIs in Europe react with a growing denial of specific financial services to this population. The number of U.S. citizens renouncing citizenship has increased dramatically in recent years. A case study is chosen as the methodology and research method, being an empirical inquiry that investigates a contemporary phenomenon within its real-life context, where the boundaries between phenomenon and context are not clearly evident and multiple sources of evidence are used. This evaluative approach tests whether the combination of policies works in practice, whether it accords with desirable moral, political and economic aims, or whether it may serve other causes. The research critically evaluates the financial and non-financial consequences and develops sufficient strategies. It further discusses these strategies to avoid the undesired consequences of extraterritorial U.S. legislation.
Three possible strategies result from the use cases: (1) duck and cover; (2) pay U.S. double/surcharge taxes and tax preparation fees and accept the imposed product limitations; or (3) renounce U.S. citizenship and pay any exit taxes, tax preparation fees and the required $2,350 fee to renounce. While the first strategy is unlawful and therefore unsuitable, the second strategy is only suitable if the U.S. citizen residing abroad is planning to move to the U.S. in the future. The last strategy is the only reasonable and lawful way provided by the U.S. to limit exposure to U.S. double and surcharge taxation and the limitations on financial products. The results are believed to add a perspective to the current academic discourse on U.S. citizenship-based taxation, currently dominated by U.S. scholars, while at the same time providing sufficient strategies for the affected population.
Keywords: citizenship based taxation, FATCA, FBAR, qualified intermediaries agreements, renounce U.S. citizenship
Procedia PDF Downloads 201
978 Using Hierarchical Methodology to Assist the Selection of New Business in Brazilian Companies Incubators
Authors: Izabel Cristina Zattar, Gilberto Passos Lima, Guilherme Schünemann de Oliveira
Abstract:
In Brazil, several institutions are committed to the development of new businesses based on product innovation, among them business incubators, universities and science institutes. Business incubators can be defined as nurseries for new companies, which may be in the technology segment discussed in this article. Business incubators provide services related to infrastructure, such as physical space and meeting rooms. Besides these services, incubators also offer assistance in the form of information and communication, access to finance, relationship networks, and business monitoring and mentoring processes. Business incubators do not support all technology companies: one of the incubators' tasks is to assess the nature and feasibility of new business proposals. To assist with this goal, this paper proposes a methodology for evaluating new businesses using the Analytic Hierarchy Process (AHP). The paper presents the concepts used in the assessment methodology for new businesses, concepts that have been tested with positive results in practice. The study consists of three main steps. First, a hierarchy was built based on the new-business manuals used by the business incubators; these books and manuals list business selection requirements, such as innovation status and other technological aspects. Then, a questionnaire was generated in order to guide incubator experts through the pairwise comparisons at all hierarchy levels; the weights of each requirement are calculated from the questionnaire responses. Finally, the proposed method was applied to evaluate five new business proposals that were applying to join a company incubator. The main result is the ranking of these new businesses, which helped the incubator experts decide which companies were more eligible to work with.
This ranking may also be helpful to the decision-making process of business incubators in future selection processes.
Keywords: Analytic Hierarchy Process (AHP), Brazilian companies incubators, technology companies, incubator
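The AHP weight derivation described above, in which pairwise comparisons are aggregated into a priority vector, can be sketched with a simple power iteration. The 3×3 comparison matrix and criteria below are hypothetical, not the incubators' actual hierarchy:

```python
def ahp_weights(A, iters=50):
    """Principal-eigenvector weights of a pairwise comparison matrix A
    via power iteration, plus Saaty's consistency ratio for n = 3."""
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(aw)
        w = [x / s for x in aw]          # renormalize so weights sum to 1
    # since w sums to 1 and A w ~ lambda_max * w, summing A w estimates lambda_max
    lam = sum(sum(A[i][j] * w[j] for j in range(n)) for i in range(n))
    ci = (lam - n) / (n - 1)             # consistency index
    cr = ci / 0.58                       # Saaty's random index for n = 3
    return w, cr

# hypothetical criteria, ordered: innovation status > market potential > team experience
A = [[1,   3,   5],
     [1/3, 1,   3],
     [1/5, 1/3, 1]]
weights, cr = ahp_weights(A)
# cr < 0.1 is the usual threshold for acceptably consistent judgments
```

In the paper's setting, one such matrix would be filled from each expert's questionnaire answers at every level of the hierarchy.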
Procedia PDF Downloads 400
977 Energy Consumption and Economic Growth Nexus: A Sustainability Understanding from the BRICS Economies
Authors: Smart E. Amanfo
Abstract:
Although the exact functional relationship between energy consumption and economic growth and development remains a complex social science question, there is sustained and growing agreement among energy economists on the direct or indirect role of energy use in the development process and as sustenance for many of the socio-economic and environmental achievements of any economy. According to the OECD, the world economy will double by 2050, a growth led by two of the BRICS (Brazil, Russia, India, China and South Africa) countries: China and India. There is global apprehension that if the countries constituting the epicenter of present and future economic growth follow the same trajectory as during and after the Industrial Revolution, involving higher energy throughputs, especially of fossil fuels, the already known and model-predicted threats of climate change and global warming could be exacerbated, especially in developing economies. The international community's challenge is how to address the trilemma of economic growth and social development, poverty eradication, and the stability of ecological systems. This paper provides estimates of economic growth, energy consumption, and carbon dioxide emissions using panel data of the BRICS members from 1980 to 2017. Preliminary results based on a fixed-effect econometric model show a positive, significant relationship between energy consumption and economic growth. The paper further identifies a strong relationship between economic growth and CO2 emissions, which suggests that the global agenda of low-carbon-led growth and development is not straightforwardly achievable. The study therefore highlights the need for BRICS member states to intensify low-emissions-based production and consumption policies and to increase renewables in order to avoid further deterioration of climate change impacts.
Keywords: BRICS, sustainability, sustainable development, energy consumption, economic growth
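The fixed-effect estimate mentioned above relies on the within (demeaning) transformation, which removes each country's time-invariant intercept before estimating the common slope. A toy sketch with made-up panel data (not the BRICS series):

```python
def within_estimator(panel):
    """Fixed-effects (within) slope estimate for y = a_i + b*x + e.
    `panel` maps each unit to a list of (x, y) observations; demeaning
    by unit removes the unit-specific intercept a_i before OLS."""
    num = den = 0.0
    for obs in panel.values():
        mx = sum(x for x, _ in obs) / len(obs)
        my = sum(y for _, y in obs) / len(obs)
        for x, y in obs:
            num += (x - mx) * (y - my)
            den += (x - mx) ** 2
    return num / den

# toy data: each "country" has its own intercept but a common slope of 2
panel = {
    "A": [(1, 12), (2, 14), (3, 16)],   # y = 10 + 2x
    "B": [(1, 52), (2, 54), (3, 56)],   # y = 50 + 2x
}
b = within_estimator(panel)
```

Pooled OLS on the raw data would be biased by the differing intercepts; the within transformation recovers the common slope exactly here.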
Procedia PDF Downloads 94976 Fabrication of Durable and Renegerable Superhydrophobic Coatings on Metallic Surfaces for Potential Industrial Applications
Authors: Priya Varshney, Soumya S. Mohapatra
Abstract:
The fabrication of anti-corrosion and self-cleaning superhydrophobic coatings for metallic surfaces that are regenerable and durable under aggressive conditions has attracted tremendous interest in materials science. In this work, superhydrophobic coatings on metallic surfaces (aluminum, steel, copper) were prepared by two-step and one-step chemical etching processes. In the two-step process, surface roughness was created by chemical etching followed by passivation of the roughened surface with low-surface-energy materials, whereas in the one-step process, roughening by chemical etching and passivation with low-surface-energy materials were done in a single step. Besides this, the effect of etchant concentration and etching time on wettability and morphology was also studied. The thermal, mechanical and ultraviolet stability of the coatings was tested, along with their regeneration and their self-cleaning, corrosion-resistance and water-repelling characteristics. The surface morphology shows the presence of rough microstructures on the treated surfaces, and contact angle measurements confirm the superhydrophobic nature. It is experimentally observed that surface roughness and contact angle increase with increasing etching time as well as with etchant concentration. The superhydrophobic surfaces show excellent self-cleaning behaviour. The coatings are stable and maintain their superhydrophobicity in acidic and alkaline solutions. Water jet impact, flotation on the water surface, and low-temperature condensation tests prove the water-repellent nature of the coatings, which are also thermally, mechanically and ultraviolet stable. These durable superhydrophobic metallic surfaces have potential industrial applications.
Keywords: superhydrophobic, water-repellent, anti-corrosion, self-cleaning
Procedia PDF Downloads 279975 The Properties of Risk-based Approaches to Asset Allocation Using Combined Metrics of Portfolio Volatility and Kurtosis: Theoretical and Empirical Analysis
Authors: Maria Debora Braga, Luigi Riso, Maria Grazia Zoia
Abstract:
Risk-based approaches to asset allocation are portfolio construction methods that do not rely on expected-return inputs for the asset classes in the investment universe and use only risk information. They include the Minimum Variance strategy (MV), the traditional (volatility-based) Risk Parity strategy (SRP), the Most Diversified Portfolio strategy (MDP) and, for many, the Equally Weighted strategy (EW). All of these approaches are based on portfolio volatility as the reference risk measure, but in 2023 the Kurtosis-based Risk Parity strategy (KRP) and the Minimum Kurtosis strategy (MK) were introduced; understandably, they use the fourth root of the portfolio fourth moment as a proxy for portfolio kurtosis in order to work with a homogeneous function of degree one. This paper contributes mainly theoretically and methodologically to the framework of risk-based asset allocation approaches, with two steps forward. First, a new and more flexible objective function considering a linear combination (with positive coefficients summing to one) of portfolio volatility and portfolio kurtosis is used to serve, alternatively, a risk minimization goal or a homogeneous risk distribution goal. Hence, the new basic idea consists of extending the goals of typical risk-based approaches to a combined risk measure. As a rationale for operating with such a risk measure, it is worth remembering that volatility and kurtosis are both expressions of uncertainty, read as dispersion of returns around the mean; both preserve adherence to a symmetric framework and consider the entire return distribution, but they differ in that the former captures the "normal"/"ordinary" dispersion of returns, while the latter is able to catch the huge dispersion.
Therefore, a combined risk metric that uses two individual metrics focused on the same phenomenon but differently sensitive to its intensity allows the asset manager, by varying the "relevance coefficient" associated with the individual metrics in the objective function, to express a wide set of plausible investment goals for the portfolio construction process while serving investors differently concerned with tail risk and traditional risk. Since this is the first study to implement risk-based approaches using a combined risk measure, it becomes of fundamental importance to investigate the portfolio effects triggered by this innovation. The paper also offers a second contribution. Until the recent advent of the MK and KRP strategies, efforts to highlight interesting properties of risk-based approaches were inevitably directed toward the traditional MV and SRP strategies. Previous literature established an increasing order in terms of portfolio volatility, starting from the MV strategy, through the SRP strategy, and arriving at the EW strategy, and provided the mathematical proof of the "equalization effect" concerning marginal risks for the MV strategy and concerning risk contributions for the SRP strategy. The development of a theoretical demonstration of similar conclusions for the MK and KRP strategies is still pending; this paper fills that gap.
Keywords: risk parity, portfolio kurtosis, risk diversification, asset allocation
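The combined objective described above, a convex combination of portfolio volatility and the fourth root of the portfolio fourth moment, can be sketched for a two-asset case with a simple grid search. The return series, the relevance coefficient, and the grid-search minimizer are all illustrative, not the paper's data or optimizer:

```python
def vol(r):
    """Portfolio volatility: square root of the second central moment."""
    m = sum(r) / len(r)
    return (sum((x - m) ** 2 for x in r) / len(r)) ** 0.5

def kurt_root(r):
    """Fourth root of the fourth central moment: a degree-one kurtosis proxy."""
    m = sum(r) / len(r)
    return (sum((x - m) ** 4 for x in r) / len(r)) ** 0.25

def combined_risk(w, ra, rb, lam):
    """lam * volatility + (1 - lam) * kurtosis proxy of the two-asset portfolio."""
    port = [w * a + (1 - w) * b for a, b in zip(ra, rb)]
    return lam * vol(port) + (1 - lam) * kurt_root(port)

# illustrative return series: asset B is calmer on average but fat-tailed
ra = [0.01, -0.01, 0.02, -0.02, 0.01, -0.01]
rb = [0.00, 0.00, 0.05, -0.05, 0.00, 0.00]
lam = 0.5  # the "relevance coefficient" weighting ordinary vs tail dispersion
grid = [i / 100 for i in range(101)]
w_star = min(grid, key=lambda w: combined_risk(w, ra, rb, lam))
```

Sweeping `lam` from 1 to 0 moves the minimizer from the pure minimum-volatility portfolio toward the pure minimum-kurtosis portfolio, which is the flexibility the paper's objective function is designed to provide.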
Procedia PDF Downloads 65
974 Significant Growth in Expected Muslim Inbound Tourists in Japan Towards 2020 Tokyo Olympic and Still Incipient Stage of Current Halal Implementations in Hiroshima
Authors: Kyoko Monden
Abstract:
Tourism has moved to the forefront of national attention in Japan since September of 2013 when Tokyo won its bid to host the 2020 Summer Olympics. The number of foreign tourists has continued to break records, reaching 13.4 million in 2014, and is now expected to hit 20 million sooner than initially targeted 2020 due to government stimulus promotions; an increase in low cost carriers; the weakening of the Japanese yen, and strong economic growth in Asia. The tourism industry can be an effective trigger in Japan’s economic recovery as foreign tourists spent two trillion yen ($16.6 million) in Japan in 2014. In addition, 81% of them were all from Asian countries, and it is essential to know that 68.9% of the world’s Muslims, about a billion people, live in South and Southeast Asia. An important question is ‘Do Muslim tourists feel comfortable traveling in Japan?’ This research was initiated by an encounter with Muslim visitors in Hiroshima, a popular international tourist destination, who said they had found very few suitable restaurants in Hiroshima. The purpose of this research is to examine halal implementation in Hiroshima and suggest the next steps to be taken to improve current efforts. The goal will be to provide anyone, Muslims included, with first class hospitality in the near future in preparation for the massive influx of foreign tourists in 2020. The methods of this research were questionnaires, face-to-face interviews, phone interviews, and internet research. First, this research aims to address the significance of growing inbound tourism in Japan, especially the expected growth in Muslim tourists. Additionally, it should address the strong popularity of eating Japanese foods in Asian Muslim countries and as ranked no. 1 thing foreign tourists want to do in Japan. 
Secondly, the current incipient stage of Hiroshima’s halal implementation at hotels, restaurants, and major public places is exposed, and the existing action plans of the Hiroshima Prefectural Government are presented. Furthermore, two surveys were conducted, one to clarify the basic halal awareness of local residents in Hiroshima and one to gauge the inconveniences faced by Muslims living there. Thirdly, the reasons for this lapse are examined and, by comparison with benchmarking data from other major tourist sites, halal implementation plans for Hiroshima are proposed. The conclusion is that, despite increasing demand for and interest in halal-friendly businesses, overall halal actions have barely been applied in Hiroshima: 76% of Hiroshima residents had no idea what halal (or halaal) meant. It is essential to increase halal awareness and its perceived importance to the economy, and to launch further actions to make Muslim tourists feel welcome in Hiroshima and the entire country.
Keywords: halaal, halal implementation, Hiroshima, inbound tourists in Japan
Procedia PDF Downloads 223
973 A Clinical Audit on Screening Women with Subfertility Using Transvaginal Scan and Hysterosalpingo Contrast Sonography
Authors: Aarti M. Shetty, Estela Davoodi, Subrata Gangooly, Anita Rao-Coppisetty
Abstract:
Background: Testing the patency of the Fallopian tubes is one of several protocols for investigating subfertile couples. Both hysterosalpingogram (HSG) and laparoscopy and dye test have been used as tubal patency tests for several years, with well-known limitations. Hysterosalpingo contrast sonography (HyCoSy) can be used as an alternative to HSG to screen the patency of the Fallopian tubes, with the advantages of being non-ionising and of allowing a transvaginal scan to diagnose pelvic pathology. Aim: To determine the indications for, and analyse the performance of, transvaginal scan and HyCoSy in Broomfield Hospital. Methods: We retrospectively analysed the fertility workup of 282 women who attended the HyCoSy clinic at our institution from January 2015 to June 2016. An audit proforma was designed to aid data collection. Data were collected from patient notes and electronic records and included patient demographics: age, parity, type of subfertility (primary or secondary), duration of subfertility, past medical history, and baseline investigations (hormone profile and semen analysis). Findings of the transvaginal scan, HyCoSy, and laparoscopy were also noted. Results: The most common indication for referral was primary fertility workup of couples who had failed to conceive despite a year of intercourse; other indications were recurrent miscarriage, history of ectopic pregnancy, reversal of sterilisation (vasectomy and tuboplasty), gynaecological surgery (loop excision, cone biopsy), and amenorrhea. The basic fertility workup showed that 34% of the men had abnormal semen analysis. HyCoSy was successfully completed in 270 (95%) women using ExEm foam and transvaginal scan. In these 270 patients, 535 tubes were examined in total: 495/535 (93%) tubes were reported as patent, and 40/535 (7.5%) as blocked. A total of 17 (6.3%) patients required a laparoscopy and dye test after HyCoSy.
In these 17 patients, 32 tubes were examined under laparoscopy, and 21 tubes had findings consistent with HyCoSy, giving a concordance rate of 65%. In addition, 41 patients had some form of pelvic pathology (endometrial polyp, fibroid, cervical polyp, bicornuate uterus) detected during the transvaginal scan and were referred for corrective surgery after attending the HyCoSy clinic. Conclusion: Our audit shows that HyCoSy and transvaginal scan can be a reliable screening test for low-risk women. Furthermore, HyCoSy has diagnostic accuracy competitive with HSG in identifying tubal patency, with the additional advantage of screening for pelvic pathology. With the addition of 3D scanning, pulsed Doppler, and other non-invasive imaging modalities, HyCoSy may potentially replace laparoscopy and chromopertubation in the near future.
Keywords: hysterosalpingo contrast sonography (HyCoSy), transvaginal scan, tubal infertility, tubal patency test
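The headline percentages in the audit follow directly from the reported counts; a minimal sketch recomputing them (all numbers are taken from the abstract itself):

```python
# Recompute the audit's reported rates from the raw counts.

def rate(part, whole):
    """Return a percentage rounded to one decimal place."""
    return round(100 * part / whole, 1)

tubes_examined = 535
tubes_patent = 495
tubes_blocked = 40

patency_pct = rate(tubes_patent, tubes_examined)   # ~92.5%, reported as 93%
blocked_pct = rate(tubes_blocked, tubes_examined)  # 7.5%

# Laparoscopy follow-up: 32 tubes re-examined, 21 agreed with HyCoSy.
concordance_pct = rate(21, 32)                     # ~65.6%, reported as 65%

print(patency_pct, blocked_pct, concordance_pct)
```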
Procedia PDF Downloads 251
972 The Impact of Anxiety on the Access to Phonological Representations in Beginning Readers and Writers
Authors: Regis Pochon, Nicolas Stefaniak, Veronique Baltazart, Pamela Gobin
Abstract:
Anxiety is known to have an impact on working memory. In reasoning or memory tasks, individuals with anxiety tend to show longer response times and poorer performance; furthermore, anxiety involves a memory bias for negative information. Given the crucial role of working memory in lexical learning, anxious students may encounter more difficulties in learning to read and spell. Anxiety could even affect an earlier stage of learning, namely the activation of phonological representations, which are decisive for learning to read and write. The aim of this study is to compare the access to phonological representations of beginning readers and writers according to their level of anxiety, using an auditory lexical decision task. Eighty students aged 6 to 9 years completed the French version of the Revised Children's Manifest Anxiety Scale and were then divided into four anxiety groups according to their total score (Low, Median-Low, Median-High, and High). Two sets of eighty-one stimuli (words and non-words) were presented auditorily to these students by means of a laptop computer. Stimulus words were selected according to their emotional valence (positive, negative, neutral). Students had to decide as quickly and accurately as possible whether the presented stimulus was a real word or not (lexical decision). Response times and accuracy were recorded automatically on each trial. It was anticipated that there would be: a) longer response times for the Median-High and High anxiety groups in comparison with the two other groups; b) faster response times for negative-valence words in comparison with positive- and neutral-valence words, only for the Median-High and High anxiety groups; c) lower response accuracy for the Median-High and High anxiety groups in comparison with the two other groups; and d) better response accuracy for negative-valence words in comparison with positive- and neutral-valence words, only for the Median-High and High anxiety groups.
Concerning response times, our results showed no difference between the four groups; furthermore, within each group, average response times were very similar regardless of emotional valence. Group differences did appear, however, in the error rates: the Median-High and High anxiety groups made significantly more errors in lexical decision than the Median-Low and Low groups. Better response accuracy for negative-valence words in comparison with positive- and neutral-valence words was, however, not found in the Median-High and High anxiety groups. Thus, the above-median anxiety groups showed lower response accuracy than the below-median groups, but without any specificity for negative-valence words. This study suggests that anxiety can negatively impact lexical processing in young students. Although lexical processing speed seems preserved, the accuracy of this processing may be altered in students with a moderate or high level of anxiety. This finding has important implications for the prevention of reading and spelling difficulties. Indeed, if anxiety affects access to phonological representations during these learnings, anxious students could be disturbed when they have to match phonological representations with new orthographic representations, because of less efficient lexical representations. This study should be continued in order to specify the impact of anxiety on basic school learning.
Keywords: anxiety, emotional valence, childhood, lexical access
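The four anxiety groups above are formed by ranking students on their total scale score and splitting at the quartiles. A minimal sketch of that grouping step (the scores and IDs here are invented for illustration; the abstract does not publish individual data):

```python
# Hypothetical illustration of quartile-based group assignment:
# rank (id, score) pairs by total anxiety score and split into four
# equal-sized groups: Low, Median-Low, Median-High, High.

def anxiety_groups(scores):
    """Split (student_id, total_score) pairs into four quartile groups."""
    ranked = sorted(scores, key=lambda pair: pair[1])
    quarter = len(ranked) // 4
    labels = ["Low", "Median-Low", "Median-High", "High"]
    groups = {}
    for rank, (student_id, _score) in enumerate(ranked):
        label = labels[min(rank // quarter, 3)]
        groups.setdefault(label, []).append(student_id)
    return groups

# Eight made-up students with made-up total scores.
sample = [(0, 3), (1, 12), (2, 7), (3, 20), (4, 15), (5, 5), (6, 18), (7, 9)]
print(anxiety_groups(sample))
```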
Procedia PDF Downloads 288
971 A Geographical Information System Supported Method for Determining Urban Transformation Areas in the Scope of Disaster Risks in Kocaeli
Authors: Tayfun Salihoğlu
Abstract:
Following Law No. 6306 on the Transformation of Areas under Disaster Risk, urban transformation in Turkey found its legal basis. In best practices all over the world, urban transformation has been shaped as part of comprehensive social programs, through discourses of renewing the economically, socially, and physically degraded parts of the city, producing spaces resistant to earthquakes and other possible disasters, and creating a livable environment. In Turkish practice, a contradictory process is observed. This study aims to develop a method for a better understanding of urban space in terms of disaster risks, in order to constitute a basis for decisions in the Kocaeli Urban Transformation Master Plan being prepared by Kocaeli Metropolitan Municipality. The spatial unit used in the study is a 50 x 50 m grid. In order to reflect the multidimensionality of urban transformation, three basic components with spatial data in Kocaeli were identified: 'Problems in Built-up Areas', 'Disaster Risks Arising from Geological Conditions of the Ground and Problems of Buildings', and 'Inadequacy of Urban Services'. Each component was weighted and scored for each grid cell. To delimit urban transformation zones, Optimized Outlier Analysis (Anselin Local Moran's I) was conducted in ArcGIS 10.6.1 to test the type of distribution (clustered or scattered) and its significance across the grid cells, taking the weighted total score of each cell as the input feature. This analysis found that the weighted total scores did not cluster significantly in all grid cells. The cells in which the input feature clustered significantly were exported as a new database for use in further mapping.
The Total Score Map reflects the significant clusters in terms of the weighted total scores of 'Problems in Built-up Areas', 'Disaster Risks Arising from Geological Conditions of the Ground and Problems of Buildings', and 'Inadequacy of Urban Services'. The cells with the highest scores are the most likely candidates for urban transformation in this citywide study. To categorize urban space in terms of urban transformation, Grouping Analysis was conducted in ArcGIS 10.6.1 on the component scores of the significantly clustered cells. Based on pseudo-F statistics and box plots, the six groups with the highest F statistics were extracted. The mapping of these groups shows that they can be interpreted meaningfully in relation to the urban space. The method presented in this study can be extended as more spatial data become available. By integrating it with other data obtained during the planning process, this method can contribute to the research and decision-making processes of urban transformation master plans on a more consistent basis.
Keywords: urban transformation, GIS, disaster risk assessment, Kocaeli
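The clustering step described above rests on Anselin's Local Moran's I statistic: a cell gets a positive value when its score and its neighbours' scores deviate from the citywide mean in the same direction (high-high or low-low clusters), and a negative value for spatial outliers. A minimal, pared-down sketch on a toy score raster, using rook (edge-sharing) neighbours with row-standardised weights; unlike ArcGIS's Optimized Outlier Analysis, it omits the permutation-based significance test:

```python
# Simplified Anselin Local Moran's I on a 2-D grid of weighted total
# scores. Rook contiguity, row-standardised weights, no significance test.

def local_morans_i(grid):
    """Return a grid of local Moran's I values for a list-of-lists raster."""
    rows, cols = len(grid), len(grid[0])
    n = rows * cols
    mean = sum(v for row in grid for v in row) / n
    z = [[v - mean for v in row] for row in grid]        # deviations
    m2 = sum(zv ** 2 for row in z for zv in row) / n     # variance term
    result = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            nbrs = [z[r + dr][c + dc]
                    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1))
                    if 0 <= r + dr < rows and 0 <= c + dc < cols]
            lag = sum(nbrs) / len(nbrs)                  # row-standardised lag
            result[r][c] = z[r][c] * lag / m2
    return result

# Toy 5x5 score surface with one high-score cluster in a corner.
scores = [[10.0 if (r < 2 and c < 2) else 0.0 for c in range(5)]
          for r in range(5)]
ii = local_morans_i(scores)
print(ii[0][0] > 0)  # high cell among high neighbours: positive I
```

Cells with significantly positive I in the high-high quadrant correspond to the exported candidate transformation zones in the study.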
Procedia PDF Downloads 120
970 Applications of Digital Tools, Satellite Images and Geographic Information Systems in Data Collection of Greenhouses in Guatemala
Authors: Maria A. Castillo H., Andres R. Leandro, Jose F. Bienvenido B.
Abstract:
During the last 20 years, the globalization of economies, population growth, and the increase in the consumption of fresh agricultural products have generated greater demand for ornamentals, flowers, fresh fruits, and vegetables, mainly from tropical areas. This market situation has demanded greater competitiveness and control over production, with more efficient protected-agriculture technologies that provide greater productivity and guarantee the required quality and quantity in a constant and sustainable way. Guatemala, located in the north of Central America, is one of the largest exporters of agricultural products in the region, exporting fresh vegetables, flowers, fruits, ornamental plants, and foliage, most of which are grown in greenhouses. Although there are no official agricultural statistics on greenhouse production, several theses and congress reports have presented consistent estimates. A wide range of protection structures and roofing materials is used, from the most basic and simple structures for rain control to highly technical and automated ones connected to remote sensors for crop monitoring and control. Given this breadth of technological models, it is necessary to analyse georeferenced data on the cultivated area, the different existing models, and the covering materials, integrated with altitude, climate, and soil data. The georeferenced registration of production units, data collection with digital tools, the use of satellite images, and geographic information systems (GIS) provide reliable tools for elaborating more complete, agile, and dynamic information maps. This study details a proposed methodology for gathering georeferenced data on high-protection structures (greenhouses) in Guatemala, structured in four phases: diagnosis of available information, definition of the geographic frame, selection of satellite images, and integration with a geographic information system (GIS).
It takes particular account of the current lack of complete data needed for a reliable decision-making system; this gap is addressed by the proposed methodology. A summary of the results is presented for each phase, and finally an evaluation is added, with some improvements and tentative recommendations for further research. The main contribution of this study is a methodology that reduces the gap in georeferenced data on protected agriculture in this specific area, where data are not generally available, and provides data of better quality, traceability, accuracy, and certainty for strategic agricultural decision-making, applicable to other crops, production models, and similar or neighbouring geographic areas.
Keywords: greenhouses, protected agriculture, GIS, Guatemala, satellite image, digital tools, precision agriculture
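The second phase, defining the geographic frame, amounts to fixing a spatial extent and keeping only georeferenced records that fall inside it. A minimal sketch of that step; the field names, record IDs, cover types, and the rough bounding box for Guatemala are all illustrative assumptions, not values from the study:

```python
# Hypothetical sketch of the "geographic frame" phase: filter georeferenced
# greenhouse records by a lat/lon bounding box. All values are illustrative.

# Approximate bounding box for Guatemala (degrees, WGS84) - an assumption.
GUATEMALA_FRAME = {"lat_min": 13.7, "lat_max": 17.8,
                   "lon_min": -92.3, "lon_max": -88.2}

# Made-up field records, as a digital data-collection tool might export them.
greenhouses = [
    {"id": "GH-001", "lat": 14.55, "lon": -90.73, "cover": "plastic film"},
    {"id": "GH-002", "lat": 15.47, "lon": -90.38, "cover": "shade mesh"},
    {"id": "GH-003", "lat": 19.30, "lon": -99.10, "cover": "glass"},  # outside
]

def in_frame(record, frame):
    """True when a record's coordinates fall inside the geographic frame."""
    return (frame["lat_min"] <= record["lat"] <= frame["lat_max"]
            and frame["lon_min"] <= record["lon"] <= frame["lon_max"])

inside = [g["id"] for g in greenhouses if in_frame(g, GUATEMALA_FRAME)]
print(inside)  # ['GH-001', 'GH-002']
```

In the full methodology, the records retained here would then be overlaid on the selected satellite images and loaded into the GIS for mapping.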
Procedia PDF Downloads 194