Search results for: WSN design requirements
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13940


8570 Development of Peptide Inhibitors against Dengue Virus Infection by in Silico Design

Authors: Aussara Panya, Nunghathai Sawasdee, Mutita Junking, Chatchawan Srisawat, Kiattawee Choowongkomon, Pa-Thai Yenchitsomanus

Abstract:

Dengue virus (DENV) infection is a global public health problem, with approximately 100 million cases a year. At present, there is no approved vaccine or effective drug available; the development of anti-DENV drugs is therefore urgently needed. Clinical reports have previously revealed a positive association between disease severity and viral titer, suggesting that anti-DENV drug therapy could ameliorate disease severity. Although several anti-DENV agents have shown inhibitory activity against DENV infection, to date none has reached clinical use. The surface envelope (E) protein of DENV is critical for the viral entry step, which includes attachment and membrane fusion; blocking the envelope protein is thus an attractive strategy for anti-DENV drug development. In the search for a safe anti-DENV agent, this study aimed to identify novel peptide inhibitors of DENV infection targeting the E protein using structure-based in silico design. Two strategies were used: first, identifying a peptide inhibitor that interferes with the membrane fusion process, with the hydrophobic pocket on the E protein as the target; and second, destabilizing the virion structure by disrupting the interaction between the envelope and membrane proteins. In the first strategy, molecular docking was used to search for peptide inhibitors that bind specifically to the hydrophobic pocket. In the second, a peptide inhibitor was designed to mimic the ectodomain portion of the membrane protein and thereby disrupt the protein-protein interaction. The designed peptides were tested for their effects on cell viability, to measure peptide toxicity to the cells, and assayed for their ability to inhibit DENV infection in Vero cells.
Furthermore, their antiviral effects on viral replication, intracellular protein level, and viral production were examined using qPCR, a cell-based flavivirus immunodetection assay, and immunofluorescence. None of the tested peptides showed a significant effect on cell viability. The small peptide inhibitor obtained from molecular docking, Glu-Phe (EF), effectively inhibited DENV infection in a cell culture system. Its strongest effect was observed for DENV2, with a half-maximal inhibitory concentration (IC50) of 96 μM; it only partially inhibited the other serotypes. Treatment of infected cells with 200 µM EF also significantly reduced the viral genome and protein levels to 83.47% and 84.15%, respectively, corresponding to the reduction in the number of infected cells. An additional approach used a peptide mimicking the membrane (M) protein, namely MLH40. Treatment with MLH40 reduced foci formation in all four DENV serotypes (DENV1-4), with IC50 values of 24-31 μM. Further characterization suggested that MLH40 specifically blocked viral attachment to the host membrane; treatment with 100 μM diminished viral attachment by 80%. In summary, targeting the hydrophobic pocket and the M-binding site on the E protein with peptide inhibitors could inhibit DENV infection. The results provide proof of concept for the development of therapeutic peptide inhibitors against DENV infection through structure-based design targeting conserved viral proteins.
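The reported IC50 values can be related to expected inhibition at a given dose through a standard dose-response (Hill) curve. The sketch below is purely illustrative, assumes a Hill coefficient of 1, and is not part of the authors' analysis.

```python
def percent_inhibition(conc_um: float, ic50_um: float, hill: float = 1.0) -> float:
    """Expected % inhibition at a given concentration for a simple Hill curve."""
    return 100.0 * conc_um**hill / (ic50_um**hill + conc_um**hill)

# At the reported IC50 of EF against DENV2 (96 uM), inhibition is 50% by definition.
print(percent_inhibition(96.0, 96.0))  # 50.0
```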

Keywords: dengue virus, dengue virus infection, drug design, peptide inhibitor

Procedia PDF Downloads 340
8569 Underground Coal Gasification Technology in Türkiye: A Techno-Economic Assessment

Authors: Fatma Ünal, Hasancan Okutan

Abstract:

Increasing worldwide population and technological requirements lead to an increase in energy demand every year. This demand has mainly been supplied from fossil fuels such as coal and petroleum due to insufficient natural gas resources. In recent years, coal reserves in Türkiye have reached almost 21 billion tons. These are mostly lignite (92.7%), which contains high levels of moisture and sulfur. Underground coal gasification is one of the most suitable methods, compared with direct combustion techniques, for exploiting such coal types. In this study, the applicability of the underground coal gasification process is investigated for the Eskişehir-Alpu lignite reserve as a pilot region, both technologically and economically. It is assumed that electricity is produced from the obtained synthesis gas in an integrated gasification combined cycle (IGCC). Firstly, an equilibrium model was developed using the thermodynamic properties of the gasification reactions. The effects of the type of oxidizing gas, the sulfur content of the coal, the water vapor/air ratio, and the system pressure were investigated to find optimum process conditions. Secondly, the parallel and linear controlled retraction injection point (CRIP) models were implemented as drilling methods, and costs were calculated for different oxidizing agents (air and high-purity O2). In parallel CRIP (P-CRIP), the drilling cost is found to be lower than in linear CRIP (L-CRIP), since two coal beds are gasified simultaneously. CO2 capture and storage (CCS) was the largest contributor to the total cost in both models. The cost of the synthesis gas produced varies between 0.02 $/Mcal and 0.09 $/Mcal. This is a promising result when compared with the selling price of natural gas in Türkiye for Q1-2023 (0.103 $/Mcal).
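The headline comparison can be sketched numerically. The 0.02-0.09 $/Mcal syngas cost band and the 0.103 $/Mcal natural gas price come from the abstract; the demand figure and the saving calculation below are a hypothetical illustration, not the authors' techno-economic model.

```python
# Türkiye natural gas selling price, Q1-2023 ($/Mcal), as quoted in the abstract.
NATURAL_GAS_PRICE = 0.103

def syngas_saving(syngas_cost: float, demand_mcal: float) -> float:
    """Saving ($) from substituting UCG syngas for natural gas over a demand."""
    return (NATURAL_GAS_PRICE - syngas_cost) * demand_mcal

# Cheapest and most expensive reported syngas costs, per 1e9 Mcal of demand.
best_case = syngas_saving(0.02, 1e9)
worst_case = syngas_saving(0.09, 1e9)
```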

Keywords: energy, lignite reserve, techno-economic analysis, underground coal gasification

Procedia PDF Downloads 50
8568 Collective Potential: A Network of Acupuncture Interventions for Flood Resilience

Authors: Sachini Wickramanayaka

Abstract:

The occurrence of natural disasters has increased at an alarming rate in recent times due to the escalating effects of climate change. One such natural disaster that has continued to grow in frequency and intensity is ‘flooding’, which adversely affects communities around the globe. This is an exploration of how architecture can intervene and help preserve communities in the face of disaster, specifically in battling floods. ‘Resilience’ is one of the concepts that has been put forward to be instilled in vulnerable communities to lower the impact of such disasters, as a preventative and coping mechanism. While there are a number of ways to achieve resilience in the built environment, this paper aims to create a synthesis between resilience and ‘urban acupuncture’. It will consider strengthening communities from within, by layering a network of relatively small-scale, fast-paced interventions on top of pre-existing conventional large-scale flood-prevention engineering infrastructure. By investigating ‘The Woodlands’, a planned neighborhood, as a case study, this paper will argue that large-scale water management solutions, while extremely important, will not suffice as a single solution, particularly during a time of frequent and extreme weather events. The different projects will try to synthesize non-architectural aspects such as neighborhood aspirations, requirements, potential, and awareness into a network of architectural forms that would collectively increase neighborhood resilience to floods. A mapping study of the selected area will identify the problematic areas of the neighborhood that flood, while empirical data from previously implemented case studies will assess the success of each solution. If successful, the different solutions for each of the identified problem areas will exhibit how flooding and water management can be integrated as part and parcel of daily life.

Keywords: acupuncture, architecture, resiliency, micro-interventions, neighborhood

Procedia PDF Downloads 152
8567 Design and Fabrication of Stiffness Reduced Metallic Locking Compression Plates through Topology Optimization and Additive Manufacturing

Authors: Abdulsalam A. Al-Tamimi, Chris Peach, Paulo Rui Fernandes, Paulo J. Bartolo

Abstract:

Bone fixation implants currently used to treat traumatic bone fractures and to promote fracture healing are built from biocompatible metallic materials such as stainless steel, cobalt chromium, and titanium and its alloys (e.g., CoCrMo and Ti6Al4V). The noticeable stiffness mismatch between current metallic implants and the host bone is associated with negative outcomes such as stress shielding, which causes bone loss and implant loosening, leading to deficient fracture treatment. This paper, part of a major research program to design the next generation of bone fixation implants, describes the combined use of three-dimensional (3D) topology optimization (TO) and additive manufacturing powder bed technology (electron beam melting) to redesign and fabricate plates based on the current standard design (i.e., the locking compression plate). Topology optimization is applied with an objective function that maximizes stiffness, constrained by volume reductions (i.e., 25-75%), in order to obtain optimized implant designs with a reduced stress shielding phenomenon under different boundary conditions (i.e., tension, bending, torsion, and combined loads). The stiffness of the original and optimized plates is assessed through a finite-element study. The TO results showed an actual reduction in stiffness for most of the plates due to the critical values of volume reduction. Additionally, the optimized plates fabricated using powder bed techniques proved that the integration of TO and additive manufacturing is capable of producing stiffness-reduced plates with acceptable tolerances.
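The optimization the abstract describes, maximum stiffness under a prescribed volume reduction, is conventionally posed as compliance minimization. A generic statement of that problem, with symbols assumed rather than taken from the paper, is:

```latex
\begin{aligned}
\min_{\boldsymbol{\rho}}\;& c(\boldsymbol{\rho}) = \mathbf{U}^{\mathsf{T}}\mathbf{K}(\boldsymbol{\rho})\,\mathbf{U}
&& \text{(compliance = inverse stiffness)}\\
\text{s.t.}\;& \mathbf{K}(\boldsymbol{\rho})\,\mathbf{U} = \mathbf{F}
&& \text{(equilibrium)}\\
& V(\boldsymbol{\rho}) \le (1-r)\,V_0, \quad r \in [0.25,\,0.75]
&& \text{(prescribed volume reduction)}\\
& 0 < \rho_{\min} \le \rho_e \le 1
&& \text{(element densities)}
\end{aligned}
```

Here $r$ plays the role of the 25-75% volume reduction applied under each load case (tension, bending, torsion, combined).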

Keywords: additive manufacturing, locking compression plate, finite element, topology optimization

Procedia PDF Downloads 188
8566 Simulation of Bird Strike on Airplane Wings by Using SPH Methodology

Authors: Tuğçe Kiper Elibol, İbrahim Uslan, Mehmet Ali Guler, Murat Buyuk, Uğur Yolum

Abstract:

According to an FAA report, 142,603 bird strikes were reported over a period of 24 years, between 1990 and 2013. Bird strikes on aerospace structures not only threaten flight safety but also cause financial loss and put lives in danger. The statistics show that most bird strikes happen on the nose and the leading edge of the wings. A substantial number of bird strikes are also absorbed by the jet engines, causing damage to the blades and engine body. Crash-proof designs are required to overcome the possibility of catastrophic failure of the airplane. Using computational methods for bird strike analysis during the product development phase is of considerable importance in terms of cost saving. Clearly, using simulation techniques to reduce the number of reference tests can dramatically affect the total cost of an aircraft, since for bird strikes full-scale tests are often required. Therefore, the development of validated numerical models that can replace preliminary tests and accelerate the design cycle is required. In this study, to verify the simulation parameters for a bird strike analysis, several different numerical options are studied for an impact case against a primitive structure. Then, a representative bird model is generated with the verified parameters and impacted against the leading edge of a training aircraft wing, in which each structural member of the wing is explicitly modeled. A nonlinear explicit dynamics finite element code, LS-DYNA, was used for the bird impact simulations. The SPH methodology was used to model the behavior of the bird. The dynamic behavior of the wing superstructure was observed and will be used for further design optimization purposes.
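The SPH discretization used to model the bird rests on a smoothing kernel that weights neighboring particles. A common choice, sketched below for illustration and not necessarily the kernel used in this LS-DYNA study, is Monaghan's cubic spline in 3D:

```python
import numpy as np

def cubic_spline_kernel(r: float, h: float) -> float:
    """Monaghan cubic spline SPH smoothing kernel in 3D.

    r: distance between two particles; h: smoothing length.
    Support is compact: the kernel vanishes beyond r = 2h.
    """
    q = r / h
    sigma = 1.0 / (np.pi * h**3)  # 3D normalization constant
    if q <= 1.0:
        return sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
    elif q <= 2.0:
        return sigma * 0.25 * (2.0 - q)**3
    return 0.0
```

Field quantities at a particle (density, pressure) are then sums of neighbor contributions weighted by this kernel, which is what lets the "bird" flow like a fluid on impact without a distorting mesh.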

Keywords: bird impact, bird strike, finite element modeling, smoothed particle hydrodynamics

Procedia PDF Downloads 310
8565 Re-Designing Community Foodscapes to Enhance Social Inclusion in Sustainable Urban Environments

Authors: Carles Martinez-Almoyna Gual, Jiwon Choi

Abstract:

Urban communities face risks of disintegration and segregation as a consequence of globalised migration towards urban environments. Linking social and cultural components with environmental and economic dimensions is the goal of all the disciplines that aim to shape more sustainable urban environments. Solutions require interdisciplinary approaches and a complex array of tools. One of these tools is urban farming, which provides a wide range of advantages for creating more inclusive spaces and integrated communities. Since food is strongly related to the values and identities of any cultural group, it can be used as a medium to promote social inclusion in urban multicultural societies. By bringing people together at specific urban sites, food production can be integrated into multifunctional spaces while addressing social, economic and ecological goals. The goal of this research is to assess different approaches to urban agriculture by analysing three existing community gardens located in Newtown, a suburb of Wellington, New Zealand. As a research context, Newtown offers different approaches to urban farming and is particularly valuable for observing current trends of socialisation in diverse, multicultural societies. All three spaces are located on public land owned by Wellington City Council and confined to a small, complex and progressively denser urban area. The analysis focused on social, cultural and physical dimensions, combining community engagement with different techniques of spatial assessment. At the same time, a detailed investigation of each community garden was conducted using comparative analysis methodologies. This multidirectional setting of the analysis was established to extract both specific and typological knowledge from the case studies. Each site was analysed and categorised under three broad themes: people, space and food.
The analysis revealed that the three case studies had markedly different spatial settings, different approaches to food production, and different profiles of supporting communities. The main differences identified concerned demographics, values, objectives, internal organisation, appropriation, and perception of the space. The community gardens were approached as case studies for developing design research. Following participatory design processes with the different communities, the knowledge gained from the analysis was used to propose changes to the physical environment. The end goal of the design research was to improve the capacity of the spaces to facilitate social inclusiveness. In order to generate tangible changes, a range of small, strategic and feasible spatial interventions was explored. The smallness of the proposed interventions facilitates implementation by reducing time frames, technical resources, funding needs, and legal processes, working within the community's own realm. These small interventions are expected to be implemented over time as part of an ongoing collaboration between the different communities, the university, and the local council. The applied research methodology showcases the capacity of universities to develop civic engagement by working with real communities that have concrete needs and face overall threats of disintegration and segregation.

Keywords: community gardening, landscape architecture, participatory design, placemaking, social inclusion

Procedia PDF Downloads 112
8564 Save Lives: The Application of Geolocation-Awareness Service in Iranian Pre-hospital EMS Information Management System

Authors: Somayeh Abedian, Pirhossein Kolivand, Hamid Reza Lornejad, Amin Karampour, Ebrahim Keshavarz Safari

Abstract:

For emergency and relief service providers such as pre-hospital emergency services, quick arrival at the scene of an accident or any EMS mission is one of the most important requirements of effective service delivery. Response time (the interval between the time of the call and the time of arrival on scene) is a critical factor in determining the quality of pre-hospital Emergency Medical Services (EMS). This is especially important for heart attack, stroke, or accident patients. Location-based e-services can be broadly defined as any service that provides information pertinent to the current location of an active mobile handset, or the precise address of a landline phone call, at a specific time window, regardless of the underlying delivery technology used to convey the information. According to research, one of the effective methods of meeting this goal is determining the location of the caller through the cooperation of landline and mobile phone operators in the country. The follow-up of the Communications Regulatory Authority (CRA) has resulted in the receipt of two separate secured electronic web services. Thus, to ensure personal privacy, a secure technical architecture was required for launching these services in the pre-hospital EMS information management system. In addition, to speed medics' arrival at the patient's bedside, rescue vehicles should make use of an intelligent transportation system that estimates road traffic using a GPS-based mobile navigation system independent of the Internet. This paper seeks to illustrate the architecture of the practical national model used by the Iranian EMS organization.
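Once a location-inquiry service returns caller coordinates, a dispatch system can rank ambulances by estimated time of arrival. The sketch below is a minimal illustration of that step, using great-circle distance and a constant average speed; the real system described here uses road-traffic-aware navigation, so this is an assumption-laden simplification, not the Iranian EMS algorithm.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance (km) between a caller and an ambulance."""
    R = 6371.0  # mean Earth radius, km
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2)**2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2)**2
    return 2 * R * asin(sqrt(a))

def eta_minutes(dist_km: float, avg_speed_kmh: float = 40.0) -> float:
    """Crude ETA assuming a constant average urban speed (assumed value)."""
    return dist_km / avg_speed_kmh * 60.0
```

A dispatcher would then sort available units by `eta_minutes(haversine_km(...))` and assign the nearest one.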

Keywords: response time, geographic location inquiry service (GLIS), location-based service (LBS), emergency medical services information system (EMSIS)

Procedia PDF Downloads 155
8563 Maximization of Lifetime for Wireless Sensor Networks Based on Energy Efficient Clustering Algorithm

Authors: Frodouard Minani

Abstract:

Over the last decade, wireless sensor networks (WSNs) have been used in many areas such as health care, agriculture, defense, the military, and disaster-hit areas. A wireless sensor network consists of a base station (BS) and a number of wireless sensors that monitor temperature, pressure, or motion under different environmental conditions. The key parameter in designing a protocol for wireless sensor networks is energy efficiency: energy is the scarcest resource of sensor nodes, and it determines their lifetime. Maximizing sensor node lifetime is an important issue in the design of applications and protocols for wireless sensor networks, and clustering the sensor nodes is an effective topology control approach for achieving this goal. In this paper, the researcher presents an energy-efficient protocol to prolong the network lifetime based on an energy-efficient clustering algorithm. The Low Energy Adaptive Clustering Hierarchy (LEACH) is a cluster-based routing protocol used to lower energy consumption and improve the lifetime of wireless sensor networks. Minimizing energy dissipation and maximizing network lifetime are important matters in the design of applications and protocols for wireless sensor networks. The proposed system maximizes the lifetime of the network by choosing the farthest cluster head (CH) instead of the closest CH and by forming clusters using parameter metrics such as node density, residual energy, and the distance between clusters (inter-cluster distance). In this paper, the proposed protocol is compared with existing protocols in different scenarios, and the simulation results show that it performs well over the comparative protocols in various scenarios.
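For context, the baseline LEACH election rule against which such proposals are usually compared rotates the cluster-head role stochastically each round. A minimal sketch of standard LEACH (not the modified farthest-CH protocol proposed here) is:

```python
import random

def leach_threshold(p: float, r: int) -> float:
    """Standard LEACH threshold T(n): p is the desired CH fraction, r the round.

    Nodes that have already served as CH in the current epoch are normally
    excluded; that bookkeeping is omitted in this sketch.
    """
    return p / (1.0 - p * (r % int(round(1.0 / p))))

def elect_cluster_heads(node_ids, p, r, rng=random.Random(42)):
    """Each node draws a uniform number and self-elects as CH if below T(n)."""
    t = leach_threshold(p, r)
    return [n for n in node_ids if rng.random() < t]
```

Note how the threshold rises toward 1 as the epoch ends, so every node eventually takes a turn as cluster head and the energy cost of aggregation is spread across the network.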

Keywords: base station, clustering algorithm, energy efficient, sensors, wireless sensor networks

Procedia PDF Downloads 126
8562 Making Food Science Education and Research Activities More Attractive for University Students and Food Enterprises by Utilizing Open Innovative Space-Approach

Authors: Anna-Maria Saarela

Abstract:

At Savonia University of Applied Sciences (UAS), the curriculum and studies have been improved by applying an Open Innovation Space (OIS) approach based on multidisciplinary action learning. The key elements of the OIS ideology are work-life orientation and student-centric communal learning. In this approach, every participant can learn from the others, and innovations are created. In this social-innovation educational approach, all practices are carried out in close collaboration with enterprises in real-life settings, not in classrooms. As an example, this paper shows how Savonia UAS's Future Food RDI hub (FF) implements OIS practices by providing food product development and consumer research services for enterprises in close collaboration with academics, students, and consumers. In particular, one example of OIS experimentation in the field is provided by a consumer study carried out using a verbal analysis protocol combined with audio-visual observation (VAP-WAVO). In this case, all co-learners acted together in supermarket settings to collect relevant data for the product development and marketing departments of a company. The company benefitted from the results obtained, students were more satisfied with their studies, and educators and academics obtained good evidence for further collaboration as well as for renewing curriculum contents based on the requirements of working life. In addition, society will benefit over time as young university graduates find careers more easily through their OIS-related food science studies. This knowledge interaction model also renews educational practices and brings working life closer to educational research institutes.

Keywords: collaboration, education, food science, industry, knowledge transfer, RDI, student

Procedia PDF Downloads 360
8561 Daylight Performance of a Single Unit in Distinct Arrangements

Authors: Rifat Tabassoom

Abstract:

Recently, multistoried housing projects have been accelerating in Dhaka, the capital of Bangladesh, to house its massive population. Insufficient background research has led to a building design trend in which a single unit is designed and then repeated throughout the buildings. Therefore, despite having identical designs, the units do not all perform evenly with respect to daylight, which in turn alters the household activities they can support. This paper aims to understand whether a single unit can be an optimum solution regarding daylight for a selected housing project.

Keywords: daylight, orientation, performance, simulations

Procedia PDF Downloads 106
8560 Examining the Design of a Scaled Audio Tactile Model for Enhancing Interpretation of Visually Impaired Visitors in Heritage Sites

Authors: A. Kavita Murugkar, B. Anurag Kashyap

Abstract:

With the Rights of Persons with Disabilities Act (RPWD Act) 2016, the Indian government has made it mandatory for all establishments, including heritage sites, to be accessible to people with disabilities. However, recent access audit surveys done under the Accessible India Campaign by the Ministry of Culture indicate that very few accessibility measures are provided at heritage sites for people with disabilities. Though there are some measures for the mobility impaired, the surveys revealed that there are almost no provisions for people with vision impairment (PwVI) at heritage sites, depriving them of reasonable physical and intellectual access that facilitates an enjoyable experience and an enriching interpretation of the site. There is a growing need to develop multisensory interpretative tools that can help PwVI perceive heritage sites in the absence of vision. The purpose of this research was to examine the usability of an audio-tactile model as a haptic and sound-based strategy for augmenting the perception and experience of PwVI at a heritage site. The first phase of the project was a multi-stage phenomenological experimental study with visually impaired users to investigate the design parameters for an audio-tactile model for PwVI. The findings from this phase included user preferences related to the physical design of the model, such as size, scale, materials, and details, and the information it should carry, such as braille, audio output, and tactile text. In the second phase, a working prototype of an audio-tactile model was designed and developed for a heritage site based on the findings of the first phase. A nationally listed heritage site in the author's city was selected for making the model. Lastly, the model was tested by visually impaired users for final refinements and validation.
The prototype developed empowers people with vision impairment to navigate independently at heritage sites. Such a model, if installed at every heritage site, can serve as a technological guide for a person with vision impairment, giving information on the architecture, details, planning and scale of the buildings, the entrances, the location of important features, lifts and staircases, and the available accessible facilities. The model was constructed using 3D modeling and digital printing technology. Though designed for the Indian context, this assistive technology for the blind can be explored for wider applications across the globe. Such an accessible solution can change the otherwise "incomplete" perception of the disabled visitor, in this case a visually impaired visitor, and augment the quality of their experience at heritage sites.

Keywords: accessibility, architectural perception, audio tactile model, inclusive heritage, multi-sensory perception, visual impairment, visitor experience

Procedia PDF Downloads 94
8559 Control of a Quadcopter Using Genetic Algorithm Methods

Authors: Mostafa Mjahed

Abstract:

This paper concerns the control of a nonlinear system using two different methods: a reference model and a genetic algorithm. The quadcopter is a nonlinear, unstable system belonging to the family of aerial robots. It consists of four rotors placed at the ends of a cross, with the control circuit at the center of the cross. Its motion is governed by six degrees of freedom: three rotations around the three axes (roll, pitch, and yaw) and three spatial translations. The control of such a system is complex because of the nonlinearity of its dynamic representation and the number of parameters involved. Numerous studies have been developed to model and stabilize such systems. The classical PID and LQ correction methods are widely used. While these have the advantage of simplicity, being linear, they have the drawback of requiring a linear model for synthesis. This also implies complexity in the resulting control laws, since they must remain valid over the entire flight domain of the quadcopter. Note that, while classical design methods are widely used to control aeronautical systems, artificial intelligence methods such as the genetic algorithm technique receive little attention. In this paper, we compare two PID design methods. First, the parameters of the PID are calculated according to a reference model. In a second phase, these parameters are established using genetic algorithms. By reference model, we mean that the corrected system behaves like a reference system imposed by specifications: settling time, zero overshoot, etc. Inspired by the natural evolution of Darwin's theory, which advocates the survival of the fittest, John Holland developed this evolutionary algorithm. The genetic algorithm (GA) possesses three basic operators: selection, crossover, and mutation. We start the iterations with an initial population, and each member of this population is evaluated through a fitness function.
Our purpose is to correct the behavior of the quadcopter around three axes (roll, pitch and yaw) with 3 PD controllers. For the altitude, we adopt a PID controller.
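The GA phase can be illustrated in miniature: evolving PD gains for a toy single-axis model, with a double integrator standing in for one rotation axis. All numerical values, the fitness definition (integral of absolute error), and the GA settings below are assumptions made for the sketch, not the paper's.

```python
import random

# Toy roll-axis model: I * theta'' = u, a common simplification of one
# quadcopter rotation axis. All constants here are illustrative assumptions.
I, DT, T_END, REF = 0.02, 0.01, 2.0, 1.0

def fitness(gains):
    """Integral of absolute error for a PD-controlled unit-step response."""
    kp, kd = gains
    theta = omega = 0.0
    iae, t = 0.0, 0.0
    while t < T_END:                    # explicit Euler integration
        e = REF - theta
        u = kp * e - kd * omega         # PD control law
        omega += (u / I) * DT
        theta += omega * DT
        iae += abs(e) * DT
        t += DT
    return iae

def ga(pop_size=30, gens=40, rng=random.Random(1)):
    """Selection (elitist truncation), blend crossover, Gaussian mutation."""
    pop = [(rng.uniform(0, 5), rng.uniform(0, 1)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]            # selection: keep best half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            w = rng.random()                        # blend crossover
            child = tuple(w * x + (1 - w) * y for x, y in zip(a, b))
            child = tuple(max(0.0, g + rng.gauss(0, 0.05)) for g in child)  # mutation
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = ga()  # evolved (Kp, Kd) pair for this toy axis
```

In the paper's setting the same loop would run once per axis (roll, pitch, yaw), with a PID rather than PD structure for altitude.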

Keywords: quadcopter, genetic algorithm, PID, fitness, model, control, nonlinear system

Procedia PDF Downloads 411
8558 The Impacts of Green Logistics Management Practices on Sustainability Performance in Nigeria

Authors: Ozoemelam Ikechukwu Lazarus, Nizamuddin B. Zainuddin, Abdul Kafi

Abstract:

Numerous studies have been carried out on green logistics management practices (GLMPs) across the globe, but the practices and performance of green supply chains in Africa in particular have not gained enough scholarly attention. Moreover, the majority of supply chain sustainability research focuses on environmental sustainability. Logistics has been a major cause of supply chain resource waste and environmental damage. Many sectors of the economy that engage in logistical operations rely significantly on vehicles, which emit pollutants into the environment. Due to urbanization and industrialization, the logistical operations of manufacturing companies represent a serious hazard to society and human life, even as the sector is one of the fastest expanding in the world today. Logistics companies face numerous difficulties when attempting to implement green logistics practices along their supply chains. In Nigeria, manufacturing companies aspire to implement reverse logistics in response to stakeholders' requirements to reduce negative environmental consequences. However, implementing this is impeded by a complex set of criteria and necessitates careful analysis of how such criteria interact with each other in the presence of uncertainty. This study integrates most of the green logistics management practices (GLMPs) into Nigerian firms to improve generalizability and credibility. It examines the effect of green logistics management practices on environmental performance, social performance, market performance, and financial performance in the logistics industry. It seeks to identify the critical success factors in order to develop a model that incorporates different factors, from the perspectives of technology, organization, people, and environment, to inform the adoption and use of technologies for logistics supply chain social sustainability in Nigeria. It uses an exploratory research approach to collect and analyse the data.

Keywords: logistics, management, sustainability, environment, operations

Procedia PDF Downloads 36
8557 Experimental and Modal Determination of the State-Space Model Parameters of a Uni-Axial Shaker System for Virtual Vibration Testing

Authors: Jonathan Martino, Kristof Harri

Abstract:

In some cases, the increase in computing resources makes simulation methods more affordable. The increase in processing speed also allows real-time, or even faster, test analysis, offering a real tool for test prediction and design process optimization. Vibration tests are no exception to this trend. So-called 'virtual vibration testing' offers solutions, among others, to study the influence of specific loads, to better anticipate the boundary conditions between the exciter and the structure under test, and to study the influence of small changes in the structure under test. This article first presents a virtual vibration test model, with a main focus on the shaker model, and afterwards presents the experimental determination of its parameters. The classical way of modeling a shaker is to consider it as a simple mechanical structure augmented by an electrical circuit that makes the shaker move. The shaker is modeled as a two- or three-degrees-of-freedom lumped-parameter model, while the electrical circuit takes the coil impedance and the dynamic back-electromotive force into account. The establishment of the equations of this model, describing the dynamics of the shaker, is presented in this article and is strongly related to the internal physical quantities of the shaker. Those quantities are reduced to global parameters which are estimated through experiments. Different experiments are carried out in order to design an easy and practical method for identifying the shaker parameters, leading to a fully functional shaker model. An experimental modal analysis is also carried out to extract the modal parameters of the shaker and to combine them with the electrical measurements. Finally, the article concludes with an experimental validation of the model.
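The electromechanical coupling described, lumped mechanics plus a voice-coil circuit with back-EMF, leads naturally to a state-space form. The sketch below is a single-DOF simplification (the article uses two or three DOFs), and every parameter value is a placeholder, not an identified shaker parameter.

```python
import numpy as np

# Single-DOF electrodynamic shaker sketch: armature mass m on suspension
# stiffness k and damping c, driven by a voice coil with resistance R,
# inductance L and force factor Bl. All values are illustrative placeholders.
m, k, c = 0.5, 2.0e4, 20.0      # kg, N/m, N*s/m
R, L_, Bl = 2.0, 1.0e-3, 15.0   # ohm, H, N/A

# States: x (armature displacement), v (velocity), i (coil current).
# Input: amplifier voltage u. Mechanics: m v' = -k x - c v + Bl i.
# Circuit: L i' = u - R i - Bl v  (the Bl*v term is the back-EMF).
A = np.array([
    [0.0,      1.0,      0.0],
    [-k / m,  -c / m,    Bl / m],
    [0.0,     -Bl / L_, -R / L_],
])
B = np.array([[0.0], [0.0], [1.0 / L_]])
C = np.array([[0.0, 1.0, 0.0]])  # measured output: armature velocity
```

Identification then amounts to estimating the global parameters (m, k, c, R, L, Bl) from electrical and modal measurements, after which (A, B, C) is fully defined.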

Keywords: lumped parameters model, shaker modeling, shaker parameters, state-space, virtual vibration

Procedia PDF Downloads 257
8556 The Future of Insurance: P2P Innovation versus Traditional Business Model

Authors: Ivan Sosa Gomez

Abstract:

Digitalization has impacted the entire insurance value chain, and the growing movement towards P2P platforms and the collaborative economy is also beginning to have a significant impact. P2P insurance is defined as an innovation enabling policyholders to pool their capital, self-organize, and self-manage their own insurance. In this context, new InsurTech start-ups are emerging as peer-to-peer (P2P) providers, based on a model that differs from traditional insurance. As a result, although P2P platforms do not change the fundamental basis of insurance, they do enable potentially more efficient business models to be established in terms of ensuring the coverage of risk. It is therefore relevant to determine whether P2P innovation can have substantial effects on the future of the insurance sector. For this purpose, it is considered necessary to examine P2P innovation from a business perspective, as well as to build a comparison between a traditional model and a P2P model from an actuarial perspective. Objectives: The objectives are (1) to represent P2P innovation in the business model compared to the traditional insurance model and (2) to establish a comparison between a traditional model and a P2P model from an actuarial perspective. Methodology: The research design is defined as action research, in terms of understanding and solving the problems of a collectivity linked to an environment, applying theory and best practices according to the approach. The study is carried out through the participatory variant, which involves the collaboration of the participants, who are considered experts in this design. Prolonged immersion in the field is carried out as the main instrument for data collection. Finally, an actuarial model for the calculation of premiums is developed, which allows projections of future scenarios to be established and conclusions to be drawn between the two models.
Main Contributions: From an actuarial and business perspective, we aim to contribute by developing a comparison of the two models in the coverage of risk in order to determine whether P2P innovation can have substantial effects on the future of the insurance sector.
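The actuarial comparison this study sets up can be illustrated with a deliberately simplified premium sketch: a traditional premium with a fixed expense loading versus a P2P pool that returns the unclaimed surplus to members after a platform fee. All figures, parameter names, and the loading/fee structure below are invented assumptions, not the paper's actual actuarial model.

```python
def traditional_premium(expected_claim, loading=0.30):
    """Loaded premium: expected claims plus a fixed expense/profit loading."""
    return expected_claim * (1.0 + loading)

def p2p_net_cost(contribution, total_claims, n_members, fee_rate=0.10):
    """Each member's net cost in a simple P2P pool: the contribution minus
    their share of whatever surplus is left after claims and the platform fee."""
    pool = contribution * n_members
    fee = fee_rate * pool
    surplus = max(0.0, pool - fee - total_claims)
    return contribution - surplus / n_members
```

For example, with 100 members each contributing 100, total claims of 5,000 and a 10% fee, each member's net cost is 60, below a 30%-loaded traditional premium of 130 on the same expected claim; how such comparisons behave across projected claim scenarios is the actuarial question the paper develops.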

Keywords: Insurtech, innovation, business model, P2P, insurance

Procedia PDF Downloads 77
8555 The Design of a Computer Simulator to Emulate Pathology Laboratories: A Model for Optimising Clinical Workflows

Authors: M. Patterson, R. Bond, K. Cowan, M. Mulvenna, C. Reid, F. McMahon, P. McGowan, H. Cormican

Abstract:

This paper outlines the design of a simulator to allow for the optimisation of clinical workflows through a pathology laboratory and to improve the laboratory’s efficiency in the processing, testing, and analysis of specimens. Often pathologists have difficulty in pinpointing and anticipating issues in the clinical workflow until tests are running late or in error. It can be difficult to pinpoint the cause and even more difficult to predict any issues which may arise. For example, they often have no indication of how many samples are going to be delivered to the laboratory that day or at a given hour. If we could model scenarios using past information and known variables, it would be possible for pathology laboratories to initiate resource preparations, e.g. the printing of specimen labels or to activate a sufficient number of technicians. This would expedite the clinical workload, clinical processes and improve the overall efficiency of the laboratory. The simulator design visualises the workflow of the laboratory, i.e. the clinical tests being ordered, the specimens arriving, current tests being performed, results being validated and reports being issued. The simulator depicts the movement of specimens through this process, as well as the number of specimens at each stage. This movement is visualised using an animated flow diagram that is updated in real time. A traffic light colour-coding system will be used to indicate the level of flow through each stage (green for normal flow, orange for slow flow, and red for critical flow). This would allow pathologists to clearly see where there are issues and bottlenecks in the process. Graphs would also be used to indicate the status of specimens at each stage of the process. For example, a graph could show the percentage of specimen tests that are on time, potentially late, running late and in error. 
Clicking on potentially late samples will display more detailed information about those samples, the tests that still need to be performed on them, and their urgency level. This would allow any issues to be resolved quickly. In the case of potentially late samples, this could help to ensure that critically needed results are delivered on time. The simulator will be created as a single-page web application. Various web technologies will be used to create the flow diagram showing the workflow of the laboratory. JavaScript will be used to program the logic, animate the movement of samples through each of the stages, and generate the status graphs in real time. This live information will be extracted from an Oracle database. As well as being used in a real laboratory situation, the simulator could also be used for training purposes. ‘Bots’ would be used to control the flow of specimens through each step of the process. Like existing software agent technologies, these bots would be configurable in order to simulate different situations which may arise in a laboratory, such as an emerging epidemic. The bots could then be turned on and off to allow trainees to complete the tasks required at each step of the process, for example validating test results.
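The traffic-light logic described above is easy to sketch. The simulator itself is to be implemented in JavaScript, but the classification rule is language-agnostic; the Python below uses hypothetical per-stage thresholds, which in a real laboratory would be tuned to historical throughput.

```python
def stage_status(samples_in_stage, normal_limit, critical_limit):
    """Classify a stage's flow: green (normal), orange (slow), red (critical)."""
    if samples_in_stage <= normal_limit:
        return "green"
    if samples_in_stage <= critical_limit:
        return "orange"
    return "red"

def workflow_dashboard(stage_counts, limits):
    """Map each stage name to its colour, as the animated flow diagram would.

    stage_counts: {stage: current number of specimens}
    limits: {stage: (normal_limit, critical_limit)}
    """
    return {stage: stage_status(n, *limits[stage])
            for stage, n in stage_counts.items()}
```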

Keywords: laboratory-process, optimization, pathology, computer simulation, workflow

Procedia PDF Downloads 274
8554 Human Factors Considerations in New Generation Fighter Planes to Enhance Combat Effectiveness

Authors: Chitra Rajagopal, Indra Deo Kumar, Ruchi Joshi, Binoy Bhargavan

Abstract:

The role of fighter planes in modern network-centric military warfare scenarios has changed significantly in the recent past. New generation fighter planes have the multirole capability of engaging both air and ground targets with high precision. A multirole aircraft undertakes missions such as air-to-air combat, air defense, air-to-surface roles (including air interdiction, close air support, maritime attack, and suppression and destruction of enemy air defenses), reconnaissance, electronic warfare missions, etc. Designers have primarily focused on the development of technologies to enhance the combat performance of fighter planes, and very little attention has been given to the human factor aspects of these technologies. Unique physical and psychological challenges are imposed on the pilots to meet operational requirements during these missions. Newly evolved technologies have enhanced aircraft performance in terms of speed, firepower, stealth, electronic warfare, situational awareness, and vulnerability reduction capabilities. This paper highlights the impact of emerging technologies on human factors for various military operations and missions. Technologies such as cooperative knowledge-based systems to aid the pilot's decision making in military conflict scenarios, as well as simulation technologies to enhance human performance, are also studied as part of the research work. Current and emerging pilot protection technologies and systems which form part of the integrated life support systems in new generation fighter planes are discussed. The application of system safety analysis to quantify human reliability in military operations is also studied.

Keywords: combat effectiveness, emerging technologies, human factors, systems safety analysis

Procedia PDF Downloads 132
8553 Impact of Heat Moisture Treatment on the Yield of Resistant Starch and Evaluation of Functional Properties of Modified Mung Bean (Vigna radiate) Starch

Authors: Sreejani Barua, P. P. Srivastav

Abstract:

Formulation of new functional food products for diabetic patients and obese people remains a challenge for food industries. Starch is a naturally occurring, biodegradable, inexpensive and abundantly available polysaccharide in plant material. In the present scenario, there is great interest in modifying starch functional properties without destroying its granular structure, using different modification techniques. Resistant starch (RS) contains almost zero calories and can control the blood glucose level to prevent diabetes. The current study focused on the modification of mung bean starch, which is a good source of legume carbohydrate, for the production of functional food. Heat moisture treatment (HMT) of mung bean starch was conducted at a moisture content of 10-30%, a temperature of 80-120 °C and a time of 8-24 h. The content of resistant starch after modification was significantly increased over that of the native starch, which contained 7.6% RS. The design combinations for HMT were generated through a Central Composite Rotatable Design (CCRD). The effects of the HMT process variables on the yield of resistant starch were studied through Response Surface Methodology (RSM). The highest increase in resistant starch, up to 34.39%, was found when the native starch was treated at 30% moisture content and 120 °C for 24 h. The functional properties of both native and modified mung bean starches showed that there was a reduction in the swelling power and swelling volume of the HMT starches. However, the solubility of the HMT starches was higher than that of the untreated native starch; changes were also observed in structural (scanning electron microscopy), X-ray diffraction (XRD) pattern, blue value and thermal (differential scanning calorimetry) properties. Therefore, replacing native mung bean starch with heat-moisture-treated mung bean starch leads to the development of new products with higher resistant starch levels and improved functional properties.
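The CCRD design combinations mentioned above follow a standard construction: 2^k factorial corner points, 2k axial points at ±α with α = (2^k)^(1/4) for rotatability, plus replicated centre runs. A sketch of the coded design points (which would then be mapped onto the 10-30% moisture, 80-120 °C and 8-24 h ranges) is shown below; the number of centre runs is an assumption.

```python
from itertools import product

def ccrd_points(k, n_center=6):
    """Coded points of a central composite rotatable design for k factors:
    2**k factorial points at +/-1, 2*k axial points at +/-alpha, and centre
    runs. Rotatability requires alpha = (2**k) ** 0.25."""
    alpha = (2 ** k) ** 0.25
    factorial = [list(p) for p in product((-1.0, 1.0), repeat=k)]
    axial = []
    for i in range(k):
        for s in (-alpha, alpha):
            pt = [0.0] * k
            pt[i] = s
            axial.append(pt)
    center = [[0.0] * k for _ in range(n_center)]
    return factorial + axial + center
```

For the three HMT factors (moisture, temperature, time), this yields 8 factorial, 6 axial and the chosen number of centre runs, with α ≈ 1.682.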

Keywords: Mung bean starch, heat moisture treatment, functional properties, resistant starch

Procedia PDF Downloads 191
8552 Optimization of Ultrasound Assisted Extraction of Polysaccharides from Plant Waste Materials: Selected Model Material is Hazelnut Skin

Authors: T. Yılmaz, Ş. Tavman

Abstract:

In this study, the optimization of ultrasound assisted extraction (UAE) of hemicellulose-based polysaccharides from plant waste material has been studied. The selected material is hazelnut skin. The extraction variables for the operation are extraction time, amplitude and application temperature. Optimum conditions have been evaluated with respect to responses such as the amount of wet crude polysaccharide, total carbohydrate content and dried sample. Pretreated hazelnut skin powders were used for the experiments. Samples of 10 g were suspended in 100 ml of water in a jacketed vessel with additional magnetic stirring. The mixture was sonicated with an immersed ultrasonic probe processor. After the extraction procedures, the ethanol-soluble and ethanol-insoluble fractions were separated for further examination. The obtained experimental data were analyzed by analysis of variance (ANOVA). Second-order polynomial models were developed using multiple regression analysis. The individual and interactive effects of the applied variables were evaluated using a Box-Behnken design. The models developed from the experimental design were predictive and fit the experimental data well, with high correlation coefficients (R² more than 0.95). The polysaccharides extracted from hazelnut skin are assumed to be pectic polysaccharides, based on comparison of the Fourier transform infrared (FTIR) analysis results with the literature. No further change was observed between the spectra of different sonication times. Application of UAE at optimized conditions has an important effect on the extraction of hemicellulose from plant material by promoting partial hydrolysis that breaks its bonds with other components of the plant cell wall. This effect can be attributed to the varied intensity of microjets and microstreaming at different sonication conditions.
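The goodness-of-fit criterion quoted above (R² more than 0.95) can be made concrete: the coefficient of determination compares the residual sum of squares of the fitted second-order polynomial against the total sum of squares of the responses. A minimal sketch, with invented data in the test, follows.

```python
def r_squared(observed, predicted):
    """Coefficient of determination R^2 = 1 - SS_res / SS_tot,
    comparing model predictions against experimental responses."""
    mean = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot
```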

Keywords: hazelnut skin, optimization, polysaccharide, ultrasound assisted extraction

Procedia PDF Downloads 318
8551 A 0-1 Goal Programming Approach to Optimize the Layout of Hospital Units: A Case Study in an Emergency Department in Seoul

Authors: Farhood Rismanchian, Seong Hyeon Park, Young Hoon Lee

Abstract:

This paper proposes a method to optimize the layout of an emergency department (ED) based on real executions of care processes, considering several planning objectives simultaneously. Recently, demand for healthcare services has increased dramatically. As the demand for healthcare services increases, so does the need for new healthcare buildings, as well as for redesigning and renovating existing ones. The importance of implementing a standard set of facilities planning and design techniques has already been proved in both manufacturing and service industries, with many significant functional efficiencies. However, the high complexity of care processes remains a major challenge to applying these methods in healthcare environments. Process mining techniques were applied in this study to tackle the problem of complexity and to enhance care process analysis. Process-related information, such as clinical pathways, was extracted from the information system of an ED. A 0-1 goal programming approach is then proposed to find a single layout that simultaneously satisfies several goals. The proposed model was solved by the optimization software CPLEX 12. The solution reached using the proposed method gives a 42.2% improvement in the walking distance of normal patients and a 47.6% improvement in the walking distance of critical patients, at a minimum cost of relocation. It has been observed that many patients must unnecessarily walk long distances during their visit to the emergency department because of an inefficient design. A carefully designed layout can significantly decrease patient walking distance and related complications.
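The 0-1 decision at the heart of the approach, assigning each ED unit to exactly one candidate location, can be illustrated on a toy instance. The brute-force sketch below minimizes a weighted sum of deviations above walking-distance goals for normal and critical patients; the real study solves a goal-programming formulation with CPLEX, and all numbers here are invented.

```python
from itertools import permutations

def best_layout(flow_normal, flow_critical, dist, goals, weights):
    """flow_*[i][j]: patient trips between units i and j;
    dist[a][b]: walking distance between candidate locations a and b;
    goals: (normal_goal, critical_goal) target total distances;
    weights: penalty per unit of distance above each goal.
    Enumerates all 0-1 assignments (as permutations: one unit per location)
    and returns the one with the smallest weighted goal deviation."""
    n = len(dist)
    best, best_pen = None, float("inf")
    for assign in permutations(range(n)):  # assign[i] = location of unit i
        walk_n = sum(flow_normal[i][j] * dist[assign[i]][assign[j]]
                     for i in range(n) for j in range(n))
        walk_c = sum(flow_critical[i][j] * dist[assign[i]][assign[j]]
                     for i in range(n) for j in range(n))
        pen = (weights[0] * max(0, walk_n - goals[0])
               + weights[1] * max(0, walk_c - goals[1]))
        if pen < best_pen:
            best, best_pen = assign, pen
    return best, best_pen
```

Enumeration only works for tiny instances; a solver such as CPLEX handles the realistic problem sizes the paper addresses.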

Keywords: healthcare operation management, goal programming, facility layout problem, process mining, clinical processes

Procedia PDF Downloads 273
8550 Effect of Volute Tongue Shape and Position on Performance of Turbo Machinery Compressor

Authors: Anuj Srivastava, Kuldeep Kumar

Abstract:

This paper proposes a numerical study of volute tongue design, which affects the centrifugal compressor operating range and pressure recovery. Increased efficiency has traditionally been the main goal of compressor design. However, an increased operating range has become important in an age of ever-increasing productivity and energy costs in the turbomachinery industry. Efficiency and overall operating range are the two most important parameters studied to evaluate the aerodynamic performance of a centrifugal compressor, and the volute is one of the components that has a significant effect on both. The choice of volute tongue geometry plays a major role in compressor performance and also affects the performance map. The authors evaluate the trade-off of using a pull-back tongue geometry on centrifugal compressor performance. In the present paper, three different tongue positions and shapes are discussed. These designs are compared in terms of pressure recovery coefficient, pressure loss coefficient, and stable operating range. The detailed flow structures for the various volute geometries and pull-back angles near the tongue are studied extensively to explore the fluid behavior. The viscous Navier-Stokes equations are used to simulate the flow inside the volute. The numerical calculations are compared with one-dimensional thermodynamic calculations. The authors conclude that, for the modified tongue shape and location, the increase in compression ratio is accompanied by a more uniform pressure distribution: a uniform static pressure around the circumference builds a more uniform flow in the impeller and diffuser. Also, the blockage at the tongue of the volute was causing a circumferentially non-uniform pressure along the volute. This non-uniformity may lead the impeller and diffuser to operate unstably; however, it is not the volute that directly controls the stall.

Keywords: centrifugal compressor volute, tongue geometry, pull-back, compressor performance, flow instability

Procedia PDF Downloads 150
8549 Value Index, a Novel Decision Making Approach for Waste Load Allocation

Authors: E. Feizi Ashtiani, S. Jamshidi, M.H Niksokhan, A. Feizi Ashtiani

Abstract:

Waste load allocation (WLA) policies may use multi-objective optimization methods to find the most appropriate and sustainable solutions. These usually aim to simultaneously minimize two criteria, total abatement cost (TC) and environmental violation (EV). If other criteria, such as inequity, need to be minimized as well, more binary optimizations through different scenarios must be introduced. In order to reduce the calculation steps, this study presents the value index as an innovative decision-making approach. Since the value index contains both the environmental violation and the treatment costs, it can be maximized simultaneously with the equity index. This implies that the definition of different scenarios for environmental violations is no longer required. Furthermore, the solution is not necessarily the point with minimum total costs or environmental violations. This idea is tested on the Haraz River in northern Iran. Here, the dissolved oxygen (DO) level of the river is simulated by the Streeter-Phelps equation in the MATLAB software. The WLA is determined for fish farms using multi-objective particle swarm optimization (MOPSO) in two scenarios. In the first, the trade-off curves of TC-EV and TC-inequity are plotted separately, as in the conventional approach. In the second, the value-equity curve is derived. The comparative results show that the solutions are in a similar range of inequity but with lower total costs, owing to the freedom in environmental violation attained with the value index. As a result, the conventional approach can well be replaced by the value index, particularly for problems optimizing these objectives. This shortens the process of reaching the best solutions and may yield a better classification for scenario definition. It is also concluded that decision makers would do better to focus on the value index and weight its components to find the most sustainable alternatives based on their requirements.
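The DO simulation underlying the study is the classical Streeter-Phelps oxygen-sag equation, D(t) = kd·L0/(ka − kd)·(e^(−kd·t) − e^(−ka·t)) + D0·e^(−ka·t), with DO(t) = DOsat − D(t). The paper runs it in MATLAB; an equivalent Python form is sketched below. The parameter values used in the test are illustrative, not the Haraz River calibration.

```python
import math

def do_deficit(t, L0, D0, kd, ka):
    """Streeter-Phelps oxygen deficit D(t) at travel time t (days).

    L0: initial BOD (mg/L), D0: initial deficit (mg/L),
    kd: deoxygenation rate (1/day), ka: reaeration rate (1/day), kd != ka.
    """
    return (kd * L0 / (ka - kd)) * (math.exp(-kd * t) - math.exp(-ka * t)) \
           + D0 * math.exp(-ka * t)

def dissolved_oxygen(t, do_sat, **params):
    """DO concentration along the river: saturation minus the deficit."""
    return do_sat - do_deficit(t, **params)
```

Evaluating `dissolved_oxygen` along the reach downstream of each fish farm gives the EV term that the value index combines with treatment costs.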

Keywords: waste load allocation (WLA), value index, multi objective particle swarm optimization (MOPSO), Haraz River, equity

Procedia PDF Downloads 408
8548 Low Enrollment in Civil Engineering Departments: Challenges and Opportunities

Authors: Alaa Yehia, Ayatollah Yehia, Sherif Yehia

Abstract:

There is a recurring issue of low enrollment across many civil engineering departments in postsecondary institutions. While there have been moments where enrollment begins to increase, civil engineering departments have found themselves facing low enrollment, at around 60%, over the last five years across the Middle East. There are many reasons that could be attributed to this decline, such as low entry-level salaries, over-saturation of civil engineering graduates in the job market, and a lack of construction projects due to an impending or current recession. However, this recurring problem alludes to an intrinsic issue with the curriculum. The societal shift to the usage of high technology such as machine learning (ML) and artificial intelligence (AI) demands individuals who are proficient at utilizing it. Therefore, existing curricula must adapt to this change in order to provide an education that is suitable for potential and current students. In order to provide potential solutions for this issue, this paper considers two possible implementations of high technology in the civil engineering curriculum. The first approach is to implement a course that introduces applications of high technology in civil engineering contexts, while the other approach is to intertwine applications of high technology throughout the degree. Both approaches, however, should meet the requirements of accreditation agencies. In addition to the proposed improvement in the civil engineering curriculum, different pedagogical practices must be adopted as well. The passive learning approach might not be appropriate for Gen Z students; current students, now more than ever, need to be introduced to engineering topics and practice through different learning methods to ensure they will have the necessary skills for the job market. Different learning methods that incorporate high-technology applications, like AI, must be integrated throughout the curriculum to make the civil engineering degree more attractive to prospective students. Moreover, the paper provides insight into the importance of, and an approach to, adapting the civil engineering curriculum to address the current low-enrollment crisis that civil engineering departments globally, but specifically in the Middle East, are facing.

Keywords: artificial intelligence (AI), civil engineering curriculum, high technology, low enrollment, pedagogy

Procedia PDF Downloads 145
8547 Effectiveness of Breathing Training Program on Quality of Life and Depression Among Hemodialysis Patients: Quasi‐Experimental Study

Authors: Hayfa Almutary, Noof Eid Al Shammari

Abstract:

Aim: The management of depression in patients undergoing hemodialysis remains challenging. The aim of this study was to evaluate the effectiveness of a breathing training program on quality of life and depression among patients on hemodialysis. Design: A one-group pretest-posttest quasi-experimental design was used. Methods: Data were collected from hemodialysis units at three dialysis centers. Initial baseline data were collected, and a breathing training program comprising three types of breathing exercises was implemented. The impact of the intervention on outcomes was measured in the same participants using both the Kidney Disease Quality of Life Short Version and the Beck Depression Inventory-Second Edition. The participants were asked to perform the breathing training program three times a day for 30 days. Results: The mean age of the patients was 52.1 (SD: 15.0), and nearly two-thirds of them were male (63.4%). Participants who had been undergoing hemodialysis for 1–4 years constituted the largest group in the sample (46.3%), and 17.1% of participants had visited a psychiatric clinic 1-3 times. The results show that the breathing training program improved overall quality of life and reduced symptoms and problems. In addition, a significant decrease in the overall depression score was observed after implementing the intervention. Conclusions: The breathing training program is a non-pharmacological intervention that has demonstrated effectiveness in hemodialysis patients. This study showed that using breathing exercises reduced depression levels and improved quality of life. Integrating this intervention into dialysis units to manage psychological issues would offer a simple, safe, easy, and inexpensive option. Future research should compare the effectiveness of various breathing exercises in hemodialysis patients using longitudinal studies.
Impact: As a safety precaution, nurses should initially use non-pharmacological interventions, such as a breathing training program, to treat depression in those undergoing hemodialysis.

Keywords: breathing training program, depression, exercise, quality of life, hemodialysis

Procedia PDF Downloads 71
8546 Review on the Role of Sustainability Techniques in Development of Green Building

Authors: Ubaid Ur Rahman, Waqar Younas, Sooraj Kumar Chhabira

Abstract:

Environmentally sustainable building construction has experienced significant growth during the past 10 years at the international level. This paper presents a conceptual framework that adopts sustainability techniques in construction to develop environmentally friendly buildings, called green buildings. Waste occurs during the different construction phases and causes environmental problems: the deposition of waste on the ground surface creates major nuisances such as bad smell, gives rise to different health diseases, and produces toxic agents which render soil infertile. Old recycled building material can be reused in the construction of new buildings. Sustainable construction is economical and saves energy sources. Sustainable construction is a major responsibility of the designer and the project manager: the designer has to fulfil the client's demands while keeping the design environmentally friendly, and the project manager has to deliver and execute sustainable construction according to the sustainable design. Steel is the most appropriate sustainable construction material. It is more durable and easily recyclable; it occupies less area and has higher tensile and compressive strength than concrete, making it a better option for sustainable construction compared to other building materials. New technology like the green roof has made the environment more pleasant and has reduced construction cost; it minimizes economic, social and environmental issues. This paper presents an overview of research related to the material use of green buildings, and based on this research, recommendations are made which can be followed in the construction industry. The paper includes a detailed analysis of construction materials. It is shown that, by making suitable adjustments to project management practices, a green building improves the cost efficiency of the project, makes it environmentally friendly, and also meets future generation demands.

Keywords: sustainable construction, green building, recycled waste material, environment

Procedia PDF Downloads 230
8545 Design of a Standard Weather Data Acquisition Device for the Federal University of Technology, Akure Nigeria

Authors: Isaac Kayode Ogunlade

Abstract:

Data acquisition (DAQ) is the process by which physical phenomena from the real world are transformed into electrical signals that are measured and converted into a digital format for processing, analysis, and storage by a computer. The DAQ is designed using a PIC18F4550 microcontroller, communicating with a personal computer (PC) through USB (Universal Serial Bus). The research deployed knowledge of data acquisition systems and embedded systems to develop a weather data acquisition device using an LM35 sensor to measure weather parameters, together with artificial intelligence (an artificial neural network, ANN) and a statistical approach (autoregressive integrated moving average, ARIMA) to predict precipitation (rainfall). The device was placed beside a standard device in the Department of Meteorology, Federal University of Technology, Akure (FUTA) for performance evaluation. Both devices (standard and designed) were subjected to 180 days under the same atmospheric conditions for data collection (temperature, relative humidity, and pressure). The acquired data were trained in the MATLAB R2012b environment using ANN and ARIMA to predict precipitation (rainfall). Root mean square error (RMSE), mean absolute error (MAE), coefficient of determination (R²), and mean percentage error (MPE) were employed as standard metrics to evaluate the models' performance in predicting precipitation. The results from the operation of the developed device show that it has an efficiency of 96% and is also compatible with personal computers and laptops. The simulation results for the acquired data show that the ANN model's precipitation (rainfall) prediction for two months (May and June 2017) revealed an error of 1.59%, while ARIMA's was 2.63%. The device will be useful in research, practical laboratories, and industrial environments.
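The evaluation metrics named above are standard and easy to state precisely. The Python sketch below computes RMSE, MAE and MPE for a pair of observed/predicted rainfall series; the study itself computes them in MATLAB, and the series in the test are invented.

```python
import math

def rmse(obs, pred):
    """Root mean square error between observed and predicted series."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def mae(obs, pred):
    """Mean absolute error."""
    return sum(abs(o - p) for o, p in zip(obs, pred)) / len(obs)

def mpe(obs, pred):
    """Mean percentage error; observations must be non-zero."""
    return 100.0 * sum((o - p) / o for o, p in zip(obs, pred)) / len(obs)
```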

Keywords: data acquisition system, design device, weather development, predict precipitation and (FUTA) standard device

Procedia PDF Downloads 76
8544 Molecular Design and Synthesis of Heterocycles Based Anticancer Agents

Authors: Amna J. Ghith, Khaled Abu Zid, Khairia Youssef, Nasser Saad

Abstract:

Backgrounds: Multikinase and vascular endothelial growth factor (VEGF) receptor inhibitors interrupt the pathway by which angiogenesis becomes established and propagated, resulting in the inadequate nourishment of metastatic disease. VEGFR-2 has been the principal target of anti-angiogenic therapies. We disclose new thienopyrimidines as inhibitors of VEGFR-2, designed by a molecular modeling approach for increased synergistic activity and decreased side effects. Purpose: 2-substituted thienopyrimidines are designed and synthesized with anticipated anticancer activity, based on an in silico molecular docking study that supports the initial pharmacophoric hypothesis, with the same binding mode of interaction at the ATP-binding site of VEGFR-2 (PDB 2QU5) and a high docking score. Methods: A series of compounds were designed using Discovery Studio 4.1/CDOCKER, with the rationale of mimicking the pharmacophoric features present in reported active compounds that target VEGFR-2. An in silico ADMET study was also performed to validate the bioavailability of the newly designed compounds. Results: The compounds to be synthesized showed interaction energies comparable to, or within the range of, the benzimidazole inhibitor ligand when docked with VEGFR-2. The ADMET study showed comparable results; most of the compounds showed absorption within the (95-99) zone, varying according to the different substituents attached to the thienopyrimidine ring system. Conclusions: A series of 2-substituted thienopyrimidines are to be synthesized with anticipated anticancer activity, following the docking-derived structural requirements for the design of VEGFR-2 inhibitors, which can act as powerful anticancer agents.

Keywords: docking, discovery studio 4.1/CDOCKER, heterocycles based anticancer agents, 2-substituted thienopyrimidines

Procedia PDF Downloads 227
8543 A Novel Epitope Prediction for Vaccine Designing against Ebola Viral Envelope Proteins

Authors: Manju Kanu, Subrata Sinha, Surabhi Johari

Abstract:

The Ebola viruses are among the best-studied viruses; however, no effective prevention against EBOV has been developed. Epitope-based vaccines provide a new strategy for prophylactic and therapeutic application of pathogen-specific immunity. A critical requirement of this strategy is the identification and selection of T-cell epitopes that act as vaccine targets. This study describes current methodologies for that selection process, with Ebola virus as a model system; a great challenge in the field of Ebola virus research is to design a universal vaccine. A combination of publicly available bioinformatics algorithms and computational tools was used to screen and select antigen sequences as potential T-cell epitopes for supertype Human Leukocyte Antigen (HLA) alleles. The MUSCLE and MOTIF tools were used to find the most conserved peptide sequences of the viral proteins. Immunoinformatics tools were used for the prediction of immunogenic peptides of viral proteins in Zaire strains of Ebola virus. Putative epitopes for the viral proteins (VP) were predicted from their conserved peptide sequences. Three tools, NetCTL 1.2, BIMAS and SYFPEITHI, were used to predict the Class I putative epitopes, while three tools, ProPred, IEDB-SMM-align and NetMHCII 2.2, were used to predict the Class II putative epitopes. B-cell epitopes were predicted by BCPREDS 1.0. Immunogenic peptides were identified and selected manually from the putative epitopes predicted by the online tools, individually for both MHC classes. Finally, the sequences of the predicted peptides for both MHC classes were examined for a common region, which was selected as the common immunogenic peptide. The immunogenic peptides found for the viral proteins of Ebola virus were the epitopes FLESGAVKY and SSLAKHGEY. These predicted peptides could be promising candidates to be used as targets for vaccine design.
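The final manual step described above, finding regions common to the Class I and Class II predictions, can be sketched as a set intersection over fixed-length windows of the predicted peptides. The peptide strings in the example are illustrative; the real candidates came from the listed prediction servers.

```python
def subpeptides(peptide, length):
    """All contiguous windows of the given length within a peptide."""
    return {peptide[i:i + length] for i in range(len(peptide) - length + 1)}

def common_regions(class1_peptides, class2_peptides, length=9):
    """Length-`length` windows occurring in at least one predicted peptide
    of each MHC class; candidates for a common immunogenic peptide."""
    c1 = set().union(*(subpeptides(p, length) for p in class1_peptides))
    c2 = set().union(*(subpeptides(p, length) for p in class2_peptides))
    return sorted(c1 & c2)
```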

Keywords: epitope, B cell, immunogenicity, Ebola

Procedia PDF Downloads 298
8542 An Approach in Design of Large-Scale Hydrogen Plants

Authors: Hamidreza Sahaleh

Abstract:

Because of the stringent requirements for low-sulfur products and heavier crude oil feedstocks, more hydrogen will be consumed in refineries. In particular, if large-scale capacities are the response to increased hydrogen demand, specific design and engineering experience is required, which will be described in this paper with an example. Selected process design requirements will be listed and described with reference to the flowsheet. In addition, a selection of innovative design features, such as process condensate reuse, safe reformer start-up, and the associated requirements, will be highlighted.

Keywords: low sulfur, raw oil, refineries, flowsheet

Procedia PDF Downloads 274
8541 Strategies for Public Space Utilization

Authors: Ben Levenger

Abstract:

Social life revolves around a central meeting place or gathering space. It is where the community integrates, earns social skills, and ultimately becomes part of the community. Following this premise, public spaces are among the most important spaces that downtowns offer, providing locations for people to be witnessed, heard, and, most importantly, seamlessly integrated into the downtown as part of the community. To facilitate this, these local spaces must be envisioned and designed to meet the changing needs of a downtown, offering a space and purpose for everyone. This paper will dive deep into analyzing, designing, and implementing public space design for small plazas or gathering spaces. These spaces often require a detailed level of study, followed by a broad stroke of design implementation, allowing for adaptability. This paper will highlight how to assess needs, define the types of spaces needed, outline a program for spaces, detail elements of design to meet the needs, assess the new space, and plan for change. This study will provide participants with the necessary framework for conducting a grass-roots-level assessment of public space and programming, including short-term and long-term improvements. Participants will also receive assessment tools, sheets, and visual representation diagrams. Urbanism for its own sake is an exercise in aesthetic beauty; an economic improvement or benefit must be attained to justify the purpose of these efforts and the infrastructure or construction costs. To ground this work in quantitative terms, we will examine case studies highlighting economic impacts.
These case studies will highlight the financial impact on an area, measuring the following metrics: rental rates (per square meter), tax revenue generation (sales and property), foot traffic generation, increased property valuations, currency expenditure by tenure, clustered development improvements, and the cost/valuation benefits of increased housing density. The economic impact results will be grouped by community size in three tiers: under 10,000 in population, 10,001 to 75,000, and over 75,000. Through this breakdown, participants can gauge the impact in communities similar to those they work in or are responsible for. Finally, a detailed analysis of specific urbanism enhancements, such as plazas, on-street dining, and pedestrian malls, will be presented, with metrics documenting the economic impact of each enhancement to aid in prioritizing improvements for each community. All materials, documents, and information will be available to participants via Google Drive; they are welcome to download the data and use it for their own purposes.

Keywords: downtown, economic development, planning, strategic

Procedia PDF Downloads 60