Search results for: temporary architecture
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2007

777 Determinants of Long Acting Reversible Contraception Utilization among Women (15-49) in Uganda: Analysis of 2016 PMA2020 Uganda Survey

Authors: Nulu Nanono

Abstract:

Background: The Ugandan national health policy and the national population policy both recognize the need to increase access to quality, affordable, acceptable and sustainable contraceptive services for all people, but provision and utilization of quality services remain low. Two contraceptive methods are categorized as long-acting temporary methods: intrauterine contraceptive devices (IUCDs) and implants. Copper-containing IUCDs, generally available in Ministry of Health (MoH) family planning programs, are effective for at least 12 years, while implants, depending on the type, last for three to seven years. Uganda’s current policy and political environment is favorable towards achieving national access to quality and safe contraceptives for all people, as evidenced by increasing government commitments and innovative family planning programs. Despite the increase in modern contraception use from 14% to 26%, long acting reversible contraceptive (LARC) utilization has remained relatively low, with less than 5% of women using IUDs and implants, which partly explains Uganda’s persistently high fertility rates. Main question/hypothesis: The purpose of the study was to examine the relationship between the demographic and socio-economic characteristics of women, health facility factors, and long acting reversible contraception utilization. Methodology: LARC utilization was measured with two questions: are you or your partner currently doing something or using any method to delay or avoid getting pregnant? And which method or methods are you using? Data for the study were sourced from the 2016 Uganda Performance Monitoring and Accountability 2020 Survey, comprising 3,816 female respondents aged 15 to 49 years. The analysis was done using chi-squared tests at the bivariate level and probit regression at the multivariate level. The model was further tested for validity and normality of the residuals using the Shapiro-Wilk test and tests for kurtosis and skewness. Results: The results showed age, parity, marital status, region, knowledge of LARCs, and availability of LARCs to be significantly associated with long acting reversible contraceptive utilization (p < 0.05). At the multivariate level, higher parity (p = 0.000), tertiary education (p = 0.013), and no knowledge about LARCs (p = 0.006) increased the probability of using LARCs, while being aged 45-49 or living in the eastern region reduced it. Knowledge contribution: The findings of this study join the debate in prior research in this field and add to the body of knowledge on long acting reversible contraception. An unexpected finding is the non-utilization of LARCs by women who are aware of and knowledgeable about them; this may be an opportunity for further research into its causes.
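A minimal Python sketch of the analysis pipeline described above (bivariate chi-squared screening, a multivariate probit model, and a Shapiro-Wilk residual check), assuming statsmodels and scipy; the variable names and simulated data are hypothetical placeholders, not the actual PMA2020 variables.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy.stats import chi2_contingency, shapiro

# Illustrative dataframe: column names are hypothetical, not the PMA2020 codebook.
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "uses_larc": rng.binomial(1, 0.05, 500),    # 1 = currently uses an IUD/implant
    "age_group": rng.integers(1, 8, 500),       # 15-19 ... 45-49
    "parity": rng.poisson(3, 500),
    "knows_larc": rng.binomial(1, 0.6, 500),
})

# Bivariate screening with a chi-squared test of independence
table = pd.crosstab(df["age_group"], df["uses_larc"])
chi2, p, dof, _ = chi2_contingency(table)
print(f"age_group vs LARC use: chi2={chi2:.2f}, p={p:.3f}")

# Multivariate probit model of LARC utilization
X = sm.add_constant(df[["age_group", "parity", "knows_larc"]])
probit = sm.Probit(df["uses_larc"], X).fit(disp=False)
print(probit.summary())

# Residual normality check (Shapiro-Wilk), as described in the abstract
w, p_norm = shapiro(probit.resid_generalized)
print(f"Shapiro-Wilk on residuals: W={w:.3f}, p={p_norm:.3f}")
```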

Keywords: contraception, long acting, utilization, women (15-49)

Procedia PDF Downloads 196
776 Finding the Association Rule between Nursing Interventions and Early Evaluation Results of In-Hospital Cardiac Arrest to Improve Patient Safety

Authors: Wei-Chih Huang, Pei-Lung Chung, Ching-Heng Lin, Hsuan-Chia Yang, Der-Ming Liou

Abstract:

Background: In-Hospital Cardiac Arrest (IHCA) threatens the lives of inpatients and seriously affects patient safety, the quality of inpatient care, and hospital services. Health providers must identify the signs of IHCA early to prevent its occurrence. This study considers the potential association between early signs of IHCA and the patient care provided by nurses and other professionals before an IHCA occurs. The aim of this study is to identify significant associations between nursing interventions and abnormal early evaluation results of IHCA that can assist health care providers in monitoring inpatients at risk of IHCA, to increase opportunities for early detection and prevention of IHCA. Materials and Methods: This study used one of the data mining techniques, association rule mining, to compute associations between nursing interventions and abnormal early evaluation results of IHCA. A nursing intervention and an abnormal early evaluation result of IHCA were considered co-occurring if the intervention was provided within 24 hours of the abnormal result last being observed. The rule-based method was applied to 23.6 million electronic medical records (EMR) from a medical center in Taipei, Taiwan. This dataset includes 733 nursing intervention concepts coded with Clinical Care Classification (CCC) codes and 13 early evaluation results of IHCA with binary codes. The values of interestingness and lift were computed as Q values to measure the strength of co-occurrence and association between all in-hospital patient care measures and abnormal early evaluation results of IHCA. The associations were evaluated by comparing the Q values and verified by medical experts. Results and Conclusions: The results show 4,195 pairs of associations between nursing interventions and abnormal early evaluation results of IHCA with their Q values. Positive associations were indicated for 203 pairs with Q values greater than 5. Inpatients with a high blood sugar level (hyperglycemia) have a positive association with a heart rate lower than 50 or higher than 120 beats per minute (Q = 6.636). Inpatients with a temporary pacemaker (TPM) have a significant association with a high risk of IHCA (Q = 47.403). There is a significant positive correlation between hypovolemia and abnormal heart rhythms (arrhythmias) (Q = 127.49). The results of this study can help prevent IHCA by enabling health care providers to recognize at-risk inpatients early, assist with patient monitoring to provide quality care, and improve IHCA surveillance and the quality of in-hospital care.
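A minimal sketch of the lift-style association measure described above, computed from hypothetical binary co-occurrence records; the column names and counts are illustrative, not the Taipei EMR data.

```python
import pandas as pd

# Hypothetical binary co-occurrence records: one row per patient-day, with flags
# for a CCC-coded intervention and an abnormal early evaluation result of IHCA.
records = pd.DataFrame({
    "tpm_care": [1, 1, 0, 0, 1, 0, 0, 1, 0, 0],       # temporary pacemaker care provided
    "high_ihca_risk": [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]  # abnormal early evaluation result
})

p_intervention = records["tpm_care"].mean()
p_sign = records["high_ihca_risk"].mean()
p_joint = ((records["tpm_care"] == 1) & (records["high_ihca_risk"] == 1)).mean()

# Lift > 1 indicates a positive association; the study reports lift-style Q values.
support = p_joint
confidence = p_joint / p_intervention
lift = p_joint / (p_intervention * p_sign)
print(f"support={support:.2f}, confidence={confidence:.2f}, lift={lift:.2f}")
```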

Keywords: in-hospital cardiac arrest, patient safety, nursing intervention, association rule mining

Procedia PDF Downloads 267
775 Margin-Based Feed-Forward Neural Network Classifiers

Authors: Xiaohan Bookman, Xiaoyan Zhu

Abstract:

The Margin-Based Principle has been proposed for a long time, and it has been proved that this principle can reduce structural risk and improve performance in both theoretical and practical respects. Meanwhile, the feed-forward neural network is a traditional classifier that is currently very popular with deeper architectures. However, the training algorithm of the feed-forward neural network is derived from the Widrow-Hoff principle, that is, minimizing the squared error. In this paper, we propose a new training algorithm for feed-forward neural networks based on the Margin-Based Principle, which can effectively improve the accuracy and generalization ability of neural network classifiers with fewer labeled samples and flexible networks. We have conducted experiments on four UCI open data sets and achieved good results as expected. In conclusion, our model can handle sparser labels and higher-dimensional data sets with high accuracy, while migrating from the old ANN method to our method is easy and requires almost no extra work.
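A minimal sketch of the idea: replacing the squared-error (Widrow-Hoff) objective with a margin-based hinge loss when training a small feed-forward network. The PyTorch architecture and toy data below are illustrative assumptions, not the authors' exact model.

```python
import torch
import torch.nn as nn

# Toy binary classification data with labels in {-1, +1}
X = torch.randn(200, 10)
y = torch.sign(X[:, 0] + 0.5 * X[:, 1]).unsqueeze(1)

# Small feed-forward network with one hidden layer
model = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 1))
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)

for epoch in range(100):
    scores = model(X)
    # Margin-based (hinge) loss: penalize examples with margin y*f(x) < 1,
    # in contrast to the squared-error objective of classical backpropagation.
    loss = torch.clamp(1.0 - y * scores, min=0.0).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

accuracy = (model(X).sign() == y).float().mean().item()
print(f"training accuracy: {accuracy:.2f}")
```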

Keywords: Max-Margin Principle, Feed-Forward Neural Network, classifier, structural risk

Procedia PDF Downloads 338
774 Installing Cloud Computing Model for E-Businesses in Small Organizations

Authors: Khader Titi

Abstract:

Information technology developments have changed the way businesses work. Organizations are required to become visible online and stay connected to take advantage of cost reductions and improved operation of existing resources. The acceptance and the application areas of cloud computing have increased significantly since it was presented by Google in 2007. Cloud computing has attracted the attention of IT enterprises, especially e-business enterprises. At present, environmental costs are a major issue when enterprises adopt e-business, but with the advent of cloud computing most of this problem will be solved. Organizations around the world are facing continued budget challenges and increases in the size of their computational data, so they need to find a way to deliver their services to clients as economically as possible without compromising the achievement of the anticipated outcomes. E-business companies need to provide better services to satisfy their clients. In this research, the researcher proposes a paradigm that uses and deploys a cloud computing environment for e-business in small enterprises. Cloud computing might be a suitable model for implementing e-business and e-commerce architecture to improve efficiency and user satisfaction.

Keywords: E-commerce, cloud computing, B2C, SaaS

Procedia PDF Downloads 313
773 Social Mobility and Urbanization: Case Study of Well-Educated Urban Migrant's Life Experience in the Era of China's New Urbanization Project

Authors: Xu Heng

Abstract:

Since the financial crisis of 2008 and the resulting Great Recession, the number of China’s unemployed college graduates reached over 500 thousand in 2011. Given this severe graduate employment situation, there has been growing public concern about college graduates, especially those from less-privileged backgrounds, and their working and living conditions in metropolises. Previous studies indicate that well-educated urban migrants from less-privileged backgrounds tend to obtain temporary occupations with lower income and lower social status; such vulnerable young migrants are described as the ‘Ant Tribe’ by some scholars. However, since the implementation of the new urbanization project, together with the relaxation of the Hukou system and the acceleration of socio-economic development in middle and small cities, some researchers have described well-educated urban migrants’ situation and their prospects of upward social mobility in urban areas in an overly optimistic light. In order to shed more light on the underlying tensions encountered by China’s well-educated urban migrants in their pursuit of upward social mobility, this research focuses on 10 well-educated urban migrants’ life trajectories between their university-to-work transition and their current situation. All selected well-educated urban migrants are young adults with rural backgrounds who have received a higher education qualification from first-tier universities in Wuhan City (capital of Hubei Province). Drawing on in-depth interviews with the 10 participants and inspired by Lahire’s theory of the plural actor, this study yields the following preliminary findings: 1) For those migrants who move to super-mega cities (i.e., Beijing, Shenzhen, Guangzhou) or stay in Wuhan after college graduation, their lack of economic and social capital is the structural factor that negatively influences their living conditions and further shapes their plans for career development. The incompatibility between the sub-fields of urban life and the dispositions generated by their early socialization is the main cause of their marginalized position in the metropolises. 2) For those migrants who move back to middle or small cities in their hometown regions, the inconsistency between the dispositions generated by college life and the organizational habitus of the workplace is the main cause of their sense of being a ‘fish out of water’, even though they have obtained stable positions in local government or state-owned enterprises. On the whole, this research illuminates how underlying structural forces shape well-educated urban migrants’ life trajectories and hinder their upward social mobility in the context of the new urbanization project.

Keywords: life trajectory, social mobility, urbanization, well-educated urban migrant

Procedia PDF Downloads 209
772 Distributional and Developmental Analysis of PM2.5 in Beijing, China

Authors: Alexander K. Guo

Abstract:

PM2.5 poses a major threat to people’s health and the environment and is an issue of great concern in Beijing, brought to the government’s attention by the media. In addition, both the United States Embassy in Beijing and the government of China have increased monitoring of PM2.5 in recent years and have made real-time data available to the public. This report utilizes hourly historical data (2008-2016) from the U.S. Embassy in Beijing for the first time. The first objective was to fit probability distributions to the data to better predict the number of days exceeding the standard, and the second was to uncover any yearly, seasonal, monthly, daily, and hourly patterns and trends, to better understand and inform air pollution control policy. In this dataset, 66,650 hours and 2,687 days provided valid data. Lognormal, gamma, and Weibull distributions were fit to the data through estimation of their parameters. The chi-squared test was employed to compare the actual data with the fitted distributions. The data were used to uncover trends, patterns, and improvements in PM2.5 concentration over the period with valid data, in addition to specific periods that received large amounts of media attention, which were analyzed to gain a better understanding of the causes of air pollution. The data clearly indicate that Beijing’s air quality is unhealthy, with an average of 94.07 µg/m³ across all 66,650 hours with valid data. It was found that no distribution fit the entire dataset of all 2,687 days well, but each of the three distribution types above was optimal in at least one of the yearly data sets, with the lognormal distribution found to fit recent years better. An improvement in air quality beginning in 2014 was discovered, with the first five months of 2016 reporting an average PM2.5 concentration 23.8% lower than the average of the same period across all years, perhaps the result of various new pollution-control policies. It was also found that the winter and fall months contained more days in both the good and the extremely polluted categories, leading to a higher average but a comparable median in these months. Additionally, the evening hours, especially in winter, reported much higher PM2.5 concentrations than the afternoon hours, possibly due to the prohibition of trucks in the city in the daytime and the increased use of coal for heating in the colder months when residents are home in the evening. Lastly, through analysis of special intervals that attracted media attention for either unusually good or bad air quality, the government’s temporary pollution control measures, such as more intensive road-space rationing and factory closures, are shown to be effective. In summary, air quality in Beijing is improving steadily and does follow standard probability distributions to an extent, but it still needs improvement. The analysis will be updated when new data become available.
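A minimal sketch of the distribution-fitting step described above, assuming an array of hourly PM2.5 concentrations; it fits lognormal, gamma, and Weibull distributions with scipy and compares them via a chi-squared goodness-of-fit statistic. The simulated series is a stand-in for the Embassy data.

```python
import numpy as np
from scipy import stats

# Hypothetical hourly PM2.5 concentrations (µg/m³); the real study uses
# the 2008-2016 U.S. Embassy hourly series.
pm25 = np.random.lognormal(mean=4.0, sigma=0.8, size=5000)

candidates = {
    "lognormal": stats.lognorm,
    "gamma": stats.gamma,
    "weibull": stats.weibull_min,
}

bins = np.linspace(pm25.min(), pm25.max(), 31)
observed, _ = np.histogram(pm25, bins=bins)

for name, dist in candidates.items():
    params = dist.fit(pm25, floc=0)             # estimate parameters, location fixed at 0
    cdf_vals = dist.cdf(bins, *params)
    expected = np.diff(cdf_vals) * len(pm25)    # expected counts per bin
    mask = expected > 5                          # standard chi-squared validity rule
    chi2 = np.sum((observed[mask] - expected[mask]) ** 2 / expected[mask])
    print(f"{name}: chi-squared = {chi2:.1f}")
```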

Keywords: Beijing, distribution, patterns, pm2.5, trends

Procedia PDF Downloads 242
771 Estimating Gait Parameter from Digital RGB Camera Using Real Time AlphaPose Learning Architecture

Authors: Murad Almadani, Khalil Abu-Hantash, Xinyu Wang, Herbert Jelinek, Kinda Khalaf

Abstract:

Gait analysis is used by healthcare professionals as a tool to gain a better understanding of movement impairment and to track progress. In many circumstances, monitoring patients in their real-life environments with low-cost equipment such as cameras and wearable sensors is more important. Inertial sensors, on the other hand, cannot provide enough information on angular dynamics. This research offers a method for tracking 2D joint coordinates using cutting-edge vision algorithms and a single RGB camera. We provide an end-to-end, comprehensive deep learning pipeline for marker-less gait parameter estimation, which, to our knowledge, has not been done before. To make our pipeline function in real time for real-world applications, we leverage the AlphaPose human pose estimation model and a deep learning transformer. We tested our approach on the well-known GPJATK dataset, achieving promising results.
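A minimal sketch of extracting one gait parameter (cadence) from per-frame 2D ankle keypoints such as those produced by a pose estimator; the simulated signal, frame rate, and peak-detection settings are illustrative assumptions, not the authors' transformer pipeline.

```python
import numpy as np
from scipy.signal import find_peaks

# Hypothetical per-frame 2D ankle x-coordinates from a pose estimator
# (e.g., AlphaPose output); fps and coordinates are illustrative.
fps = 30
t = np.arange(0, 10, 1 / fps)
left_ankle_x = 50 * np.sin(2 * np.pi * 0.9 * t) + np.random.randn(len(t))
right_ankle_x = 50 * np.sin(2 * np.pi * 0.9 * t + np.pi) + np.random.randn(len(t))

# The inter-ankle horizontal distance peaks roughly once per step
separation = np.abs(left_ankle_x - right_ankle_x)
peaks, _ = find_peaks(separation, distance=fps * 0.3)

step_count = len(peaks)
duration_s = len(t) / fps
cadence = 60 * step_count / duration_s   # steps per minute
print(f"estimated cadence: {cadence:.1f} steps/min")
```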

Keywords: gait analysis, human pose estimation, deep learning, real time gait estimation, AlphaPose, transformer

Procedia PDF Downloads 113
770 Impact of Climatic Hazards on the Jamuna River Fisheries and Coping and Adaptation Strategies

Authors: Farah Islam, Md. Monirul Islam, Mosammat Salma Akter, Goutam Kumar Kundu

Abstract:

The continuous variability of climate and the risk associated with it have a significant impact on fisheries, a global concern for about half a billion fishery-based livelihoods. Though in the context of Bangladesh there is mounting evidence of the impacts of climate change on fishery-based livelihoods and their socioeconomic conditions, the country’s inland fisheries sector remains neglected compared to the coastal areas, which receive most attention due to their higher vulnerability to climatic hazards. The available research on inland fisheries, particularly river fisheries, has focussed mainly on fish production, pollution, fishing gear, fish biodiversity and the livelihoods of the fishers. This study assesses the impacts of climate variability and change on the fishing communities of the Jamuna River (a transboundary river called the Brahmaputra in India) and their coping and adaptation strategies. This study used primary data collected from the Kalitola Ghat and Debdanga fishing communities of the Jamuna River during May, August and December 2015 using semi-structured interviews, oral history interviews, key informant interviews, focus group discussions and an impact matrix, as well as secondary data. This study found that both communities are exposed to storms, floods and land erosion, which affect fishery-based livelihood assets, strategies and outcomes. The impact matrix shows that human and physical capital are more affected by climate hazards, which in turn affects financial capital. Both communities have been responding to these exposures through multiple coping and adaptation strategies. The coping strategies include building earthen dams, laying jute sacks in the yard, taking shelter on boats or embankments, building raised platforms or ‘Kheua’, and taking temporary jobs, while adaptation strategies include permanent migration, changing livelihood activities and strategies, changing fishing practices and building more robust houses. The study shows that migration is the most common adaptation strategy for the fishers, and it resulted in mostly positive outcomes for the migrants. However, this migration has impacted negatively on the livelihoods of the remaining fishers in the communities. In sum, the Jamuna River fishing communities have been impacted by several climatic hazards, and they have traditionally coped with or adapted to the impacts, but these responses are not sufficient to maintain sustainable livelihoods and fisheries. In the coming decades, this situation may worsen, as predicted by the latest scientific research, and an enhanced level of response will be needed.

Keywords: climatic hazards, impacts and adaptation, fisherfolk, the Jamuna River

Procedia PDF Downloads 310
769 Provenance in Scholarly Publications: Introducing the provCite Ontology

Authors: Maria Joseph Israel, Ahmed Amer

Abstract:

Our work aims to broaden the application of provenance technology beyond its traditional domains of scientific workflow management and database systems by offering a general provenance framework to capture richer and extensible metadata in unstructured textual data sources such as literary texts, commentaries, translations, and digital humanities. Specifically, we demonstrate the feasibility of capturing and representing expressive provenance metadata, including more of the context for citing scholarly works (e.g., the authors’ explicit or inferred intentions at the time of developing their research content for publication), while also supporting subsequent augmentation with similar additional metadata (by third parties, be they human or automated). To better capture the nature and types of possible citations, in our proposed provenance scheme, metaScribe, we extend standard provenance conceptual models to form the provCite ontology. This provides a conceptual framework which can accurately capture and describe more of the functional and rhetorical properties of a citation than can be achieved with any current model.

Keywords: knowledge representation, provenance architecture, ontology, metadata, bibliographic citation, semantic web annotation

Procedia PDF Downloads 112
768 Planning Fore Stress II: Study on Resiliency of New Architectural Patterns in Urban Scale

Authors: Amir Shouri, Fereshteh Tabe

Abstract:

Thoughtful, sequential design strategies for master planning and urban infrastructure will play a major role in reducing the damage cities suffer from natural disasters, war, and social or population-related conflicts. Defensive strategies have been revised throughout the history of mankind after damage from natural disasters, wars, and terrorist attacks on cities; lessons have been learnt from earthquakes, from the casualties of the two world wars in the 20th century, and from terrorist activities of all times. In particular, after Hurricane Sandy in New York in 2012 and the September 11th attack on New York’s World Trade Centre (WTC), there have been a series of serious collaborations between lawmaking authorities, urban planners, architects, and defence-related organizations to, firstly, prepare for and/or prevent such events and, secondly, reduce human loss and economic damage to a minimum. This study develops a planning model for New York City in which citizens experience minimal impacts during times of threat and the city suffers minimal economic damage after the stress has passed. The main discussion in this proposal focuses on pre-hazard, hazard-time and post-hazard transformative policies and strategies that will reduce life casualties and ease economic recovery in post-hazard conditions. This proposal argues that one of the key solutions might be to focus on overlaying the architectural platforms of three fundamental infrastructures, transportation, power-related resources and defensive capabilities, within a dynamic, transformative framework that provides maximum safety, a high level of flexibility and the fastest action-reaction opportunities in stressful periods. “Planning Fore Stress” is carried out within an analytical, qualitative and quantitative framework, studying cases from all over the world. Technology, organic design, materiality, urban forms, city politics and sustainability are discussed across different cases at an international scale. Cases ranging from Copenhagen’s modern strategies for living in harmony with nature to the traditional approaches of old Indonesian urban planning patterns, from Israel’s “Iron Dome” to the tunnels of Gaza, from Iran’s ultra-high-performance quartz-infused concrete to the peaceful and nature-friendly strategies of Switzerland, and from “urban geopolitics” in cities, war and terrorism to the “design of sustainable cities” worldwide, are all studied with references and a detailed analysis of each case, in order to propose the most resourceful, practical and realistic solutions to questions on “new city divisions”, “new city planning and social activities” and “new strategic architecture for safe cities”. This study is a developed version of a proposal that was announced as a winner at MoMA in 2013 in the call for ideas for Rockaway after Hurricane Sandy.

Keywords: urban scale, city safety, natural disaster, war and terrorism, city divisions, architecture for safe cities

Procedia PDF Downloads 477
767 Test and Evaluation of Patient Tracking Platform in an Earthquake Simulation

Authors: Nahid Tavakoli, Mohammad H. Yarmohammadian, Ali Samimi

Abstract:

In an earthquake situation, medical response communities such as field and referral hospitals are challenged with identifying and tracking injured victims. In our project, a patient tracking platform (PTP) was developed in which first responders triage patients with an electronic tag that reports the location and some information about each patient during his/her movement. This platform includes: 1) near field communication (NFC) tags (ISO 14443), 2) smart mobile phones (Android-based, version 4.2.2), 3) base station laptops (Windows), 4) server software, 5) Android software used by first responders, 6) disaster command software, and 7) the system architecture. Our model was completed through a literature review, the Delphi technique, focus groups, design of the platform, and implementation in an earthquake exercise. This paper presents considerations for the content, functions, and technologies that must be applied for patient tracking in medical emergency situations. The robustness of the patient tracking platform (PTP) is demonstrated by tracking 6 patients in a simulated earthquake situation in the yard of the relief and rescue department of Isfahan’s Red Crescent.

Keywords: test and evaluation, patient tracking platform, earthquake, simulation

Procedia PDF Downloads 136
766 A Survey on Traditional MAC Layer Protocols in Cognitive Wireless Mesh Networks

Authors: Anusha M., V. Srikanth

Abstract:

The need to maximize spectrum usage and the numerous applications of wireless communication networks have led to a high interest in the available spectrum. Cognitive radios control their receiver and transmitter features precisely so that they can utilize vacant licensed spectrum without impacting the functionality of the primary licensed users. The use of multiple channels helps to address interference and thereby improves overall network efficiency. The MAC protocol in a cognitive radio network governs spectrum usage by coordinating multiple channels among the users. In this paper, we study the architecture of cognitive wireless mesh networks and traditional TDMA-based MAC methods for allocating channels dynamically. The majority of the MAC protocols proposed in the literature operate on a Common Control Channel (CCC) to handle services between cognitive radio secondary users. An extensive study of multi-channel, multi-radio channel allotment and continuously synchronized TDMA scheduling is presented in a summarized way.

Keywords: TDMA, MAC, multi-channel, multi-radio, WMN’S, cognitive radios

Procedia PDF Downloads 552
765 Modelling of Reactive Methodologies in Auto-Scaling Time-Sensitive Services With a MAPE-K Architecture

Authors: Óscar Muñoz Garrigós, José Manuel Bernabeu Aubán

Abstract:

Time-sensitive services are the base of the cloud services industry. Keeping service saturation low is essential for controlling response time. All auto-scalable services make use of reactive auto-scaling; however, there are few in-depth studies of reactive auto-scaling. This presentation shows a model for reactive auto-scaling methodologies with a MAPE-K architecture. Queuing theory can compute different properties of static services but lacks some parameters related to the transition between models. Our model uses queuing theory parameters to relate the transitions between models. It associates MAPE-K-related times, the sampling frequency, the cooldown period, the number of requests that an instance can handle per unit of time, the number of incoming requests at a time instant, and a function that describes the acceleration in the service's ability to handle more requests. This model is later used as a solution to horizontally auto-scale time-sensitive services composed of microservices, re-evaluating the model’s parameters periodically to allocate resources. The solution requires limiting the acceleration of the growth in the number of incoming requests to keep a constrained response time; business benefits determine such limits. The solution can add a dynamic number of instances and remains valid under different system sizes. The study includes performance recommendations to improve results according to the incoming load shape and business benefits. The exposed methodology is tested in a simulation. The simulator contains a load generator and a service composed of two microservices, where the frontend microservice depends on a backend microservice with a 1:1 request relation ratio. A common request takes 2.3 seconds to be computed by the service and is discarded if it takes more than 7 seconds. Both microservices contain a load balancer that assigns requests to the least loaded instance and preemptively discards requests if they cannot be finished in time, to prevent resource saturation. When load decreases, instances with lower load are kept in a backlog where no more requests are assigned. If the load grows and an instance in the backlog is required, it returns to the running state, but if it finishes the computation of all its requests and is no longer required, it is permanently deallocated. A few load patterns are required to represent the worst-case scenarios for reactive systems; the following scenarios test response times, resource consumption and business costs. The first scenario is a burst-load scenario: all methodologies will discard requests if the rapidness of the burst is high enough, and this scenario focuses on the number of discarded requests and the variance of the response time. The second scenario contains sudden load drops followed by bursts, to observe how the methodology behaves when releasing resources that are later required. The third scenario contains diverse growth accelerations in the number of incoming requests, to observe how approaches that add a different number of instances can handle the load with less business cost. The exposed methodology is compared against a multiple-threshold CPU methodology allocating/deallocating 10 or 20 instances, outperforming the competitor in all studied metrics.
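A minimal sketch of a reactive scaling step of the kind described above, assuming a per-instance capacity derived from the 2.3 s service time, a simple headroom rule, and a cooldown that defers scale-downs; the function names and thresholds are illustrative assumptions, not the authors' simulator.

```python
import math

def target_instances(incoming_rps, capacity_rps_per_instance, headroom=0.8):
    """Instances needed to keep saturation below `headroom` of capacity."""
    return max(1, math.ceil(incoming_rps / (capacity_rps_per_instance * headroom)))

def reactive_step(current, incoming_rps, capacity, cooldown_left):
    """One MAPE-K style Analyze/Plan step at a sampling instant."""
    desired = target_instances(incoming_rps, capacity)
    if cooldown_left > 0 and desired < current:
        # During cooldown, avoid releasing instances that may soon be needed again
        return current
    return desired

# Example: a 2.3 s service time per request per instance -> ~0.43 req/s capacity
capacity = 1 / 2.3
current = 4
for incoming in [1.0, 2.5, 4.0, 1.5]:     # sampled incoming request rates (req/s)
    current = reactive_step(current, incoming, capacity, cooldown_left=0)
    print(f"incoming={incoming:.1f} req/s -> instances={current}")
```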

Keywords: reactive auto-scaling, auto-scaling, microservices, cloud computing

Procedia PDF Downloads 88
764 A Critical Evaluation of Building Information Modelling in New Zealand: Deepening Our Understanding of the Benefits and Drawbacks

Authors: Garry Miller, Thomas Alexander, Cameron Lee

Abstract:

There is a belief that Building Information Modelling (BIM) will improve the performance of the New Zealand (NZ) Architecture, Engineering and Construction (AEC) sector; however, widespread use of BIM is yet to be seen. Previous research indicates there are many issues affecting the uptake of BIM in NZ; nevertheless, the underlying benefits, drawbacks, and barriers preventing more widespread uptake are not fully understood. This investigation aimed to understand these factors more clearly and make suggestions on how to improve the uptake of BIM in NZ. Semi-structured interviews were conducted with a range of industry professionals to gather a qualitative understanding. Findings indicated that the ability to incorporate better information into a BIM model could drive many benefits; however, scepticism and a lack of positive incentives in NZ are limiting its widespread use. The study concluded that there is a need for the government to produce a number of BIM case studies and to develop a set of BIM standards to resolve payment issues surrounding BIM use. This study provides useful information for those interested in BIM and for members of government interested in improving the performance of the construction industry. This study may also be of interest to small, developed countries such as NZ where the level of BIM maturity is relatively low.

Keywords: BIM, New Zealand, AEC sector, building information modelling

Procedia PDF Downloads 513
763 A 5G Architecture Based on Dynamic Vehicular Clustering Enhancing VoD Services over Vehicular Ad Hoc Networks

Authors: Lamaa Sellami, Bechir Alaya

Abstract:

Nowadays, video-on-demand (VoD) applications are becoming one of the main trends driving vehicular network usage. In this paper, considering the unpredictable vehicle density, the unexpected acceleration or deceleration of the different cars included in the vehicular traffic load, and the limited radio range of the employed communication scheme, we introduce the “Dynamic Vehicular Clustering” (DVC) algorithm as a new scheme for video streaming systems over VANETs. The proposed algorithm takes advantage of the concept of small cells and the introduction of wireless backhauls, inspired by the features and performance of the Long Term Evolution (LTE)-Advanced network. The proposed clustering algorithm considers multiple characteristics, such as the vehicle’s position and acceleration, to reduce latency and packet loss. Each cluster is therefore treated as a small cell containing vehicular nodes and an access point that is elected according to particular specifications.

Keywords: video-on-demand, vehicular ad-hoc network, mobility, vehicular traffic load, small cell, wireless backhaul, LTE-advanced, latency, packet loss

Procedia PDF Downloads 134
762 A Comprehensive Metamodel of an Urbanized Information System: Experimental Case

Authors: Leila Trabelsi

Abstract:

The urbanization of Information Systems (IS) is an effective approach to master the complexity of the organization. It strengthens the coherence of the IS and aligns it with the business strategy. Moreover, this approach has significant advantages, such as reducing Information Technology (IT) costs, enhancing the position of the IS in a competitive environment and ensuring the scalability of the IS through the integration of technological innovations. Therefore, urbanization is considered a strategic business decision, and embedding it becomes a necessity in order to improve IS practice. However, there is a lack of experimental cases studying the meta-modelling of the Urbanized Information System (UIS). This paper proposes a new urbanization content meta-model that permits modelling, testing and taking organizational aspects into consideration. This methodological framework is structured according to two main abstraction levels, a conceptual level and an operational level. For each of these levels, different models are proposed and presented. The proposed model has been empirically tested on a company. The findings of this paper present an experimental study of the urbanization meta-model. The paper points out the significant relationships between dimensions and their evolution.

Keywords: urbanization, information systems, enterprise architecture, meta-model

Procedia PDF Downloads 434
761 The Results of the Archaeological Excavations at the Site of Qurh in Al Ula Region

Authors: Ahmad Al Aboudi

Abstract:

The Department of Archaeology at King Saud University has conducted long-term excavations since 2004 at the archaeological site of Qurh in the Al-Ula area. The history of the site goes back to the eighth century AD. The main aim of the excavations is to train students in archaeological fieldwork and the associated scientific skills of exploring, surveying, classifying, documenting and other skills necessary in field archaeology. During the 12th season of excavations, an area of 20 × 40 m² of the site was excavated to a depth of 2-3 m. Many of the architectural features of a residential area in the northern part of the site were excavated this season; circular mud-brick walls, brick column drums and clay tiles were revealed. Additionally, many finds such as gemstones, jars, ceramic plates, metal, glass and fabric, as well as some jewellery and coins, were discovered. This paper deals with the main results of this field project, including the architectural features and phenomena and their interpretations, the classification of the excavated material culture remains, and the stratigraphy.

Keywords: Islamic architecture, Islamic art, excavations, early Islamic city

Procedia PDF Downloads 268
760 Optimization and Simulation Models Applied in Engineering Planning and Management

Authors: Abiodun Ladanu Ajala, Wuyi Oke

Abstract:

Mathematical simulation and optimization models packaged within interactive computer programs provide a common way for planners and managers to predict the behaviour of any proposed water resources system design or management policy before it is implemented. Modelling is a principal technique for predicting the behaviour of proposed infrastructural designs or management policies. Models can be developed and used to help identify specific alternative plans that best meet stated objectives. This study discusses various types of models, their development, architecture, data requirements, and applications in the field of engineering. It also outlines the advantages and limitations of each of the optimization and simulation models presented. The techniques explored in this review include dynamic programming, linear programming, fuzzy optimization, evolutionary algorithms and, finally, artificial intelligence techniques. Previous studies carried out using some of the techniques mentioned above were reviewed, and most of the results from different studies showed that optimization and simulation indeed provide viable alternatives and predictions that form a basis for decision making in building engineering structures and in engineering planning and management.

Keywords: linear programming, mutation, optimization, simulation

Procedia PDF Downloads 583
757 Hepatotoxicity Induced by the Glyphosate-Based Herbicide Baron in Albino Rats

Authors: Manal E. A Elhalwagy, Nadia Amin Abdulmajeed, Hanan S. Alnahdi, Enas N. Danial

Abstract:

Baron is a herbicide containing 48% glyphosate that is widely used in Egypt. The present study assesses the cytotoxic and genotoxic effects of Baron on the rat liver. Two groups of rats were treated orally with 1/10 LD50 (275.49 mg kg⁻¹) and 1/40 LD50 (68.86 mg kg⁻¹) glyphosate for 28 days and compared with a control group. Serum and liver tissues were taken at 14 and 28 days of treatment. Inhibition of alanine aminotransferase (ALT) and aspartate aminotransferase (AST) activities was recorded at both treatment periods, together with reductions in total serum protein (TP) and albumin (ALB). However, changes in serum acetylcholinesterase (AChE) were not significant. Elevation of the oxidative stress biomarker malondialdehyde (MDA) and decline of the detoxification biomarkers total reduced glutathione (GSH), glutathione S-transferase (GST) and superoxide dismutase (SOD) in liver tissues led to an increase in the percentage of DNA damage. Destruction of liver tissue architecture was observed. Although Baron is classified in the safe category of pesticides, repeated exposure to small doses has a highly dangerous effect.

Keywords: glyphosate, liver toxicity, oxidative stress, DNA damage, comet assay

Procedia PDF Downloads 374
758 Digital Platform of Crops for Smart Agriculture

Authors: Pascal François Faye, Baye Mor Sall, Bineta Dembele, Jeanne Ana Awa Faye

Abstract:

In agriculture, estimating crop yields is key to improving productivity and decision-making processes such as financial market forecasting and addressing food security issues. The main objective of this paper is to provide tools to predict and improve the accuracy of crop yield forecasts using machine learning (ML) algorithms such as CART, KNN and SVM. We developed a mobile app and a web app that use these algorithms for practical use by farmers. The tests show that our system (collection and deployment architecture, web application and mobile application) is operational and validates empirical knowledge on agro-climatic parameters, in addition to providing proactive decision-making support. In the experiments on the agricultural data, the performance of the ML algorithms is compared using cross-validation in order to identify the most effective ones for these data. The proposed applications demonstrate that the proposed approach is effective in predicting crop yields and provides timely and accurate responses to farmers for decision support.
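A minimal sketch of the cross-validation comparison described above using scikit-learn; the agro-climatic features and yield target are simulated placeholders, not the authors' dataset.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

# Hypothetical agro-climatic features (rainfall, temperature, soil index, ...)
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 4))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(scale=0.5, size=300)  # yield proxy

models = {
    "CART": DecisionTreeRegressor(max_depth=5),
    "KNN": KNeighborsRegressor(n_neighbors=7),
    "SVM": SVR(kernel="rbf", C=10.0),
}

# 5-fold cross-validation to identify the most effective algorithm
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean R2 = {scores.mean():.3f} (+/- {scores.std():.3f})")
```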

Keywords: prediction, machine learning, artificial intelligence, digital agriculture

Procedia PDF Downloads 75
757 Transfer Learning for Protein Structure Classification at Low Resolution

Authors: Alexander Hudson, Shaogang Gong

Abstract:

Structure determination is key to understanding protein function at a molecular level. Whilst significant advances have been made in predicting structure and function from amino acid sequence, researchers must still rely on expensive, time-consuming analytical methods to visualise detailed protein conformation. In this study, we demonstrate that it is possible to make accurate (≥80%) predictions of protein class and architecture from structures determined at low (>3 Å) resolution, using a deep convolutional neural network trained on high-resolution (≤3 Å) structures represented as 2D matrices. Thus, we provide proof of concept for high-speed, low-cost protein structure classification at low resolution, and a basis for extension to prediction of function. We investigate the impact of the input representation on classification performance, showing that side-chain information may not be necessary for fine-grained structure predictions. Finally, we confirm that high-resolution, low-resolution and NMR-determined structures inhabit a common feature space, and thus provide a theoretical foundation for boosting with single-image super-resolution.
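A minimal sketch of a 2D convolutional classifier over fixed-size distance-map matrices in PyTorch; the input size, class count, layer widths, and random tensors are illustrative assumptions, not the paper's network or data.

```python
import torch
import torch.nn as nn

N_CLASSES = 4          # e.g. structural class labels; the count is illustrative
MAP_SIZE = 128         # distance maps cropped/padded to a fixed size

# Small CNN over single-channel residue-residue distance maps
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * (MAP_SIZE // 4) ** 2, N_CLASSES),
)

# One illustrative training step on random "distance map" tensors
maps = torch.rand(8, 1, MAP_SIZE, MAP_SIZE)
labels = torch.randint(0, N_CLASSES, (8,))
loss = nn.CrossEntropyLoss()(model(maps), labels)
loss.backward()
print(f"batch loss: {loss.item():.3f}")
```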

Keywords: transfer learning, protein distance maps, protein structure classification, neural networks

Procedia PDF Downloads 130
756 Flexible Cities: A Multisided Spatial Application of Tracking Livability of Urban Environment

Authors: Maria Christofi, George Plastiras, Rafaella Elia, Vaggelis Tsiourtis, Theocharis Theocharides, Miltiadis Katsaros

Abstract:

The rapidly expanding urban areas of the world constitute a challenge of how we make the transition to "the next urbanization", which will be defined by new analytical tools and new sources of data. This paper is about the production of a spatial application, ‘FUMapp’, where space and its initiative will be available literally, in meters, but also abstractly, at a sensed level. While existing spatial applications typically focus on illustrations of the urban infrastructure, the suggested application goes beyond the existing: it investigates how our perception of the environment adapts to alterations of the built environment through the construction of a dataset of biophysical measurements (eye-tracking, heart rate) and physical metrics (spatial characteristics, size of stimuli, rhythm of mobility). It explores the intersections between architecture, cognition, and computing where future design can be improved, and identifies the flexibility and livability of the ‘available space’ of specific examined urban paths.

Keywords: biophysical data, flexibility of urban, livability, next urbanization, spatial application

Procedia PDF Downloads 139
755 Implementation and Demonstration of Software-Defined Traffic Grooming

Authors: Lei Guo, Xu Zhang, Weigang Hou

Abstract:

Since the traditional network is closed and has no architecture for creating applications, it has been unable to evolve with changing demands under the rapid innovation in services. Additionally, due to the lack of a whole-network profile, the quality of service cannot be well guaranteed in the traditional network. The Software Defined Network (SDN) utilizes global resources to support on-demand applications/services via open, standardized and programmable interfaces. In this paper, we implement the traffic grooming application in a real SDN environment, and the corresponding analysis is made. In our SDN: 1) we use the OpenFlow protocol to control the entire network through software applications running on the network operating system; 2) several virtual switches are combined into the data forwarding plane through Open vSwitch; 3) an OpenFlow controller, NOX, is involved as a logically centralized control plane that dynamically configures the data forwarding plane; 4) traffic grooming based on SDN is demonstrated by dynamically modifying the idle time of flow entries. The experimental results demonstrate that the SDN-based traffic grooming effectively reduces the end-to-end delay, and the improvement ratio reaches 99%.

Keywords: NOX, OpenFlow, Software Defined Network (SDN), traffic grooming

Procedia PDF Downloads 247
754 ROOP: Translating Sequential Code Fragments to Distributed Code Fragments Using Deep Reinforcement Learning

Authors: Arun Sanjel, Greg Speegle

Abstract:

Every second, massive amounts of data are generated, and Data Intensive Scalable Computing (DISC) frameworks have evolved into effective tools for analyzing such massive amounts of data. Since the underlying architecture of these distributed computing platforms is often new to users, building a DISC application can often be time-consuming and prone to errors. The automated conversion of a sequential program to a DISC program will consequently significantly improve productivity. However, synthesizing a user’s intended program from an input specification is complex, with several important applications, such as distributed program synthesis and code refactoring. Existing works such as Tyro and Casper rely entirely on deductive synthesis techniques or similar program synthesis approaches. Our approach is to develop a data-driven synthesis technique to identify sequential components and translate them into equivalent distributed operations. We emphasize using reinforcement learning and unit testing as feedback mechanisms to achieve our objectives.

Keywords: program synthesis, distributed computing, reinforcement learning, unit testing, DISC

Procedia PDF Downloads 95
753 Syllogistic Reasoning with 108 Inference Rules While Case Quantities Change

Authors: Mikhail Zarechnev, Bora I. Kumova

Abstract:

A syllogism is a deductive inference scheme used to derive a conclusion from a set of premises. In a categorical syllogism, there are only two premises, and every premise and conclusion is given in the form of a quantified relationship between two objects. The different orders of objects in the premises give a classification known as figures. We have shown that the ordered combinations of 3 generalized quantifiers with the figures provide a total of 108 syllogistic moods, which can be considered as different inference rules. The classical syllogistic system allows human thought to be modelled, and reasoning with syllogistic structures has always attracted the attention of cognitive scientists. Since automated reasoning is considered part of the learning subsystem of AI agents, the syllogistic system can be applied in this context. Another application of the syllogistic system is related to inference mechanisms in Semantic Web applications. In this paper, we propose a mathematical model and algorithm for syllogistic reasoning. A model of iterative syllogistic reasoning for continuous flows of incoming data, based on case-based reasoning, and possible applications of the proposed system are also discussed.
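A minimal sketch enumerating the 108 moods as ordered combinations of three generalized quantifiers across the four classical figures; the quantifier labels are illustrative stand-ins for the generalized quantifiers used in the paper.

```python
from itertools import product

# Three generalized quantifiers (illustrative labels) and the four classical figures
quantifiers = ["All", "Some", "No"]
figures = [1, 2, 3, 4]

# A mood = (figure, quantifier of major premise, of minor premise, of conclusion)
moods = [
    (fig, major, minor, conclusion)
    for fig in figures
    for major, minor, conclusion in product(quantifiers, repeat=3)
]

print(len(moods))   # 4 figures x 3^3 quantifier combinations = 108
print(moods[:3])    # e.g. (1, 'All', 'All', 'All'), ...
```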

Keywords: categorical syllogism, case-based reasoning, cognitive architecture, inference on the semantic web, syllogistic reasoning

Procedia PDF Downloads 409
752 Exploring the Motivations That Drive Paper Use in Clinical Practice Post-Electronic Health Record Adoption: A Nursing Perspective

Authors: Sinead Impey, Gaye Stephens, Lucy Hederman, Declan O'Sullivan

Abstract:

Continued paper use in the clinical area post-Electronic Health Record (EHR) adoption is regularly linked to hardware and software usability challenges. Although paper is used as a workaround to circumvent challenges, including limited availability of a computer, this perspective does not consider the important role paper, such as the nurses’ handover sheet, plays in practice. The purpose of this study is to confirm the hypothesis that paper use post-EHR adoption continues because paper provides both a cognitive tool (that assists with workflow) and a compensation tool (to circumvent usability challenges). Distinguishing the different motivations for continued paper use could assist future evaluations of electronic record systems. Methods: Qualitative data were collected from three clinical care environments (ICU, general ward and specialist day-care) which had used an electronic record for at least 12 months. Data were collected through semi-structured interviews with 22 nurses. Data were transcribed, themes were extracted using an inductive bottom-up coding approach, and a thematic index was constructed. Findings: All nurses interviewed continued to use paper post-EHR adoption. While two distinct motivations for paper use post-EHR adoption were confirmed by the data - paper as a cognitive tool and paper as a compensation tool - a further finding was that there was an overlap between the two uses. That is, paper used as a compensation tool could also be adapted to function as a cognitive aid due to its nature (easy to access and annotate), or vice versa. Rather than present paper persistence as having two distinct motivations, it is more useful to describe it as sitting on a continuum with compensation tool and cognitive tool at either pole. Paper as a cognitive tool refers to pages such as the nurses’ handover sheet. These did not form part of the patient’s record, although information could be transcribed from one to the other. Findings suggest that although the patient record was digitised, handover sheets did not fall within this remit. These personal pages continued to be useful post-EHR adoption for capturing personal notes or patient information and so continued to be incorporated into the nurses’ work. Comparatively, paper used as a compensation tool, such as pre-printed care plans stored in the patient's record, appears to have been instigated in reaction to usability challenges. In these instances, it is expected that paper use could reduce or cease when the underlying problem is addressed. There is a danger that, as paper affords nurses a temporary information platform that is mobile and easy to access and annotate, its use could become embedded in clinical practice. Conclusion: Paper presents a utility to nursing, either as a cognitive or compensation tool or a combination of both. By fully understanding its utility and nuances, organisations can avoid evaluating all incidences of paper use (post-EHR adoption) as arising from usability challenges. Instead, suitable remedies for paper persistence can be targeted at the root cause.

Keywords: cognitive tool, compensation tool, electronic record, handover sheet, nurse, paper persistence

Procedia PDF Downloads 427
751 Self-Supervised Pretraining on Sequences of Functional Magnetic Resonance Imaging Data for Transfer Learning to Brain Decoding Tasks

Authors: Sean Paulsen, Michael Casey

Abstract:

In this work we present a self-supervised pretraining framework for transformers on functional Magnetic Resonance Imaging (fMRI) data. First, we pretrain our architecture on two self-supervised tasks simultaneously to teach the model a general understanding of the temporal and spatial dynamics of human auditory cortex during music listening. Our pretraining results are the first to suggest a synergistic effect of multitask training on fMRI data. Second, we finetune the pretrained models and train additional fresh models on a supervised fMRI classification task. We observe significantly improved accuracy on held-out runs with the finetuned models, which demonstrates the ability of our pretraining tasks to facilitate transfer learning. This work contributes to the growing body of literature on transformer architectures for pretraining and transfer learning with fMRI data, and serves as a proof of concept for our pretraining tasks and multitask pretraining on fMRI data.

Keywords: transfer learning, fMRI, self-supervised, brain decoding, transformer, multitask training

Procedia PDF Downloads 85
750 Anomalous Behaviors of Visible Luminescence from Graphene Quantum Dots

Authors: Hyunho Shin, Jaekwang Jung, Jeongho Park, Sungwon Hwang

Abstract:

For the application of graphene quantum dots (GQDs) to optoelectronic nanodevices, it is of critical importance to understand the mechanisms which result in the novel phenomena of their light absorption/emission. The optical transitions are known to be available up to ~6 eV in GQDs, which is especially useful for ultraviolet (UV) photodetectors (PDs). Here, we present size-dependent shape/edge-state variations of GQDs and visible photoluminescence (PL) showing anomalous size dependencies. As the average size (da) of the GQDs is varied from 5 to 35 nm, the peak energy of the absorption spectra monotonically decreases, while that of the visible PL spectra unusually shows nonmonotonic behavior with a minimum at a diameter of ∼17 nm. The PL behavior can be attributed to a novel feature of GQDs, that is, the circular-to-polygonal-shape and corresponding edge-state variations of GQDs at a diameter of ∼17 nm as the GQD size increases, as demonstrated by high-resolution transmission electron microscopy. We believe that such a comprehensive scheme for designing device architecture and the structural formulation of GQDs provides a basis for the practical realization of environmentally benign, high-performance flexible devices in the future.

Keywords: graphene, quantum dot, size, photoluminescence

Procedia PDF Downloads 286
749 Organization of the Olfactory System and the Mushroom Body of the Weaver Ant, Oecophylla smaragdina

Authors: Rajashekhar K. Patil, Martin J. Babu

Abstract:

Weaver ants (Oecophylla smaragdina) live in colonies that have polymorphic castes. The females, which include the queen, major workers and minor workers, are haploid. Individuals of each caste depend on olfactory cues to carry out caste-specific behaviour. In an effort to understand whether organizational differences exist to support these behavioural differences, we studied the olfactory system at the level of the sensilla on the antennae, the olfactory glomeruli and the Kenyon cells in the mushroom bodies (MB). The MBs of major and minor workers differ in size, with the major workers having relatively larger calyces and peduncles. The morphology of the different types of Kenyon cells, as revealed by rapid Golgi staining, was studied, and major workers had more dendritic arbors than minor workers. This suggests a greater degree of olfactory processing in major workers. Differences in the caste-specific arrangement of sensilla, the olfactory glomeruli and the cellular architecture of the MB indicate a developmental programme that forms the basis of differential behaviour.

Keywords: ant, oecophylla, caste, mushroom body

Procedia PDF Downloads 467
748 Conceptualizing IoT Based Framework for Enhancing Environmental Accounting By ERP Systems

Authors: Amin Ebrahimi Ghadi, Morteza Moalagh

Abstract:

This research is carried out to find out how a combination of IoT (Internet of Things) architecture and an ERP system can strengthen environmental accounting to incorporate both economic and environmental information. IoT (e.g., sensors, software, and other technologies) can be used in the company’s value chain from raw material extraction through materials processing, manufacturing, distribution, use, repair, maintenance, and disposal or recycling of products (cradle-to-grave model). The desired ERP software will then have the capability to track both midpoint and endpoint environmental impacts in a green supply chain system over the whole life cycle of a product. All of this enables environmental accounting to calculate and analyse operational environmental impacts in real time, control costs, prepare for environmental legislation and enhance the decision-making process. In this study, we have developed a model of how to use IoT devices in life cycle assessment (LCA) to gather information on emissions, energy consumption, hazards, and waste, to be processed in different modules of ERP systems in an integrated way for use in environmental accounting to achieve sustainability.

Keywords: ERP, environmental accounting, green supply chain, IOT, life cycle assessment, sustainability

Procedia PDF Downloads 168