Search results for: multivariate time series data

31973 Genetically Encoded Tool with Time-Resolved Fluorescence Readout for the Calcium Concentration Measurement

Authors: Tatiana R. Simonyan, Elena A. Protasova, Anastasia V. Mamontova, Eugene G. Maksimov, Konstantin A. Lukyanov, Alexey M. Bogdanov

Abstract:

Here, we describe two variants of calcium indicators based on the GCaMP sensitive core and the BrUSLEE fluorescent protein (GCaMP-BrUSLEE and GCaMP-BrUSLEE-145). In contrast to the conventional GCaMP6-family indicators, these fluorophores are characterized by a well-marked responsiveness of their fluorescence decay kinetics to external calcium concentration, both in vitro and in cellulo. Specifically, we show that purified GCaMP-BrUSLEE and GCaMP-BrUSLEE-145 exhibit three-component fluorescence decay kinetics, with the amplitude-normalized lifetime component (t3*A3) of GCaMP-BrUSLEE-145 changing four-fold (500-2000 a.u.) in response to a Ca²⁺ concentration shift in the range of 0-350 nM. Time-resolved fluorescence microscopy of live cells shows a two-fold change of the GCaMP-BrUSLEE-145 mean lifetime upon histamine-stimulated calcium release. This Ca²⁺ dependence calls for considering GCaMP-BrUSLEE-145 as a prospective Ca²⁺ indicator with signal read-out in the time domain.

Keywords: calcium imaging, fluorescence lifetime imaging microscopy, fluorescent proteins, genetically encoded indicators

Procedia PDF Downloads 133
31972 Mutual Authentication for Sensor-to-Sensor Communications in IoT Infrastructure

Authors: Shadi Janbabaei, Hossein Gharaee Garakani, Naser Mohammadzadeh

Abstract:

The Internet of Things (IoT) is a new concept whose emergence has made sensors ubiquitous in human life, so that at any time data are collected, processed and transmitted by these sensors. In order to establish a secure connection, the first challenge is authentication between sensors. This authentication, however, must also satisfy several requirements so that it is done properly: anonymity, untraceability, and being lightweight are among the issues that need to be considered. In this paper, we evaluate existing authentication protocols and analyze the security vulnerabilities found in them. Then an improved lightweight authentication protocol for sensor-to-sensor communications is presented, which uses only a hash function and logical operators. The analysis of the protocol shows that the security requirements have been met and that the protocol is resistant against various attacks. Finally, by decreasing the number of costly computations, it is argued that the protocol is lighter than its predecessors.
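
As a rough illustration of what such a hash-and-XOR handshake can look like, the Python sketch below implements a generic lightweight challenge-response mutual authentication between two sensors sharing a secret key. It is not the paper's actual protocol: the message layout, pseudonym construction, and nonce sizes are assumptions made for the example.

```python
# Illustrative sketch (not the paper's actual protocol): a lightweight
# challenge-response mutual authentication between two sensors that share a
# secret key, built only from a hash function and XOR. Fresh random nonces
# provide untraceability and a hashed pseudonym hides the real identity.
import hashlib
import os

def h(*parts: bytes) -> bytes:
    """SHA-256 over the concatenation of the given byte strings."""
    return hashlib.sha256(b"".join(parts)).digest()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

shared_key = os.urandom(32)                 # pre-shared between sensors A and B

# Sensor A: open the session with a masked nonce and a pseudonym.
nonce_a = os.urandom(32)
pseudo_a = h(shared_key, nonce_a)           # changes every session (anonymity)
msg1 = (pseudo_a, xor(nonce_a, h(shared_key)))

# Sensor B: recover the nonce, verify A, and answer the challenge.
recovered_a = xor(msg1[1], h(shared_key))
assert h(shared_key, recovered_a) == msg1[0], "sensor A not authentic"
nonce_b = os.urandom(32)
msg2 = (xor(nonce_b, h(shared_key, recovered_a)),
        h(shared_key, recovered_a, nonce_b))            # B's proof of the key

# Sensor A: verify B and close the handshake.
recovered_b = xor(msg2[0], h(shared_key, nonce_a))
assert h(shared_key, nonce_a, recovered_b) == msg2[1], "sensor B not authentic"
msg3 = h(shared_key, recovered_b, nonce_a)              # A's proof of the key

# Sensor B: final check completes mutual authentication.
assert h(shared_key, nonce_b, recovered_a) == msg3
print("mutual authentication succeeded")
```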

Keywords: anonymity, authentication, Internet of Things, lightweight, untraceability

Procedia PDF Downloads 270
31971 Activity-Based Safety Assessment of Real Estate Projects in Western India

Authors: Patel Parul, Harsh Ganvit

Abstract:

The construction industry is the second largest provider of employment in India after agriculture. In developing countries like India, many construction projects are coming up to meet demand. On the one hand, construction projects are increasing; on the other hand, construction companies are still struggling with many problems. One of the major problems is ensuring safe working conditions at the construction site. Due to a lack of safety awareness and ignorance of safety aspects, fatal accidents are very common at construction sites in India. One of the key success factors for construction projects is “Accident-Free Construction Projects”. Construction projects can be divided into various categories, such as infrastructure projects, industrial construction and real estate construction. Real estate projects mainly comprise commercial and residential projects. In the construction industry, the private sector plays a huge role in urban and rural development and also contributes significantly to the growth of the nation. Infrastructure and industrial projects are mainly executed by well-qualified construction contractors. For such projects, ensuring safety at construction sites is inevitable and is usually one of the major clauses of the contract documents as well. These projects are monitored from time to time by national agencies and researchers, too. However, real estate projects are rarely monitored for safety aspects. No systematic contract system is followed for these projects, and safety is their most neglected aspect. In the current research, an attempt is made to carry out safety audits of about 75 real estate projects. The objective of this work is to collect an activity-based safety survey of real estate projects in western India. The analysis of activity-based safety implementation for real estate projects is discussed in the present work. The activities are divided into three categories based on the data collected. The findings of this work will help local monitoring authorities to implement a safety management plan for real estate projects.

Keywords: construction safety, safety assessment, activity-based safety, real estate projects

Procedia PDF Downloads 36
31970 Using Statistical Significance and Prediction to Test Long/Short Term Public Services and Patients' Cohorts: A Case Study in Scotland

Authors: Raptis Sotirios

Abstract:

Health and social care (HSc) services planning and scheduling are facing unprecedented challenges due to pandemic pressure and also suffer from unplanned spending that is negatively impacted by the global financial crisis. Data-driven approaches can help to improve policies and to plan and design service provision schedules, using algorithms that assist healthcare managers in facing unexpected demands with fewer resources. The paper discusses services packing using statistical significance tests and machine learning (ML) to evaluate demand similarity and coupling. This is achieved by predicting the range of the demand (class) using ML methods such as CART, random forests (RF), and logistic regression (LGR). The significance tests, the Chi-squared test and the Student's t-test, are applied to data over a 39-year span for which HSc services data exist for services delivered in Scotland. The demands are probabilistically associated through statistical hypotheses that assume, as a null hypothesis, that the target service's demands are statistically dependent on other demands; this linkage can be confirmed or not by the data. Complementarily, ML methods are used to linearly predict the above target demands from the statistically found associations and to extend the linear dependence of the target's demand to independent demands, thus forming groups of services. Statistical tests confirm the ML couplings, making the prediction also statistically meaningful and proving that a target service can be matched reliably to other services, while ML shows that these indicated relationships can also be linear ones. Zero padding was used for missing years' records and illustrated such relationships better, both for limited years and over the entire span, offering long-term data visualizations, while limited-year groups explained how well patient numbers can be related over short periods or can change over time, as opposed to behaviors across more years. The prediction performance of the associations is measured using the Receiver Operating Characteristic (ROC) AUC and ACC metrics as well as the statistical tests, Chi-squared and Student's t. Co-plots and comparison tables for RF, CART, and LGR, as well as p-values and Information Exchange (IE), are provided, showing the specific behavior of the ML methods and of the statistical tests, and the behavior using different learning ratios. The impact of k-NN, cross-correlation, and C-Means first groupings is also studied over limited years and the entire span. It was found that CART was generally behind RF and LGR, but in some interesting cases, LGR reached an AUC of 0, falling below CART, while the ACC was as high as 0.912, showing that ML methods can be confused by padding or by data irregularities or outliers. On average, 3 linear predictors were sufficient; LGR was found to compete well with RF, and CART followed with the same performance at higher learning ratios. Services were packed only when the significance level (p-value) of their association coefficient was more than 0.05. Social-factor relationships were observed between home care services and treatment of old people, birth weights, alcoholism, drug abuse, and emergency admissions. The work found that different HSc services can be well packed as plans of limited years, across various service sectors and learning configurations, as confirmed using statistical hypotheses.
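
To make the two-stage idea concrete, the Python sketch below runs a Chi-squared association test between a target demand class and a candidate service, then trains RF, LGR and CART classifiers scored by ROC AUC and ACC. The synthetic 39-year demand series, service names, and median-based class split are illustrative assumptions, not the Scottish data used in the paper.

```python
# Synthetic end-to-end sketch of the two-stage idea: (1) a Chi-squared test
# checks whether the target demand class is associated with another service's
# class; (2) RF, LGR and CART predict the target class, scored by ROC AUC/ACC.
# The 39-year demand series, service names and median split are illustrative.
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score, accuracy_score

rng = np.random.default_rng(0)
years = 39
df = pd.DataFrame({"home_care": rng.poisson(200, years),
                   "emergency_admissions": rng.poisson(500, years)})
# Target demand loosely coupled to the candidate services (synthetic link).
df["old_people_treatment"] = (0.5 * df["home_care"]
                              + 0.1 * df["emergency_admissions"]
                              + rng.normal(0, 10, years))

# Discretise each demand into a class (above / below its median).
classes = df.apply(lambda s: (s > s.median()).astype(int))

# Stage 1: statistical association between target and candidate demand classes.
table = pd.crosstab(classes["old_people_treatment"], classes["home_care"])
chi2, p_value, _, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p-value = {p_value:.3f}")

# Stage 2: ML prediction of the target class from the other demands.
X, y = classes[["home_care", "emergency_admissions"]], classes["old_people_treatment"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=1)
for name, model in [("RF", RandomForestClassifier(random_state=1)),
                    ("LGR", LogisticRegression()),
                    ("CART", DecisionTreeClassifier(random_state=1))]:
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    acc = accuracy_score(y_te, model.predict(X_te))
    print(f"{name}: AUC = {auc:.3f}, ACC = {acc:.3f}")
```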

Keywords: class, cohorts, data frames, grouping, prediction, probability, services

Procedia PDF Downloads 210
31969 Control Scheme for Single-Stage Boost Inverter for Grid-Connected Photovoltaic

Authors: Mohammad Reza Ebrahimi, Behnaz Mahdaviani

Abstract:

Renewable sources such as photovoltaics are increasingly deployed in response to environmental pollution. Because photovoltaic panels generate power at low voltage, the generated voltage must first be increased. Distributed generation usually injects its power into the AC grid, so after the voltage is boosted, an inverter is needed to convert the DC power to AC power. This results in the use of two cascaded converters, which increases cost and complexity and lowers efficiency. In this paper, a single-stage inverter is utilized to boost and invert in one stage. Control of this scheme is easier, and its initial cost is lower compared to conventional double-stage inverters. A simple control scheme is used to control the active power while minimizing the total harmonic distortion (THD) of the injected current. Simulations in MATLAB demonstrate better outputs compared with conventional approaches.
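
As a small companion example, the Python sketch below shows one way the THD of the injected current could be computed from sampled simulation output via the FFT. The 50 Hz waveform and harmonic amplitudes are made up for illustration and do not come from the paper's MATLAB model.

```python
# Illustrative calculation of the THD of an injected current waveform from
# sampled data via the FFT; the 50 Hz signal and its small 3rd/5th harmonics
# are synthetic stand-ins for MATLAB simulation output.
import numpy as np

f0, fs, cycles = 50.0, 10_000.0, 10            # fundamental, sample rate, periods
t = np.arange(int(cycles * fs / f0)) / fs      # integer number of periods
current = (10.0 * np.sin(2 * np.pi * f0 * t)           # fundamental, 10 A peak
           + 0.4 * np.sin(2 * np.pi * 3 * f0 * t)      # 3rd harmonic
           + 0.2 * np.sin(2 * np.pi * 5 * f0 * t))     # 5th harmonic

spectrum = np.abs(np.fft.rfft(current))
fund_bin = int(round(f0 * len(current) / fs))          # FFT bin of the fundamental

# THD = sqrt(sum of squared harmonic magnitudes) / fundamental magnitude
harmonic_bins = [k * fund_bin for k in range(2, 26) if k * fund_bin < len(spectrum)]
thd = np.sqrt(np.sum(spectrum[harmonic_bins] ** 2)) / spectrum[fund_bin]
print(f"THD = {100 * thd:.2f} %")              # about 4.47 % for these amplitudes
```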

Keywords: maximum power point tracking, boost inverter, control strategy, three phase inverter

Procedia PDF Downloads 349
31968 Design and Development of Data Visualization in 2D and 3D Space Using Front-End Technologies

Authors: Sourabh Yaduvanshi, Varsha Namdeo, Namrata Yaduvanshi

Abstract:

This study delves into the design and development intricacies of crafting detailed 2D bar charts via d3.js, recognizing its limitations in generating 3D visuals within the DOM. The study combines three.js with d3.js, facilitating a smooth evolution from 2D to immersive 3D representations. This fusion epitomizes the synergy between front-end technologies, expanding horizons in data visualization. Beyond technical expertise, it symbolizes a creative convergence, pushing boundaries in visual representation. The abstract illuminates methodologies, unraveling the intricate integration of this fusion and guiding enthusiasts. It narrates a compelling story of transcending 2D constraints, propelling data visualization into captivating three-dimensional realms, and igniting creativity in front-end visualization endeavors.

Keywords: design, development, front-end technologies, visualization

Procedia PDF Downloads 55
31967 Removal of Vanadium from Industrial Effluents by Natural Ion Exchanger

Authors: Shashikant R. Kuchekar, Haribhau R. Aher, Priti M. Dhage

Abstract:

The removal of vanadium from aqueous solution using a natural ion exchanger was investigated. The effects of pH, contact time and exchanger dose were studied at ambient temperature (25 °C ± 2 °C). The equilibrium process was described by the Langmuir isotherm model with its adsorption capacity for vanadium. The natural exchanger, i.e., Tamarindus seed powder, was treated with formaldehyde and sulphuric acid to increase its adsorptivity for metals. The maximum exchange level attained was 80.1% at pH 3 with an exchanger dose of 5 g and a contact time of 60 min. The method was applied to the removal of vanadium from industrial effluents.

Keywords: industrial effluent, natural ion exchange, Tamarindus indica, vanadium

Procedia PDF Downloads 229
31966 Evaluation of Gingival Hyperplasia Caused by Medications

Authors: Ilma Robo, Saimir Heta, Greta Plaka, Vera Ostreni

Abstract:

Purpose: Drug-induced gingival hyperplasia is an uncommon pathology encountered during routine work in dental units. The purpose of this paper is to present the clinical appearance of gingival hyperplasia caused by medications. Three classes of medications are known to cause hyperplasia, and based on data from the literature, the clinical cases encountered and included in this study have been compared. Materials and Methods: The study was conducted on a total of 311 patients, of whom 182 patients met the inclusion criteria and were included in our study. After each patient's history was recorded and it was established that the patients were aware of their chronic illness and undergoing treatment with drugs associated with gingival hypertrophy, a clinical examination of the oral cavity was performed, with assessment by vertical and horizontal evaluation according to the periodontal indexes. Results: From the data collected during the study, it was observed that 97% of patients with gingival hyperplasia were treated with nifedipine. In 84% of patients treated with the selected medicines, gingival hyperplasia in the oral cavity had been present for a period of more than 1 year and 1 month. According to the GOI, about 21% of patients are in the first rank of this index, 52% in the second rank, 24% in the third rank and 3% in the fourth rank. According to the horizontal growth index of gingival hyperplasia, grade 1 included about 61% of patients and grade 2 included about 39% of patients with gingival hyperplasia. The bacterial index divides patients by degree: grade 0 - 8.2%, grade 1 - 32.4%, grade 2 - 14% and grade 3 - 45.1%. Conclusions: The highest percentage of gingival hyperplasia caused by drugs is due to dosing of nifedipine applied as systemic treatment for more than 1 year.

Keywords: drug gingival hyperplasia, horizontal growth index, vertical growth index

Procedia PDF Downloads 163
31965 Welding Process Selection for Storage Tank by Integrated Data Envelopment Analysis and Fuzzy Credibility Constrained Programming Approach

Authors: Rahmad Wisnu Wardana, Eakachai Warinsiriruk, Sutep Joy-A-Ka

Abstract:

Selecting the most suitable welding process usually depends on experience or on common practice in similar companies. However, this approach generally ignores many criteria that can affect the selection of a suitable welding process. Therefore, knowledge automation through knowledge-based systems will significantly improve the decision-making process. This research proposes an integrated data envelopment analysis (DEA) and fuzzy credibility constrained programming approach for identifying the best welding process for a stainless steel storage tank in the food and beverage industry. The proposed approach uses the fuzzy concept and a credibility measure to deal with uncertain data from experts' judgment. Furthermore, 12 parameters are used to determine the most appropriate welding process among six competitive welding processes.
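
The DEA building block of such an approach can be sketched as a small linear program per candidate process. The Python example below solves the input-oriented CCR multiplier model with scipy; the fuzzy credibility constrained programming layer is omitted, and the two inputs and two outputs per process are illustrative placeholders rather than the study's 12 criteria.

```python
# Sketch of the DEA building block only: the input-oriented CCR multiplier
# model solved as one linear program per candidate welding process. The fuzzy
# credibility constrained programming layer is omitted, and the two inputs and
# two outputs per process are placeholders, not the study's 12 criteria.
import numpy as np
from scipy.optimize import linprog

X = np.array([[4.0, 3.0],      # inputs per process, e.g. cost, welding time
              [2.5, 4.5],
              [5.0, 2.0]])
Y = np.array([[8.0, 6.0],      # outputs per process, e.g. quality, deposition rate
              [7.0, 7.5],
              [6.5, 5.0]])
n, m = X.shape                 # number of processes (DMUs), number of inputs
s = Y.shape[1]                 # number of outputs

for o in range(n):
    # decision variables z = [u (output weights), v (input weights)]
    c = np.concatenate([-Y[o], np.zeros(m)])          # maximise u . y_o
    A_ub = np.hstack([Y, -X])                         # u . y_j - v . x_j <= 0
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)   # v . x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m), method="highs")
    print(f"process {o}: CCR efficiency = {-res.fun:.3f}")
```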

Keywords: welding process selection, data envelopment analysis, fuzzy credibility constrained programming, storage tank

Procedia PDF Downloads 148
31964 A Mathematical Model of Blood Perfusion Dependent Temperature Distribution in Transient Case in Human Dermal Region

Authors: Yogesh Shukla

Abstract:

Many attempts have been made to study the temperature distribution problem in human tissues under normal environmental and physiological conditions at constant arterial blood temperature. However, very few attempts have been made to investigate the temperature distribution in human tissues under different arterial blood temperatures. In view of the above, a finite element model has been developed to study the unsteady temperature distribution in the dermal region of the human body. The model has been developed for the one-dimensional unsteady-state case. The variation of parameters like thermal conductivity, blood mass flow and metabolic activity with respect to position and time has been incorporated in the model, and appropriate boundary conditions have been framed. The central difference approach has been used for the space variable, and the trapezoidal rule has been employed along the time variable. Numerical results have been obtained to study the relationship between temperature and time.
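
A minimal numerical sketch of that discretization (central differences in space, trapezoidal rule in time, i.e. a Crank-Nicolson step) applied to the 1-D Pennes bioheat equation is given below in Python. The tissue parameters, Dirichlet boundary values, and the sinusoidally varying arterial blood temperature are assumptions chosen only to make the example run, not the paper's values.

```python
# Minimal sketch of the scheme described above for the 1-D Pennes bioheat
# equation: central differences in space, trapezoidal (Crank-Nicolson) rule in
# time. Tissue parameters, Dirichlet boundary values and the time-varying
# arterial blood temperature are assumptions chosen only to make it run.
import numpy as np

rho_c = 3.0e6        # tissue volumetric heat capacity, J/(m^3 K)
K = 0.5              # thermal conductivity, W/(m K)
wb_cb = 2.0e3        # blood perfusion term w_b*c_b, W/(m^3 K)
q_m = 420.0          # metabolic heat generation, W/m^3
L, N = 0.03, 31      # 3 cm dermal slab, number of grid points
dx, dt, steps = L / (N - 1), 1.0, 600      # 1 s time step, 10 minutes simulated

def T_arterial(t):   # time-dependent arterial blood temperature (deg C)
    return 37.0 + 0.5 * np.sin(2 * np.pi * t / 600.0)

T = np.full(N, 37.0)               # initial tissue temperature
T_core, T_skin = 37.0, 25.0        # Dirichlet boundaries: body core / skin surface

alpha = K * dt / (2 * rho_c * dx ** 2)
beta = wb_cb * dt / (2 * rho_c)

# Constant system matrix for the implicit half of the Crank-Nicolson step.
A = np.zeros((N, N))
for i in range(1, N - 1):
    A[i, i - 1] = A[i, i + 1] = -alpha
    A[i, i] = 1 + 2 * alpha + beta
A[0, 0] = A[-1, -1] = 1.0          # boundary rows enforce the Dirichlet values

for step in range(steps):
    Ta = T_arterial(step * dt)
    b = np.empty(N)
    b[1:-1] = ((1 - 2 * alpha - beta) * T[1:-1] + alpha * (T[2:] + T[:-2])
               + 2 * beta * Ta + q_m * dt / rho_c)
    b[0], b[-1] = T_core, T_skin
    T = np.linalg.solve(A, b)

print("temperature profile after 10 min (deg C):", np.round(T[::5], 2))
```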

Keywords: rate of metabolism, blood mass flow rate, thermal conductivity, heat generation, finite element method

Procedia PDF Downloads 334
31963 A Review: Artificial Intelligence (AI) Driven User Access Management and Identity Governance

Authors: Rupan Preet Kaur

Abstract:

This article reviewed the potential of artificial intelligence in the fields of identity and access management (IAM) and identity governance and administration (IGA), the most critical pillars of any organization. The power of leveraging AI in the most complex and largest user-base environments was outlined: simplifying and streamlining user access approvals and re-certifications without any impact on user productivity while, at the same time, strengthening the overall compliance of the IAM landscape. Certain challenges encountered in the current state were detailed, where the majority of organizations are still lacking maturity in the data integrity aspect. Finally, this paper concluded that, within the realm of possibility, users and application owners can reap the benefits of the unified approach provided by AI to improve the user experience, improve overall efficiency, and strengthen the risk posture.

Keywords: artificial intelligence, machine learning, user access review, access approval

Procedia PDF Downloads 71
31962 On the Dwindling Supply of the Observable Cosmic Microwave Background Radiation

Authors: Jia-Chao Wang

Abstract:

The cosmic microwave background radiation (CMB) freed during the recombination era can be considered as a photon source of small duration; a one-time event that happened everywhere in the universe simultaneously. If space is divided into concentric shells centered at an observer’s location, one can imagine that the CMB photons originated from the nearby shells would reach and pass the observer first, and those in shells farther away would follow as time goes forward. In the Big Bang model, space expands rapidly in a time-dependent manner as described by the scale factor. This expansion results in an event horizon coincident with one of the shells, and its radius can be calculated using cosmological calculators available online. Using Planck 2015 results, its value during the recombination era at cosmological time t = 0.379 million years (My) is calculated to be Revent = 56.95 million light-years (Mly). The event horizon sets a boundary beyond which the freed CMB photons will never reach the observer. The photons within the event horizon also exhibit a peculiar behavior. Calculated results show that the CMB observed today was freed in a shell located 41.8 Mly away (inside the boundary set by Revent) at t = 0.379 My. These photons traveled 13.8 billion years (Gy) to reach here. Similarly, the CMB reaching the observer at t = 1, 5, 10, 20, 40, 60, 80, 100 and 120 Gy is calculated to originate at shells of R = 16.98, 29.96, 37.79, 46.47, 53.66, 55.91, 56.62, 56.85 and 56.92 Mly, respectively. The results show that as time goes by, the R value approaches Revent = 56.95 Mly but never exceeds it, consistent with the earlier statement that beyond Revent the freed CMB photons will never reach the observer. The difference Revent - R can be used as a measure of the remaining observable CMB photons. Its value becomes smaller and smaller as R approaches Revent, indicating a dwindling supply of the observable CMB radiation. In this paper, detailed dwindling effects near the event horizon are analyzed with the help of online cosmological calculators based on the lambda cold dark matter (ΛCDM) model. It is demonstrated in the literature that, assuming the CMB to be a blackbody at recombination (about 3000 K), it will remain so over time under cosmological redshift and homogeneous expansion of space, but with the temperature lowered (2.725 K now). The present result suggests that the observable CMB photon density, besides changing with space expansion, can also be affected by the dwindling supply associated with the event horizon. This raises the question of whether the blackbody of the CMB at recombination can remain so over time. Being able to explain the blackbody nature of the observed CMB is an important part of the success of the Big Bang model. The present results cast some doubts on that and suggest that the model may have an additional challenge to deal with.
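
For readers who want to reproduce the flavour of these numbers, the Python sketch below numerically integrates the proper event-horizon distance R_event = a * ∫ c da' / (a'² H(a')) in a flat ΛCDM model. The Planck-like parameters and the neglect of radiation are simplifying assumptions, so the output only approximates the figures quoted above.

```python
# Rough numerical sketch of the quantity discussed above: the proper event
# horizon R_event = a * Integral_a^inf c da' / (a'^2 H(a')) in a flat LambdaCDM
# model. Planck-like parameters and the neglect of radiation are simplifying
# assumptions, so the output only approximates the figures quoted in the text.
import numpy as np
from scipy.integrate import quad

H0 = 67.7                  # Hubble constant, km/s/Mpc
Om, Ol = 0.31, 0.69        # matter and dark-energy density parameters
c = 299_792.458            # speed of light, km/s
MPC_TO_MLY = 3.2616        # million light-years per megaparsec

def H(a):
    return H0 * np.sqrt(Om * a ** -3 + Ol)

def event_horizon_mly(a):
    f = lambda ap: 1.0 / (ap ** 2 * H(ap))
    integral = quad(f, a, 1.0)[0] + quad(f, 1.0, np.inf)[0]
    return a * c * integral * MPC_TO_MLY        # proper distance in Mly

a_rec = 1.0 / 1090.0       # scale factor at recombination (z ~ 1089)
print("event horizon at recombination ~", round(event_horizon_mly(a_rec), 1), "Mly")
print("event horizon today            ~", round(event_horizon_mly(1.0)), "Mly")
```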

Keywords: blackbody of CMB, CMB radiation, dwindling supply of CMB, event horizon

Procedia PDF Downloads 105
31961 On the Estimation of Crime Rate in the Southwest of Nigeria: Principal Component Analysis Approach

Authors: Kayode Balogun, Femi Ayoola

Abstract:

Crime is at an alarming rate in this part of the world, and there are many factors contributing to this anti-societal behaviour among both the young and the old. In this work, principal component analysis (PCA) was used as a tool to reduce the dimensionality and to identify the variables that were most crime-prone in the study region. Data were collected on twenty-eight crime variables from the National Bureau of Statistics (NBS) databank for a period of fifteen years, while retaining as much of the information as possible. We use PCA in this study to determine the number of major variables and contributors to crime in Southwest Nigeria. The results of our analysis revealed that eight principal components were retained, using the scree plot and loading plot, which implies that an eight-component solution is appropriate for the data. The eight components explained 93.81% of the total variation in the data set. We also found that the most frequently committed crimes in Southwestern Nigeria were assault, grievous harm and wounding, theft/stealing, burglary, housebreaking, false pretence, unlawful arms possession and breach of public peace.
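
A compact Python sketch of this workflow (standardize, run PCA, pick the number of components from the cumulative explained variance, and inspect the loadings) is given below; the synthetic 15 x 28 matrix of counts and the generic variable names stand in for the NBS data.

```python
# Compact sketch of the PCA workflow: standardise, extract components, choose
# how many to keep from the cumulative explained variance, inspect loadings.
# The synthetic 15 x 28 matrix of counts stands in for the NBS crime data.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
variables = [f"crime_{i:02d}" for i in range(1, 29)]       # 28 crime variables
data = pd.DataFrame(rng.poisson(100, size=(15, 28)), columns=variables)

pca = PCA().fit(StandardScaler().fit_transform(data))
explained = np.cumsum(pca.explained_variance_ratio_)

# Scree-plot style decision: keep enough components for ~94% of the variance.
n_keep = int(np.searchsorted(explained, 0.94) + 1)
print(f"{n_keep} components explain {explained[n_keep - 1]:.2%} of the variance")

# Loadings of the retained components show which crime variables dominate each.
loadings = pd.DataFrame(pca.components_[:n_keep].T, index=variables,
                        columns=[f"PC{k + 1}" for k in range(n_keep)])
print(loadings.abs().idxmax())          # most influential variable per component
```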

Keywords: crime rates, data, Southwest Nigeria, principal component analysis, variables

Procedia PDF Downloads 421
31960 Bianchi Type- I Viscous Fluid Cosmological Models with Stiff Matter and Time Dependent Λ- Term

Authors: Rajendra Kumar Dubey

Abstract:

Einstein’s field equations with a variable cosmological term Λ are considered in the presence of a viscous fluid for the Bianchi type-I space-time. Exact solutions of Einstein’s field equations are obtained by assuming the cosmological term Λ to be proportional to a function of the scale factor R and a constant m. The shear viscosity is found to be responsible for the faster removal of the initial anisotropy in the universe. The physical significance of the cosmological models has also been discussed.

Keywords: Bianchi type-I cosmological model, viscous fluid, cosmological constant Λ

Procedia PDF Downloads 511
31959 Enhancing the Bionic Eye: A Real-time Image Optimization Framework to Encode Color and Spatial Information Into Retinal Prostheses

Authors: William Huang

Abstract:

Retinal prostheses are currently limited to low-resolution grayscale images that lack color and spatial information. This study develops a novel real-time image optimization framework and tools to encode maximum information for prostheses that are constrained by the number of electrodes. One key idea is to localize the main objects in images while reducing unnecessary background noise through region-contrast saliency maps. A novel color depth mapping technique was developed through MiniBatchKMeans clustering and color space selection. The resulting image was downsampled using bicubic interpolation to reduce image size while preserving color quality. In comparison to current schemes, the proposed framework demonstrated better visual quality in tested images. The use of the region-contrast saliency map showed improvements in efficacy of up to 30%. Finally, the computation time of this algorithm is less than 380 ms on tested cases, making real-time retinal prostheses feasible.
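
A rough Python sketch of such a pipeline is shown below, using OpenCV's spectral-residual saliency as a stand-in for the region-contrast saliency map, MiniBatchKMeans for the colour palette, and bicubic resizing for the electrode grid. The input file name, 8-colour palette, and 32x32 target resolution are assumptions for illustration, not the authors' settings.

```python
# Rough pipeline sketch: saliency mask -> colour quantization -> bicubic
# downsampling to an electrode-like grid. OpenCV's spectral-residual saliency
# (opencv-contrib-python) is a stand-in for the region-contrast saliency map;
# the file name, 8-colour palette and 32x32 grid are illustrative assumptions.
import cv2
import numpy as np
from sklearn.cluster import MiniBatchKMeans

image = cv2.imread("scene.jpg")                       # placeholder input image

# 1. Saliency: keep the salient object, suppress background clutter.
saliency = cv2.saliency.StaticSaliencySpectralResidual_create()
_, sal_map = saliency.computeSaliency(image)
mask = (sal_map > sal_map.mean()).astype(np.uint8)
focused = cv2.bitwise_and(image, image, mask=mask)

# 2. Colour depth mapping: cluster pixels in LAB space into a small palette.
lab = cv2.cvtColor(focused, cv2.COLOR_BGR2LAB).reshape(-1, 3)
kmeans = MiniBatchKMeans(n_clusters=8, random_state=0).fit(lab)
quantized = kmeans.cluster_centers_[kmeans.labels_].astype(np.uint8)
quantized = cv2.cvtColor(quantized.reshape(focused.shape), cv2.COLOR_LAB2BGR)

# 3. Downsample with bicubic interpolation to the electrode grid resolution.
electrode_frame = cv2.resize(quantized, (32, 32), interpolation=cv2.INTER_CUBIC)
cv2.imwrite("electrode_frame.png", electrode_frame)
```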

Keywords: retinal implants, virtual processing unit, computer vision, saliency maps, color quantization

Procedia PDF Downloads 127
31958 Entrepreneurial Venture Creation through Anchor Event Activities: Pop-Up Stores as On-Site Arenas

Authors: Birgit A. A. Solem, Kristin Bentsen

Abstract:

Scholarly attention in entrepreneurship is currently directed towards understanding entrepreneurial venture creation as a process: the journey of new economic activities from nonexistence to existence, often studied through flow or network models. To complement existing research on entrepreneurial venture creation with more interactivity-based research of organized activities, this study examines two pop-up stores as anchor events involving the on-site activities of fifteen participating entrepreneurs launching their new ventures. The pop-up stores were arranged in two middle-sized Norwegian cities and contained different brand stores that brought together actors of sub-networks and communities executing venture creation activities. The pop-up stores became on-site arenas for the entrepreneurs to create, maintain, and rejuvenate their networks, at the same time as becoming venues for the temporal coordination of activities involving existing and potential customers in their venture creation. In this work, we apply a conceptual framework based on frequently addressed dilemmas within entrepreneurship theory (discovery/creation, causation/effectuation) to further shed light on the broad aspect of on-site anchor event activities and their venture creation outcomes. The dilemma-based concepts are applied as an analytic toolkit to pursue answers regarding the nature of anchor event activities typically found within entrepreneurial venture creation and how these anchor event activities affect entrepreneurial venture creation outcomes. Our study combines researcher participation with 200 hours of observation and twenty in-depth interviews. Data analysis followed established guidelines for hermeneutic analysis and was intimately intertwined with ongoing data collection. Data were coded and categorized in NVivo 12 software and iterated several times as patterns steadily developed. Our findings suggest that the core anchor event activities typically found within entrepreneurial venture creation are: concept and product experimentation with visitors, arrangements to socialize (evening specials, auctions, and exhibitions), store-in-store concepts, arranged meeting places for peers, and close connection with the municipality and property owners. Further, this work points to four main entrepreneurial venture creation outcomes derived from the core anchor event activities: (1) venture attention, (2) venture idea-realization, (3) venture collaboration, and (4) venture extension. Our findings show that, depending on which anchor event activities are applied, the outcomes vary. Theoretically, this study offers two main implications. First, anchor event activities are both discovered and created, following the logic of causation, at the same time as being experimental, based on “learning by doing” principles of effectuation during the execution. Second, our research enriches prior studies on venture creation as a process. In this work, entrepreneurial venture creation activities and outcomes are understood through pop-up stores as on-site anchor event arenas, particularly suitable for the interactivity-based research requested by the entrepreneurship field. This study also reveals important managerial implications, such as that entrepreneurs should allow themselves to find creative physical venture creation arenas (e.g., pop-up stores, showrooms), as well as collaborate with partners when discovering and creating concepts and activities based on new ideas. In this way, they allow themselves to both strategically plan for and continually experiment with their venture.

Keywords: anchor event, interactivity-based research, pop-up store, entrepreneurial venture creation

Procedia PDF Downloads 72
31957 Prediction of Formation Pressure Using Artificial Intelligence Techniques

Authors: Abdulmalek Ahmed

Abstract:

Formation pressure is the main factor that affects the economics and efficiency of the drilling operation. Knowing the pore pressure and the parameters that affect it will help to reduce the cost of the drilling process. Many empirical models reported in the literature were used to calculate the formation pressure based on different parameters. Some of these models used only drilling parameters to estimate pore pressure; other models predicted the formation pressure based on log data. All of these models required different trends, such as normal or abnormal, to predict the pore pressure. Few researchers have applied artificial intelligence (AI) techniques to predict the formation pressure, and then only by one method or a maximum of two methods of AI. The objective of this research is to predict the pore pressure based on both drilling parameters and log data, namely: weight on bit, rotary speed, rate of penetration, mud weight, bulk density, porosity and delta sonic time. Real field data are used to predict the formation pressure using five different artificial intelligence (AI) methods: artificial neural networks (ANN), radial basis function (RBF), fuzzy logic (FL), support vector machine (SVM) and functional networks (FN). All AI tools were compared with different empirical models. The AI methods estimated the formation pressure with high accuracy (high correlation coefficient and low average absolute percentage error) and outperformed all previous models. The advantage of the new technique is its simplicity, which is reflected in its estimation of pore pressure without the need for different trends, as compared to other models, which require two different trends (normal or abnormal pressure). Moreover, by comparing the AI tools with each other, the results indicate that SVM has the advantage in pore pressure prediction due to its fast processing speed and high performance (a high correlation coefficient of 0.997 and a low average absolute percentage error of 0.14%). In the end, a new empirical correlation for formation pressure was developed using the ANN method that can estimate pore pressure with high precision (correlation coefficient of 0.998 and average absolute percentage error of 0.17%).
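
As a hedged illustration of this kind of data-driven setup, the Python sketch below regresses pore pressure on the same seven inputs using an SVM and a small neural network from scikit-learn; the synthetic well records and their value ranges are placeholders for the real field data, so the reported scores will not match the paper's.

```python
# Hedged sketch of the data-driven setup: pore pressure regressed on the seven
# drilling/log inputs with an SVM and a small neural network. The synthetic
# well records are placeholders for the real field data, so the scores here
# will not match the paper's reported values.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(7)
n = 500
X = np.column_stack([
    rng.uniform(5, 40, n),        # weight on bit (klbf)
    rng.uniform(60, 180, n),      # rotary speed (rpm)
    rng.uniform(10, 120, n),      # rate of penetration (ft/h)
    rng.uniform(9, 16, n),        # mud weight (ppg)
    rng.uniform(2.0, 2.8, n),     # bulk density (g/cc)
    rng.uniform(0.05, 0.35, n),   # porosity (fraction)
    rng.uniform(60, 140, n),      # delta sonic time (us/ft)
])
# Synthetic pore pressure loosely tied to mud weight and sonic time (psi).
y = 0.465 * X[:, 3] * 1000 + 5 * X[:, 6] + rng.normal(0, 100, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
models = {"SVM": make_pipeline(StandardScaler(), SVR(C=100.0)),
          "ANN": make_pipeline(StandardScaler(),
                               MLPRegressor(hidden_layer_sizes=(20, 10),
                                            max_iter=2000, random_state=0))}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    print(name, "R =", round(np.corrcoef(y_te, pred)[0, 1], 3),
          "AAPE % =", round(100 * mean_absolute_percentage_error(y_te, pred), 2))
```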

Keywords: artificial intelligence (AI), formation pressure, artificial neural networks (ANN), fuzzy logic (FL), support vector machine (SVM), functional networks (FN), radial basis function (RBF)

Procedia PDF Downloads 135
31956 Development of Web Application for Warehouse Management System: A Case Study of Ceramics Factory

Authors: Thanaphat Suwanaklang, Supaporn Suwannarongsri

Abstract:

Presently, there are many industries in Thailand producing various products both for domestic distribution and for export to foreign countries. The warehouse is one of the most important areas for a business needing to store its products. Such businesses need a suitable warehouse management system to reduce storage time and use the available space as much as possible. This paper proposes the development of a web application for a warehouse management system. One of the ceramics factories in Thailand is used as a case study. By applying the ABC analysis, fixed location, commodity system, ECRS, and 7-waste theories and principles, the web application for the warehouse management system of the selected ceramics factory is developed to design the optimal storage area for groups of products and to design the optimal routes for forklifts. From the experimental results, it was found that the warehouse management system developed via the web application can reduce the travel distance of forklifts and the time spent searching for a storage area by 100% when compared with the conventional method. In addition, the entire storage area can be monitored online and in real time.
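
To illustrate the ABC analysis step mentioned above, the short Python sketch below ranks products by annual consumption value and assigns A/B/C classes at the commonly used ~80%/95% cumulative-value cut-offs; the product list, demands, and unit costs are invented for the example rather than taken from the factory's records.

```python
# Minimal sketch of the ABC analysis step used in the warehouse design: rank
# products by annual consumption value and split them into A/B/C classes at
# the usual ~80%/95% cumulative-value cut-offs. Product names, demands and
# unit costs are illustrative, not the ceramics factory's records.
import pandas as pd

items = pd.DataFrame({
    "product": ["tile_A", "tile_B", "vase", "plate", "mug", "bowl"],
    "annual_demand": [12000, 8000, 300, 2500, 900, 400],
    "unit_cost": [4.0, 3.5, 25.0, 1.2, 2.0, 3.0],
})
items["annual_value"] = items["annual_demand"] * items["unit_cost"]
items = items.sort_values("annual_value", ascending=False)
items["cum_share"] = items["annual_value"].cumsum() / items["annual_value"].sum()

def classify(share: float) -> str:
    if share <= 0.80:
        return "A"            # fast movers: closest to the dispatch area
    if share <= 0.95:
        return "B"
    return "C"                # slow movers: farthest storage locations

items["class"] = items["cum_share"].apply(classify)
print(items[["product", "annual_value", "cum_share", "class"]])
```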

Keywords: warehouse management system, warehouse design method, logistics system, web application

Procedia PDF Downloads 119
31955 The Military and Motherhood: Identity and Role Expectation within Two Greedy Institutions

Authors: Maureen Montalban

Abstract:

The military is a predominantly male-dominated organisation that has entrenched hierarchical and patriarchal norms. Since 1975, women have been allowed to continue active service in the Australian Defence Force during pregnancy and after the birth of a child; prior to this time, pregnancy was grounds for automatic termination. The military and family, as institutions, make great demands on individuals with respect to their commitment, loyalty, time and energy. This research explores what it means to serve in the Australian Army as a woman through a gender lens, overlaid during a specific time period of their service; that is, during pregnancy, birth, and being a mother. It investigates the external demands faced by servicewomen who are mothers, whether it be from society, the Army, their teammates, their partners, or their children; and how they internally make sense of that with respect to their own identity and role as a mother, servicewoman, partner and as an individual. It also seeks to uncover how Australian Army servicewomen who are also mothers attempt to manage the dilemma of serving two greedy institutions when both expect and demand so much and whether this is, in fact, an impossible dilemma.

Keywords: women's health, gender studies, military culture, identity

Procedia PDF Downloads 88
31954 A Combined Error Control with Forward Euler Method for Dynamical Systems

Authors: R. Vigneswaran, S. Thilakanathan

Abstract:

Variable time-stepping algorithms for solving dynamical systems perform poorly for long-time computations which pass close to a fixed point. To overcome this difficulty, several authors have considered phase space error controls for the numerical simulation of dynamical systems. In one generalized phase space error control, a step-size selection scheme was proposed which allows this error control to be incorporated into the standard adaptive algorithm as an extra constraint at negligible extra computational cost. For this generalized error control, the forward Euler method applied to linear systems whose coefficient matrix has real and negative eigenvalues has already been analyzed. In this paper, this result is extended to linear systems whose coefficient matrix has complex eigenvalues with negative real parts. Some theoretical results were obtained, and numerical experiments were carried out to support the theoretical results.
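
The sketch below shows, in Python, the standard adaptive forward Euler baseline that such a phase-space error control would be added to, applied to a linear system whose matrix has complex eigenvalues with negative real parts. The step-doubling error estimate, tolerance, and test matrix are generic choices, and the paper's extra phase-space constraint is not reproduced.

```python
# Sketch of the adaptive forward Euler baseline that a phase-space error
# control would extend: local error estimated by step doubling, standard
# first-order step-size update, applied to a linear system whose matrix has
# complex eigenvalues with negative real parts.
import numpy as np

A = np.array([[-0.1, 1.0],
              [-1.0, -0.1]])            # eigenvalues -0.1 +/- 1j: slow spiral to 0
f = lambda y: A @ y

def euler_adaptive(y0, t_end, tol=1e-4, h=0.1):
    t, y = 0.0, np.asarray(y0, dtype=float)
    steps = 0
    while t_end - t > 1e-12:
        h = min(h, t_end - t)
        y_big = y + h * f(y)                       # one full Euler step
        y_half = y + 0.5 * h * f(y)                # two half steps
        y_small = y_half + 0.5 * h * f(y_half)
        err = np.linalg.norm(y_small - y_big)      # local error estimate
        if err <= tol:                             # accept the step
            t, y, steps = t + h, y_small, steps + 1
        # step-size update for a first-order method (exponent 1/(p+1) = 1/2)
        h *= min(2.0, max(0.2, 0.9 * np.sqrt(tol / max(err, 1e-16))))
    return steps, y

steps, y_final = euler_adaptive([1.0, 0.0], t_end=100.0)
print("accepted steps:", steps, "final state:", np.round(y_final, 5))
```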

Keywords: adaptivity, fixed point, long time simulations, stability, linear system

Procedia PDF Downloads 298
31953 Architectural Robotics in Micro Living Spaces: An Approach to Enhancing Wellbeing

Authors: Timothy Antoniuk

Abstract:

This paper will demonstrate why the most successful and livable cities in the future will require multi-disciplinary designers to develop a deep understanding of peoples’ changing lifestyles, and why new generations of deeply integrated products, services and experiences need to be created. Disseminating research from the UNEP Creative Economy Reports and through a variety of other consumption and economic-based statistics, a compelling argument will be made that it is peoples’ living spaces that offer the easiest and most significant affordances for inducing positive changes to their wellbeing, and to a city’s economic and environmental prosperity. This idea, that leveraging happiness, wellbeing and prosperity through creating new concepts and typologies of ‘home’, puts people and their needs, wants, desires, aspirations and lifestyles at the beginning of the design process, not at the end, as so often occurs with current-day multi-unit housing construction. As an important part of the creative-reflective and statistical comparisons that are necessary for this on-going body of research and practice, Professor Antoniuk created the Micro Habitation Lab (mHabLab) in 2016. By focusing on testing the functional and economic feasibility of activating small spaces with different types of architectural robotics, a variety of movable, expandable and interactive objects have been hybridized and integrated into the architectural structure of the Lab. Allowing the team to test new ideas continually and accumulate thousands of points of feedback from everyday consumers, a series of on-going open houses is allowing the public-at-large to see, physically engage with, and give feedback on the items they find most and least valuable. This iterative approach to testing has exposed two key findings: firstly, that there is a clear opportunity to improve the macro and micro functionality of small living spaces; and secondly, that allowing people to physically alter smaller elements of their living space lessens feelings of frustration and enhances feelings of pride and a deeper perception of “home”. Equally interesting to these findings is a grouping of new research questions that are being exposed, which relate to the duality of space, how people can be in two living spaces at one time, and how small living spaces are moving the Extended Home into the public realm.

Keywords: architectural robotics, extended home, interactivity, micro living spaces

Procedia PDF Downloads 150
31952 Multilevel Gray Scale Image Encryption through 2D Cellular Automata

Authors: Rupali Bhardwaj

Abstract:

Cryptography is the science of using mathematics to encrypt and decrypt data; the data are converted into some other gibberish form, and then the encrypted data are transmitted. The primary purpose of this paper is to provide two levels of security through a two-step process: rather than transmitting the message bits directly, the message is first encrypted using 2D cellular automata and then scrambled with the Arnold cat map transformation. This provides an additional layer of protection and reduces the chance of the transmitted message being detected. A comparative analysis of the effectiveness of the scrambling technique is provided using scrambling degree measurement parameters, i.e., the Gray Difference Degree (GDD) and the correlation coefficient.
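
A toy Python sketch of the two steps is given below: the grayscale image is XORed with a keystream built from a key-seeded Game of Life cellular automaton and then scrambled with the Arnold cat map. The key, image size, number of CA generations, and iteration count are illustrative choices rather than the paper's settings, and the GDD metric is not computed here.

```python
# Sketch of the two-step idea: (1) XOR the grayscale image with a keystream
# generated by a 2D cellular automaton (Game of Life evolved from a key-seeded
# grid), then (2) scramble pixel positions with the Arnold cat map.
import numpy as np

def life_step(grid):
    """One Game of Life update on a toroidal grid."""
    nbrs = sum(np.roll(np.roll(grid, dx, 0), dy, 1)
               for dx in (-1, 0, 1) for dy in (-1, 0, 1)
               if (dx, dy) != (0, 0))
    return ((nbrs == 3) | ((grid == 1) & (nbrs == 2))).astype(np.uint8)

def keystream(shape, key, steps=8):
    """Build a byte mask by stacking 8 CA generations as bit-planes."""
    grid = np.random.default_rng(key).integers(0, 2, size=shape, dtype=np.uint8)
    planes = []
    for _ in range(steps):
        grid = life_step(grid)
        planes.append(grid)
    return sum(p << b for b, p in enumerate(planes)).astype(np.uint8)

def arnold_cat(img, iterations=10):
    """Scramble an N x N image with the cat map (x, y) -> (x+y, x+2y) mod N."""
    n = img.shape[0]
    x, y = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    out = img
    for _ in range(iterations):
        out = out[(x + y) % n, (x + 2 * y) % n]
    return out

image = np.random.default_rng(1).integers(0, 256, (64, 64), dtype=np.uint8)  # stand-in image
cipher = arnold_cat(image ^ keystream(image.shape, key=2024), iterations=10)

# Correlation between adjacent pixels drops after scrambling (lower = better).
corr = np.corrcoef(cipher[:, :-1].ravel(), cipher[:, 1:].ravel())[0, 1]
print("adjacent-pixel correlation after encryption:", round(float(corr), 4))
```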

Keywords: scrambling, cellular automata, Arnold cat map, game of life, gray difference degree, correlation coefficient

Procedia PDF Downloads 355
31951 Survey Based Data Security Evaluation in Pakistan Financial Institutions against Malicious Attacks

Authors: Naveed Ghani, Samreen Javed

Abstract:

In today’s heterogeneous network environment, there is a growing demand for distrusting clients to jointly execute a secure network to prevent malicious attacks, as the defining task of propagating malicious code is to locate new targets to attack. Residual risk is always there, no matter what solutions are implemented or whatsoever security methodology or standards are adopted. Security is the first and most crucial phase in the field of computer science, and the main aim of computer security is the gathering of information over a secure network. No one need wonder what all that malware is trying to do: it is trying to steal money through data theft, bank transfers, stolen passwords, or swiped identities. With the help of our survey, we learn about the importance of whitelisting, antimalware programs, security patches, log files, honeypots, and more, as used in banks for financial data protection. However, there is also a need to implement IPv6 tunneling with cryptographic data transformation, according to the requirements of new technology, to protect the organization from new malware attacks that craft their own messages and send them to the target. In this paper, the writer presents the idea of implementing IPv6 tunneling sessions for private data transmission from financial organizations whose secrecy needs to be safeguarded.

Keywords: network worms, malware infection propagating malicious code, virus, security, VPN

Procedia PDF Downloads 341
31950 Subclasses of Bi-Univalent Functions Associated with Hohlov Operator

Authors: Rashidah Omar, Suzeini Abdul Halim, Aini Janteng

Abstract:

The coefficient estimate problem for the Taylor-Maclaurin series is still an open problem, especially for functions in the subclasses of bi-univalent functions. The symbol A denotes the class of all analytic functions f in the open unit disk D, normalized by the conditions f(0) = f′(0) − 1 = 0. A function f ϵ A is said to be bi-univalent in D if both f and f⁻¹ are univalent in D; such functions form the class of bi-univalent functions. The subordination concept is used in determining the second and third Taylor-Maclaurin coefficients. The upper bound for the second and third coefficients is estimated for functions in the subclasses of bi-univalent functions which are subordinated to the function φ. An analytic function f is subordinate to an analytic function g if there is an analytic function w defined on D with w(0) = 0 and |w(z)| < 1 satisfying f(z) = g[w(z)]. In this paper, two subclasses of bi-univalent functions associated with the Hohlov operator are introduced. The bound for the second and third coefficients of functions in these subclasses is determined using subordination. The findings generalize the previous related works of several earlier authors.

Keywords: analytic functions, bi-univalent functions, Hohlov operator, subordination

Procedia PDF Downloads 274
31949 Interactive IoT-Blockchain System for Big Data Processing

Authors: Abdallah Al-ZoubI, Mamoun Dmour

Abstract:

The spectrum of IoT devices is becoming widely diversified, entering almost all possible fields and finding applications in industry, health, finance, logistics, and education, to name a few. IoT active endpoint sensors and devices exceeded the 12 billion mark in 2021 and are expected to reach 27 billion in 2025, with over $34 billion in total market value. This sheer rise in the numbers and use of IoT devices brings with it considerable concerns regarding data storage, analysis, manipulation and protection. IoT blockchain-based systems have recently been proposed as a decentralized solution for large-scale data storage and protection. COVID-19 has actually accelerated the desire to utilize IoT devices, as it impacted both demand and supply and significantly affected several regions due to logistic reasons such as supply chain interruptions, shortage of shipping containers and port congestion. An IoT-blockchain system is proposed to handle big data generated by a distributed network of sensors and controllers in an interactive manner. The system is designed using the Ethereum platform, which utilizes smart contracts, programmed in Solidity, to execute and manage data generated by IoT sensors and devices such as the Raspberry Pi 4 running Raspbian, with add-on hardware security modules. The proposed system runs a number of applications hosted by a local machine used to validate transactions. It then sends data to the rest of the network through the InterPlanetary File System (IPFS) and Ethereum Swarm, forming a closed IoT ecosystem run by blockchain where a number of distributed IoT devices can communicate and interact, thus forming a closed, controlled environment. A prototype has been deployed with three IoT handling units distributed over a wide geographical space in order to examine its feasibility, performance and costs. Initial results indicated that big IoT data retrieval and storage is feasible and interactivity is possible, provided that certain conditions of cost, speed and throughput are met.
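
As a hedged sketch of how a Python-based IoT gateway might talk to such a smart contract, the example below uses web3.py against a local node; the node URL, contract address, ABI, and the storeReading/getReading function names are hypothetical placeholders and not the authors' actual contract.

```python
# Hypothetical gateway-side sketch: push a sensor reading to an Ethereum smart
# contract and read it back with web3.py. The node URL, contract address, ABI,
# and function names are placeholders, not the authors' deployed contract.
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("http://127.0.0.1:8545"))   # local validating node
account = w3.eth.accounts[0]

contract = w3.eth.contract(
    address="0x0000000000000000000000000000000000000000",  # placeholder address
    abi=[  # minimal ABI for a hypothetical sensor-logging contract
        {"name": "storeReading", "type": "function", "stateMutability": "nonpayable",
         "inputs": [{"name": "sensorId", "type": "uint256"},
                    {"name": "value", "type": "int256"}], "outputs": []},
        {"name": "getReading", "type": "function", "stateMutability": "view",
         "inputs": [{"name": "sensorId", "type": "uint256"}],
         "outputs": [{"name": "", "type": "int256"}]},
    ],
)

# Write a new temperature reading (21.50 C scaled to an int) from sensor 42.
tx_hash = contract.functions.storeReading(42, 2150).transact({"from": account})
w3.eth.wait_for_transaction_receipt(tx_hash)
print("stored value:", contract.functions.getReading(42).call())
```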

Keywords: IoT devices, blockchain, Ethereum, big data

Procedia PDF Downloads 128
31948 Statistical Analysis of Interferon-γ for the Effectiveness of an Anti-Tuberculous Treatment

Authors: Shishen Xie, Yingda L. Xie

Abstract:

Tuberculosis (TB) is a potentially serious infectious disease that remains a health concern. The Interferon Gamma Release Assay (IGRA) is a blood test to find out whether an individual is tuberculosis positive or negative. This study applies statistical analysis to the clinical data on interferon-gamma levels of seventy-three subjects diagnosed with pulmonary TB and undergoing anti-tuberculous treatment. Data analysis is performed to determine whether there is a significant decline in interferon-gamma levels for the subjects over a period of six months, and to infer whether the anti-tuberculous treatment is effective.
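
A minimal Python sketch of such a paired before/after comparison is given below, using both a paired t-test and the Wilcoxon signed-rank test; the simulated interferon-gamma values and the assumed decline are stand-ins for the 73 subjects' clinical measurements.

```python
# Minimal sketch of the analysis: paired comparison of interferon-gamma levels
# at the start of treatment and after six months. The values are synthetic
# stand-ins for the 73 subjects' clinical data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
baseline = rng.lognormal(mean=0.5, sigma=0.6, size=73)       # IU/mL at month 0
month6 = baseline * rng.uniform(0.4, 0.9, size=73)           # assumed decline

t_stat, p_paired = stats.ttest_rel(baseline, month6)
w_stat, p_wilcoxon = stats.wilcoxon(baseline, month6)        # non-parametric check

print(f"paired t-test: t = {t_stat:.2f}, p = {p_paired:.2e}")
print(f"Wilcoxon signed-rank: p = {p_wilcoxon:.2e}")
print("significant decline at alpha = 0.05:", p_paired < 0.05)
```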

Keywords: data analysis, interferon gamma release assay, statistical methods, tuberculosis infection

Procedia PDF Downloads 289
31947 Short Text Classification Using Part of Speech Feature to Analyze Students' Feedback of Assessment Components

Authors: Zainab Mutlaq Ibrahim, Mohamed Bader-El-Den, Mihaela Cocea

Abstract:

Students' textual feedback can hold unique patterns and useful information about the learning process; it can hold information about the advantages and disadvantages of teaching methods, assessment components, facilities, and other aspects of teaching. The results of analysing such feedback can form a key point for institutions' decision makers to advance and update their systems accordingly. This paper proposes a data mining framework for analysing end-of-unit general textual feedback using the part-of-speech (PoS) feature with four machine learning algorithms: support vector machines, decision tree, random forest, and naive Bayes. The proposed framework has two tasks: first, to use the above algorithms to build an optimal model that automatically classifies the whole data set into two subsets, one subset tailored to assessment practices (assessment-related) and the other the non-assessment-related data; second, to use the same algorithms to build an optimal model for the whole data set and the new data subsets to automatically detect their sentiment. The significance of this paper is to compare the performance of the above four algorithms using the part-of-speech feature with the performance of the same algorithms using n-gram features. The paper follows the Knowledge Discovery and Data Mining (KDDM) framework to construct the classification and sentiment analysis models, which consists of understanding the assessment domain, cleaning and pre-processing the data set, selecting and running the data mining algorithm, interpreting the mined patterns, and consolidating the discovered knowledge. The experiments in this paper show that models using either feature performed very well on the first task. For the second task, however, models that used the part-of-speech feature underperformed in comparison with models that used unigrams and bigrams.
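
The Python sketch below illustrates the PoS-feature end of that comparison: each comment is replaced by its sequence of part-of-speech tags, vectorised, and passed to the four classifiers. The four labelled comments are invented examples, and the NLTK resource names may vary slightly between versions.

```python
# Sketch of the PoS-feature pipeline: each feedback comment is replaced by its
# sequence of part-of-speech tags, vectorised, and fed to the four classifiers.
# The tiny labelled sample is illustrative only (1 = assessment-related).
import nltk
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.svm import LinearSVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import MultinomialNB

for res in ("punkt", "punkt_tab", "averaged_perceptron_tagger",
            "averaged_perceptron_tagger_eng"):
    nltk.download(res, quiet=True)        # tokenizer/tagger data (names vary by version)

feedback = [
    "The exam questions were too long for the time given",
    "Coursework deadlines clashed with other modules",
    "The lecture rooms were cold and crowded",
    "Great lab facilities and helpful technicians",
]
labels = [1, 1, 0, 0]

def pos_features(text: str) -> str:
    """Map a comment to a space-separated string of its PoS tags."""
    tags = nltk.pos_tag(nltk.word_tokenize(text))
    return " ".join(tag for _, tag in tags)

X = CountVectorizer().fit_transform(pos_features(t) for t in feedback)

for clf in (LinearSVC(), DecisionTreeClassifier(), RandomForestClassifier(),
            MultinomialNB()):
    clf.fit(X, labels)
    print(type(clf).__name__, clf.predict(X))    # training-set sanity check only
```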

Keywords: assessment, part of speech, sentiment analysis, student feedback

Procedia PDF Downloads 122
31946 Modeling Palm Oil Quality During the Ripening Process of Fresh Fruits

Authors: Afshin Keshvadi, Johari Endan, Haniff Harun, Desa Ahmad, Farah Saleena

Abstract:

Experiments were conducted to develop a model for analyzing the ripening process of oil palm fresh fruits in relation to the oil yield and oil quality of the palm oil produced. This research was carried out on 8-year-old Tenera (Dura × Pisifera) palms planted in 2003 at the Malaysian Palm Oil Board Research Station. Fresh fruit bunches were harvested from designated palms from January till May of 2010. The bunches were divided into three regions (top, middle and bottom), and fruits from the outer and inner layers were randomly sampled for analysis at 8, 12, 16 and 20 weeks after anthesis to establish relationships between maturity and oil development in the mesocarp and kernel. Computations on data related to ripening time, oil content and oil quality were performed using several computer software programs (MSTAT-C, SAS and Microsoft Excel). Nine nonlinear mathematical models were fitted to the collected data using MATLAB software. The results showed that the mean mesocarp oil percentage increased from 1.24% at 8 weeks after anthesis to 29.6% at 20 weeks after anthesis. Fruits from the top part of the bunch had the highest mesocarp oil content of 10.09%. The lowest kernel oil percentage of 0.03% was recorded at 12 weeks after anthesis. Palmitic acid and oleic acid comprised more than 73% of the total mesocarp fatty acids at 8 weeks after anthesis and increased to more than 80% at fruit maturity at 20 weeks. The logistic model, with the highest R² and the lowest root mean square error, was found to be the best-fit model.
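
A small Python sketch of fitting such a logistic model with scipy is shown below; it uses the two mean oil percentages quoted above (1.24% at 8 weeks, 29.6% at 20 weeks) plus assumed intermediate points, so the fitted parameters are only indicative of the approach, not the paper's results.

```python
# Minimal sketch of fitting a logistic model to mesocarp oil accumulation.
# Only the 8- and 20-week means come from the abstract; the 12- and 16-week
# values are assumed, so the fitted parameters are indicative only.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, a, k, t0):
    """Logistic growth: a / (1 + exp(-k*(t - t0)))."""
    return a / (1.0 + np.exp(-k * (t - t0)))

weeks = np.array([8.0, 12.0, 16.0, 20.0])          # weeks after anthesis
oil_pct = np.array([1.24, 8.0, 22.0, 29.6])        # % mesocarp oil (12/16 wk assumed)

(a, k, t0), _ = curve_fit(logistic, weeks, oil_pct, p0=[30.0, 0.5, 14.0])
print(f"asymptote = {a:.1f} %, rate = {k:.2f} /week, midpoint = {t0:.1f} weeks")
print("predicted oil at 18 weeks:", round(logistic(18.0, a, k, t0), 1), "%")
```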

Keywords: oil palm, oil yield, ripening process, anthesis, fatty acids, modeling

Procedia PDF Downloads 293
31945 Vertical Distribution of the Monthly Average Values of the Air Temperature above the Territory of Kakheti in 2012-2017

Authors: Khatia Tavidashvili, Nino Jamrishvili, Valerian Omsarashvili

Abstract:

Studies of the vertical distribution of the air temperature in the atmosphere have great value for the solution of different problems of meteorology and climatology (meteorological forecasting of showers, thunderstorms and hail, weather modification, estimation of climate change, etc.). At the end of May 2015, after a 25-year interruption, the work of the anti-hail service in Kakheti was restored. Therefore, in connection with climate change, the need arose for a detailed study of the contemporary regime of the vertical distribution of the air temperature above this territory. In particular, this information is necessary for the optimal selection of rocket means for weather modification work (fighting hail, regulation of atmospheric precipitation, etc.). The construction of detailed maps of the potential damage distribution of agricultural crops from hail, taking into account the dimensions of hailstones in the clouds according to radar measurements and the height of the locality, is among the most important applications. At present, there is no aerological sounding of the atmosphere in Georgia. To solve the given problem, we processed information about air temperature profiles above Telavi up to 27 km above the earth's surface. Information was gathered during four observation times (4, 10, 16 and 22 hours local time). From this research, we found the vertical distribution of the average monthly values of the air temperature above Kakheti in 2012-2017 from January to December, covering heights from 0.543 to 27 km above sea level during the four observation periods. In particular, it was obtained that during January the monthly average air temperature diminishes linearly from 2.6 °C at the earth's surface to -57.1 °C at a height of 10 km, then changes little up to a height of 26 km; the gradient of the air temperature in the atmospheric layer from 0.543 to 8 km is -6.3 °C/km, and the height of the zero isotherm is 1.33 km. During July, the air temperature diminishes linearly from 23.5 °C to -64.7 °C at a height of 17 km, then grows to -47.5 °C at a height of 27 km; the gradient of the air temperature is -6.1 °C/km, and the height of the zero isotherm is 4.39 km, which is 0.16 km higher than in the sixties of the past century.
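
The Python sketch below shows how the two derived quantities quoted above (the mean layer gradient and the zero-isotherm height) can be computed from a monthly mean temperature profile; the profile values are a rough January-like illustration, not the processed Telavi data.

```python
# Small sketch of two derived quantities: the mean temperature gradient of a
# layer and the height of the zero isotherm, computed from a monthly mean
# temperature profile. The profile is a rough January-like illustration.
import numpy as np

height_km = np.array([0.543, 2.0, 4.0, 6.0, 8.0, 10.0])       # above sea level
temp_c = np.array([2.6, -6.5, -19.0, -31.5, -44.0, -57.1])    # monthly mean, deg C

# Mean gradient of the 0.543-8 km layer via a linear fit (deg C per km).
layer = height_km <= 8.0
gradient = np.polyfit(height_km[layer], temp_c[layer], 1)[0]
print(f"mean temperature gradient: {gradient:.1f} deg C/km")

# Zero-isotherm height by linear interpolation between the bracketing levels.
i = np.where(temp_c < 0)[0][0]                 # first level below freezing
h0 = np.interp(0.0, [temp_c[i], temp_c[i - 1]], [height_km[i], height_km[i - 1]])
print(f"zero isotherm height: {h0:.2f} km")
```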

Keywords: hail, Kakheti, meteorology, vertical distribution of the air temperature

Procedia PDF Downloads 152
31944 Effect of Temperature and Time on the Yield of Silica from Rice Husk Ash

Authors: Mohammed Adamu Musa, Shehu Saminu Babba

Abstract:

The technological trend towards waste utilization and cost reduction in industrial processing has attracted the use of rice husk as a value-added material. Both rice husk (RH) and rice husk ash (RHA) have been found suitable for a wide range of domestic as well as industrial applications. Therefore, the purpose of this research is to produce high-grade sodium silicate from rice husk ash by considering the effect of temperature and time of heating as the process variables. The experiment was performed by heating the rice husk at temperatures of 500 °C, 600 °C, 700 °C and 800 °C for times of 60 min, 90 min, 120 min and 150 min to obtain the ash. A 1.0 M aqueous sodium hydroxide solution was used to dissolve the silicate from the ash, which contained crude sodium silicate. In addition, the ash was neutralized by adding 5 M HCl until the pH reached 3.5 to give silica gel. At 600 °C and 120 min, 94.23% silica was obtained from the RHA. At higher temperatures (700 °C and 800 °C), the percentage yield of silica was reduced due to surface melting and carbon fixation in the lattice caused by the presence of potassium. For this research, 600 °C is considered to be the optimum temperature for silica production from RHA. Silica produced from RHA can generate added value and can be used in areas such as the pulp and paper, plastic and rubber reinforcement industries.

Keywords: burning, rice husk, rice husk ash, silica, silica gel, temperature

Procedia PDF Downloads 215