Search results for: state space model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 24780

16020 Comprehensive Studio Tables: Improving Performance and Quality of Student's Work in Architecture Studio

Authors: Maryam Kalkatechi

Abstract:

Architecture students spend most of their working time in studios during their years of study. The studio table matters as a piece of studio furniture because it elevates the quality of projects and positively influences student productivity. This paper first describes the aspects considered in designing a comprehensive studio table and then details each aspect. Comprehensive studio tables are meant to transform the studio into an efficient yet immersive place of learning, collaboration, and participation. One aspect of these tables is a surface that becomes a setting for design conversations; another is an efficient interactive platform for tools. The discussion covers the workspace setting of the comprehensive studio, the arrangement of the comprehensive studio tables, collaboration in the studio, the studio display and lighting shaped by the tables, and the lighting of the studio.

Keywords: studio tables, student performance, productivity, hologram, 3D printer

Procedia PDF Downloads 175
16019 Hierarchical Clustering Algorithms in Data Mining

Authors: Z. Abdullah, A. R. Hamdan

Abstract:

Clustering is the process of grouping data objects into clusters so that objects within the same cluster are highly similar to one another. Clustering is one of the main areas of data mining, and its algorithms can be classified as partitioning, hierarchical, density-based, and grid-based. In this paper, we survey and review four major hierarchical clustering algorithms: CURE, ROCK, CHAMELEON, and BIRCH. The resulting state of the art of these algorithms will help in addressing current problems, as well as in deriving more robust and scalable clustering algorithms.
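
As a point of reference for the hierarchical family surveyed above, the sketch below runs a minimal agglomerative (bottom-up) clustering in Python with SciPy; it is a generic illustration, not an implementation of CURE, ROCK, CHAMELEON, or BIRCH.

```python
# Minimal agglomerative hierarchical clustering sketch (illustrative only).
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(0)
# Two synthetic blobs standing in for a real dataset.
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(6, 1, (50, 2))])

# Build the dendrogram bottom-up with Ward linkage,
# then cut it into a fixed number of flat clusters.
Z = linkage(X, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")   # labels are 1-based
print(np.bincount(labels)[1:])                    # sizes of the recovered clusters
```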

Keywords: clustering, unsupervised learning, algorithms, hierarchical

Procedia PDF Downloads 868
16018 Electron Beam Effects on Kinetic Alfven Waves in the Cold Homogenous Plasma

Authors: Jaya Shrivastava

Abstract:

The particle aspect approach is adopted to investigate the trajectories of charged particles in the electromagnetic field of a kinetic Alfven wave. Expressions are found for the dispersion relation, the growth/damping rate, and the associated currents in the presence of an electron beam in a homogeneous plasma. Kinetic effects of electrons and ions are included in the study of the kinetic Alfven wave because both are important in the transition region. Plasma parameters appropriate to the plasma sheet boundary layer are used. It is found that a downward electron beam affects the dispersion relation, the growth/damping rate, and the associated currents in the cold-electron limit.

Keywords: magnetospheric physics, plasma waves and instabilities, electron beam, space plasma physics, wave-particle interactions

Procedia PDF Downloads 379
16017 Development of Adaptive Architecture Classrooms through the Application of Augmented Reality in Private Universities of Malaysia

Authors: Sara Namdarian, Hafez Salleh

Abstract:

This paper scrutinizes the application of Augmented Reality (AR) technology to enhance the adaptability of architecture classrooms in private Malaysian universities. Through an exploratory mixed-methods strategy, the study contrasts the constraints of mono-functional classrooms with the potential of multi-functional classrooms enabled by AR. The paper aims to contribute to the identification of suitable AR techniques that can be applied in the development of Adaptive-AR-Classroom-Systems (AARCS) for architecture classrooms. The findings show that current classrooms offer limited functional space, and the study concludes that AR can be used in design classrooms to provide the variety of visuals and virtual objects required for conducting architecture projects in higher-education institutions.

Keywords: design activity, space enhancement, design education, architectural design, augmented reality

Procedia PDF Downloads 428
16016 Public Functions of Kazakh Modern Literature

Authors: Erkingul Soltanaeva, Omyrkhan Abdimanuly, Alua Temirbolat

Abstract:

In this article, the public and social functions of literature and art in the Republic of Kazakhstan are analyzed on the basis of formal and informal literary organizations. The external and internal, subjective and objective factors that influence the modern literary process are determined. The literary forces, their consolidation, and the forms of organization in the art of the word are examined. The stages of the literary process (planning, organization, promotion, and evaluation) and their leading forces and approaches are analyzed. A sound attitude toward the language and mentality of society shapes the literary process. The Ministry of Culture, the Writers' Union of the Republic of Kazakhstan, and various non-governmental organizations hold events throughout Kazakhstan to promote the literary process and to celebrate literary figures. Under the cultural plans of regional administrations, large programmes have been undertaken to publish literary encyclopedias and to celebrate and distribute the books of local poets and writers across the country. These official measures increase readers' interest in books, contribute to patriotic education, and improve the status of the native language. Materials published between 2013 and 2015 in the professional literary publications (the newspaper 'Kazakh Literature', the magazine 'Zhuldyz', and the journal 'Zhalyn') were analyzed statistically to identify the topical issues and thematic fields of Kazakh literature and to define their connection with public affairs. Creative freedom, relations between society and the individual, the state of literature, and its strengths and weaknesses were considered in the same articles. The level of these functions was determined through the public role of literature, its social features, and personal peculiarities. The stages of literature management (planning, organization, motivation, and evaluation) are now forming and developing in Kazakhstan, but literature management still needs further development to meet the actual requirements of today's agenda.

Keywords: literature management, material, literary process, social functions

Procedia PDF Downloads 370
16015 EQMamba - Method Suggestion for Earthquake Detection and Phase Picking

Authors: Noga Bregman

Abstract:

Accurate and efficient earthquake detection and phase picking are crucial for seismic hazard assessment and emergency response. This study introduces EQMamba, a deep-learning method that combines the strengths of the Earthquake Transformer and the Mamba model for simultaneous earthquake detection and phase picking. EQMamba leverages the computational efficiency of Mamba layers to process longer seismic sequences while maintaining a manageable model size. The proposed architecture integrates convolutional neural networks (CNNs), bidirectional long short-term memory (BiLSTM) networks, and Mamba blocks. The model employs an encoder composed of convolutional layers and max pooling operations, followed by residual CNN blocks for feature extraction. Mamba blocks are applied to the outputs of BiLSTM blocks, efficiently capturing long-range dependencies in seismic data. Separate decoders are used for earthquake detection, P-wave picking, and S-wave picking. We trained and evaluated EQMamba using a subset of the STEAD dataset, a comprehensive collection of labeled seismic waveforms. The model was trained using a weighted combination of binary cross-entropy loss functions for each task, with the Adam optimizer and a scheduled learning rate. Data augmentation techniques were employed to enhance the model's robustness. Performance comparisons were conducted between EQMamba and the EQTransformer over 20 epochs on this modest-sized STEAD subset. Results demonstrate that EQMamba achieves superior performance, with higher F1 scores and faster convergence compared to EQTransformer. EQMamba reached F1 scores of 0.8 by epoch 5 and maintained higher scores throughout training. The model also exhibited more stable validation performance, indicating good generalization capabilities. While both models showed lower accuracy in phase-picking tasks compared to detection, EQMamba's overall performance suggests significant potential for improving seismic data analysis. The rapid convergence and superior F1 scores of EQMamba, even on a modest-sized dataset, indicate promising scalability for larger datasets. This study contributes to the field of earthquake engineering by presenting a computationally efficient and accurate method for simultaneous earthquake detection and phase picking. Future work will focus on incorporating Mamba layers into the P and S pickers and further optimizing the architecture for seismic data specifics. The EQMamba method holds the potential for enhancing real-time earthquake monitoring systems and improving our understanding of seismic events.
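
To make the architecture description above easier to follow, a heavily simplified skeleton is sketched below in Python (PyTorch). The layer sizes, the small GRU standing in for the Mamba block, and all class names are illustrative assumptions, not the authors' code; a faithful implementation would use a selective state space (Mamba) layer and the weighted binary cross-entropy training described in the abstract.

```python
# Simplified multi-task detector/picker skeleton (illustrative only).
import torch
import torch.nn as nn

class MambaBlockPlaceholder(nn.Module):
    """Stand-in for a Mamba (selective state space) block: a small GRU."""
    def __init__(self, dim):
        super().__init__()
        self.rnn = nn.GRU(dim, dim, batch_first=True)
    def forward(self, x):                      # x: (batch, time, dim)
        out, _ = self.rnn(x)
        return out

class EQModelSketch(nn.Module):
    def __init__(self, channels=3, dim=16):
        super().__init__()
        # Encoder: convolutions + max pooling for local feature extraction.
        self.encoder = nn.Sequential(
            nn.Conv1d(channels, dim, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(dim, dim, kernel_size=5, padding=2), nn.ReLU(),
        )
        self.bilstm = nn.LSTM(dim, dim, batch_first=True, bidirectional=True)
        self.mamba = MambaBlockPlaceholder(2 * dim)
        # Separate heads for detection, P-wave picking and S-wave picking.
        self.heads = nn.ModuleDict(
            {k: nn.Conv1d(2 * dim, 1, kernel_size=1) for k in ("det", "p", "s")}
        )
    def forward(self, x):                      # x: (batch, channels, time)
        h = self.encoder(x)                    # (batch, dim, time/2)
        h, _ = self.bilstm(h.transpose(1, 2))  # (batch, time/2, 2*dim)
        h = self.mamba(h).transpose(1, 2)      # back to (batch, 2*dim, time/2)
        return {k: torch.sigmoid(head(h)) for k, head in self.heads.items()}

model = EQModelSketch()
out = model(torch.randn(2, 3, 6000))           # dummy 3-component waveforms
print({k: v.shape for k, v in out.items()})
```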

Keywords: earthquake, detection, phase picking, s waves, p waves, transformer, deep learning, seismic waves

Procedia PDF Downloads 16
16014 Heuristic Algorithms for Time Based Weapon-Target Assignment Problem

Authors: Hyun Seop Uhm, Yong Ho Choi, Ji Eun Kim, Young Hoon Lee

Abstract:

Weapon-target assignment (WTA) is the problem of assigning available launchers to appropriate targets in order to defend assets. Various WTA algorithms have been developed over the past years for both static and dynamic environments (SWTA and DWTA, respectively). Because the problem must be solved within operationally relevant computation time, WTA has suffered from poor solution efficiency, and SWTA and DWTA have therefore only been solved for limited battlefield situations. In this paper, the general continuous-time situation is considered as the Time-based Weapon-Target Assignment (TWTA) problem. TWTA is studied using a mixed integer programming model, and three heuristic algorithms are suggested: decomposed opt-opt, decomposed opt-greedy, and greedy. Although the TWTA optimization model becomes inefficient at large problem sizes, the decomposed opt-opt algorithm, based on linearization and decomposition, extracts efficient solutions in a reasonable computation time. Because the scheduling part takes too long to solve with the optimization model, several greedy-based algorithms are also proposed; they yield lower performance values than the decomposed opt-opt algorithm but require very little computation time. Hence, this paper proposes an improved method by applying decomposition to TWTA, so that more practical and effective methods can be developed for using TWTA on the battlefield.
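
The greedy heuristic family referred to above can be illustrated with a generic static weapon-target assignment sketch in Python: each weapon is assigned, in turn, to the target whose expected surviving value it reduces the most. This is a textbook-style illustration under assumed kill probabilities, not the paper's decomposed opt-opt or opt-greedy algorithms.

```python
# Generic greedy weapon-target assignment sketch (illustrative only).
def greedy_wta(values, kill_prob):
    """values[t]: target value; kill_prob[w][t]: P(weapon w destroys target t)."""
    surv = [1.0] * len(values)        # current survival probability of each target
    assignment = {}
    for w in range(len(kill_prob)):   # one shot per weapon
        best_t, best_gain = None, 0.0
        for t in range(len(values)):
            gain = values[t] * surv[t] * kill_prob[w][t]   # expected value destroyed
            if gain > best_gain:
                best_t, best_gain = t, gain
        if best_t is not None:
            assignment[w] = best_t
            surv[best_t] *= 1.0 - kill_prob[w][best_t]
    return assignment

# Two weapons, three targets with assumed values and kill probabilities.
print(greedy_wta([5.0, 8.0, 3.0],
                 [[0.6, 0.4, 0.2],
                  [0.3, 0.7, 0.5]]))
```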

Keywords: air and missile defense, weapon target assignment, mixed integer programming, piecewise linearization, decomposition algorithm, military operations research

Procedia PDF Downloads 324
16013 Heavy Metals and Carcinogenic Risk Assessment in Free-Ranged Livestock of Lead-Contaminated Goldmine Communities of Zamfara State, Northern Nigeria

Authors: Sulaiman Rabiu, Muazu Gusau Abubakar, Jafar Usman Zakari

Abstract:

The consumption of meat is of great importance as it provides a good source of protein and a significant amount of essential trace elements to the body. However, contamination of meat and meat products with heavy metals is becoming a serious threat to food safety and public health. Therefore, the present study aimed to evaluate the concentration of some heavy metals in the muscles and entrails of free-ranged cattle, sheep, and goats. A total of sixty (60) fresh samples of muscle, liver, kidney, small intestine, and stomach of free-ranged cattle, sheep, and goats were collected from abattoirs in different goldmine communities of the Anka, Bukkuyum, Maru, and Talata-Mafara Local Government Areas of Zamfara State, Nigeria. The samples were digested using 10 mL of a mixture of 70% high-grade concentrated HNO₃ and 65% HCl (4:1 v/v); the mixture was heated until dense fumes disappeared, forming a clear transparent solution, and diluted to 50 mL with deionized water. Actual concentrations of Cd, Cr, Cu, Co, As, Ni, Mn, Pb, and Zn were determined using a Microwave Plasma Atomic Emission Spectrometer (MP-AES). From the results obtained, goat liver had the highest mean concentrations of lead, arsenic, cobalt, and manganese (12.43 ± 0.31, 14.25 ± 0.32, 3.47 ± 0.86, and 12.68 ± 0.92 mg/kg, respectively), while goat kidney had the highest concentrations of copper and zinc (10.08 ± 0.61 and 24.16 ± 1.30 mg/kg, respectively). The highest concentrations of cadmium and nickel were recorded in sheep kidney (7.75 ± 0.65 and 2.08 ± 0.10 mg/kg, respectively). Cattle muscle had a higher chromium concentration than any of the organs analysed. The target hazard quotients (THQs) for all the metals were below 1.0, but the target carcinogenic risk (TR) indices give an alarming result that requires stringent control to protect public health. Therefore, intensive public health awareness of the risks associated with heavy metal contamination of meat should be advocated.
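
The abstract reports THQ and TR values without restating their definitions; for orientation, the commonly used USEPA-style formulas are sketched below. These are standard textbook expressions, and the exposure parameters are generic assumptions rather than values taken from this study.

```latex
\[
\mathrm{EDI} = \frac{C \times \mathrm{IR} \times \mathrm{EF} \times \mathrm{ED}}{\mathrm{BW} \times \mathrm{AT}},
\qquad
\mathrm{THQ} = \frac{\mathrm{EDI}}{\mathrm{RfD}},
\qquad
\mathrm{TR} = \mathrm{EDI} \times \mathrm{CSF},
\]
where $C$ is the metal concentration in meat (mg/kg), IR the meat ingestion rate, EF and ED the exposure frequency and duration, BW the body weight, AT the averaging time, RfD the oral reference dose, and CSF the cancer slope factor; THQ $< 1$ is usually read as no appreciable non-carcinogenic risk.
```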

Keywords: contamination, goldmine, heavy metals, meat

Procedia PDF Downloads 77
16012 Experimental Determination of Aluminum 7075-T6 Parameters Using Stabilized Cycle Tests to Predict Thermal Ratcheting

Authors: Armin Rahmatfam, Mohammad Zehsaz, Farid Vakili Tahami, Nasser Ghassembaglou

Abstract:

In this paper, the kinematic hardening parameters C and γ, the isotropic hardening parameters, and the combined isotropic/kinematic hardening parameters k, b, and Q have been obtained experimentally from monotonic and strain-controlled cyclic tests at room and elevated temperatures of 20°C, 100°C, and 400°C. These parameters are used in a nonlinear combined isotropic/kinematic hardening model to give a better description of the loading and reloading cycles in cyclic indentation as well as of thermal ratcheting. For this purpose, three groups of specimens made of Aluminum 7075-T6 were investigated. After each test, material parameters were obtained from the stable hysteresis cycles for use in combined nonlinear isotropic/kinematic hardening models. The methodology for obtaining the correct kinematic/isotropic hardening parameters is also presented.
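
For reference, one widespread form of the combined hardening model in which parameters of this kind appear is sketched below (a standard Chaboche/Voce formulation); whether the authors use exactly this form is an assumption.

```latex
\[
f = \sigma_{\mathrm{eq}}(\boldsymbol{\sigma} - \boldsymbol{\alpha}) - R(\bar{\varepsilon}^{pl}) - k = 0,
\qquad
\dot{\boldsymbol{\alpha}} = \tfrac{2}{3}\,C\,\dot{\boldsymbol{\varepsilon}}^{pl} - \gamma\,\boldsymbol{\alpha}\,\dot{\bar{\varepsilon}}^{pl},
\qquad
R = Q\left(1 - e^{-b\,\bar{\varepsilon}^{pl}}\right),
\]
where $k$ is the initial yield stress, $C$ and $\gamma$ govern the nonlinear kinematic (back-stress) evolution, and $Q$ and $b$ set the saturation value and rate of isotropic hardening.
```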

Keywords: combined hardening model, kinematic hardening, isotropic hardening, cyclic tests

Procedia PDF Downloads 460
16011 Nonparametric Estimation of Risk-Neutral Densities via Empirical Esscher Transform

Authors: Manoel Pereira, Alvaro Veiga, Camila Epprecht, Renato Costa

Abstract:

This paper introduces an empirical version of the Esscher transform for risk-neutral option pricing. Traditional parametric methods require the formulation of an explicit risk-neutral model and are operational only for a few probability distributions of the returns of the underlying. In our proposal, we make only mild assumptions on the pricing kernel, and there is no need to formulate a risk-neutral model for the returns. First, we simulate sample paths for the returns under the physical distribution. Then, based on the empirical Esscher transform, the sample is reweighted, giving rise to a risk-neutralized sample from which derivative prices can be obtained as a weighted sum of the option payoffs along each path. We compare our proposal with some traditional parametric pricing methods in four experiments with artificial and real data.
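
A minimal numerical sketch of the idea is given below in Python: simulate physical-measure log-returns, choose the Esscher tilt so that the reweighted sample prices the underlying at the risk-free rate, and price a call as a weighted payoff average. The return distribution, parameter values, and root-finding bracket are assumptions for illustration, not the authors' specification.

```python
# Empirical Esscher transform sketch: tilt simulated P-measure log-returns
# with exp(theta * x) so the reweighted sample satisfies the pricing condition,
# then price a call as a weighted payoff average.
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(1)
S0, K, r, T = 100.0, 105.0, 0.02, 1.0
x = rng.normal(0.05, 0.2, 100_000)            # physical log-returns over [0, T]

def martingale_gap(theta):
    w = np.exp(theta * x); w /= w.sum()        # Esscher weights
    return w @ np.exp(x) - np.exp(r * T)       # E_theta[S_T / S_0] - e^{rT}

theta = brentq(martingale_gap, -50.0, 50.0)    # solve for the tilt parameter
w = np.exp(theta * x); w /= w.sum()
call = np.exp(-r * T) * (w @ np.maximum(S0 * np.exp(x) - K, 0.0))
print(f"theta = {theta:.3f}, call price = {call:.2f}")
```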

Keywords: Esscher transform, generalized autoregressive conditional heteroscedasticity (GARCH), nonparametric option pricing

Procedia PDF Downloads 476
16010 Stock Prediction and Portfolio Optimization Thesis

Authors: Deniz Peksen

Abstract:

This thesis aims to predict the trend movement of stock closing prices and to maximize portfolio return by utilizing the predictions. In this context, the study defines a stock portfolio strategy from models created using logistic regression, gradient boosting, and random forest. Recently, predicting the trend of stock prices has gained a significant role in making buy and sell decisions and in generating returns with investment strategies built on machine-learning-based decisions. There are plenty of studies in the literature on the prediction of stock prices in capital markets using machine learning methods, but most of them focus on closing prices instead of the direction of the price trend. Our study differs from the literature in terms of target definition: ours is a classification problem focusing on the market trend over the next 20 trading days. To predict the trend direction, fourteen years of data were used for training, the following three years for validation, and the last three years for testing. Training data cover 2002-06-18 to 2016-12-30, validation data 2017-01-02 to 2019-12-31, and testing data 2020-01-02 to 2022-03-17. We define the hold-stock portfolio, the best-stock portfolio, and the USD-TRY exchange rate as benchmarks to outperform, and we compared our machine-learning-based portfolio return on test data with the returns of these benchmarks. We assessed model performance with the help of ROC-AUC scores and lift charts, and used logistic regression, gradient boosting, and random forest with a grid search approach to fine-tune hyperparameters. As a result of the empirical study, the existence of uptrends and downtrends in five stocks could not be predicted by the models. When these predictions are used to define buy and sell decisions in order to generate a model-based portfolio, the model-based portfolio fails on the test dataset: model-based buy and sell decisions generated a stock portfolio strategy whose returns cannot outperform non-model portfolio strategies on the test data. We found that any effort to predict a trend formulated on the stock price is a challenge, and our results agree with the random walk theory, which claims that stock prices and price changes are unpredictable. Our model iterations failed on the test dataset; although we built several good models on the validation dataset, they did not carry over to the test period. We implemented random forest, gradient boosting, and logistic regression, and discovered that the complex models did not provide any advantage or additional performance over logistic regression; more complexity did not lead to better performance, so using a complex model is not the answer to the stock prediction problem. Our approach was to predict the trend instead of the price, which converted the problem into classification. However, this labeling approach did not solve the stock prediction problem either, nor does it refute the accuracy of the random walk theory for stock prices.
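
A hedged sketch of the modelling setup described above (a chronological train/validation/test split, a grid search over a classifier, ROC-AUC scoring) is shown below; the feature names and the randomly generated data frame are placeholders, not the study's dataset.

```python
# Trend-classification sketch with a chronological split and grid search.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import GridSearchCV, TimeSeriesSplit

# Placeholder frame: daily features plus a 20-day-ahead up/down label.
n = 1500
rng = np.random.default_rng(0)
df = pd.DataFrame({"ret_5d": rng.normal(size=n), "ret_20d": rng.normal(size=n)})
df["label"] = (rng.random(n) > 0.5).astype(int)

train, valid, test = df.iloc[:900], df.iloc[900:1200], df.iloc[1200:]
features = ["ret_5d", "ret_20d"]

gs = GridSearchCV(LogisticRegression(max_iter=1000),
                  {"C": [0.01, 0.1, 1.0, 10.0]},
                  cv=TimeSeriesSplit(n_splits=3), scoring="roc_auc")
gs.fit(train[features], train["label"])

for name, part in [("validation", valid), ("test", test)]:
    auc = roc_auc_score(part["label"], gs.predict_proba(part[features])[:, 1])
    print(f"{name} ROC-AUC: {auc:.3f}")
```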

Keywords: stock prediction, portfolio optimization, data science, machine learning

Procedia PDF Downloads 66
16009 Towards an African Model: A Survey of Social Enterprises in South Africa

Authors: Kerryn Krige, Kerrin Myers

Abstract:

Social entrepreneurship offers the opportunity to simultaneously address both social and economic inequality in South Africa. Its appeal across racial groups, its attractiveness to young people, its applicability in rural and peri-urban markets, and its acceleration in middle-income, large-business economies suit the South African context. However, the potential to deliver much-needed developmental benefits has not been realised because the social entrepreneurship debate lacks evidence as to who social entrepreneurs are, their goals and operations, and the socio-economic results they achieve. As a result, policy development has been stunted, and legislative barriers and red tape remain. Social entrepreneurs are isolated from the mainstream economy and struggle to access funding because of limitations in legislative and organisational structures. The objective of the study is to strengthen the ecosystem for social entrepreneurship in South Africa by producing robust, policy-rich information from and about social enterprises currently in operation across the country. The study employs a quantitative survey methodology, using online and telephonic data collection methods. A purposive sample of 1000 social enterprises was included in the first large-scale study of social entrepreneurship in South Africa. The results offer deep insight into the characteristics of social enterprises; the activities they undertake and the markets they serve; their modes of operation and funding sources; as well as key challenges and support systems. The results contribute towards developing a model of social enterprise in the African context.

Keywords: social enterprise, key characteristics, challenges and enablers, towards an African model

Procedia PDF Downloads 288
16008 Trigonelline: A Promising Compound for The Treatment of Alzheimer's Disease

Authors: Mai M. Farid, Ximeng Yang, Tomoharu Kuboyama, Chihiro Tohda

Abstract:

Trigonelline is a major alkaloid component derived from Trigonella foenum-graecum L. (fenugreek) and has previously been reported as a potential neuroprotective agent, especially in Alzheimer's disease (AD). However, the previous data were unclear, and the mouse models used were not well established. In the present study, the effect of trigonelline on memory function was investigated in the 5XFAD transgenic mouse model of Alzheimer's disease, which overexpresses mutated APP and PS1 genes. Oral administration of trigonelline for 14 days significantly enhanced object recognition and object location memory. Plasma and cerebral cortex were isolated at 30 min, 1 h, 3 h, and 6 h after oral administration of trigonelline. LC-MS/MS analysis indicated that trigonelline was detected in both plasma and cortex from 30 min onwards, suggesting good penetration of trigonelline into the brain. In addition, trigonelline significantly ameliorated axonal and dendritic atrophy in amyloid β-treated cortical neurons. These results suggest that trigonelline could be a promising therapeutic candidate for AD.

Keywords: alzheimer’s disease, cortical neurons, LC-MS/MS analysis, trigonelline

Procedia PDF Downloads 132
16007 Knowledge Transfer and the Translation of Technical Texts

Authors: Ahmed Alaoui

Abstract:

This paper contributes to the ongoing debate on the relevance of translation studies to professional practitioners. It exposes the various misconceptions permeating the links between theory and practice in the translation landscape of the Arab world. A thesis of this paper is that specialization in translation should be redefined, taking account of the fact that specialized knowledge alone is neither crucial nor sufficient in technical translation; it should be tested against the readability of the translated text, the appropriateness of its style, and the usability of its content by end-users to carry out their intended tasks. The paper also proposes a preliminary model to establish a working link between theory and practice from the perspective of professional trainers and practitioners, calling for the latter to participate in the production of knowledge in a systematic fashion. While this proposal is driven by a rather intuitive conviction, a research line is needed to specify the methodological moves and establish the mediation strategies that would relate the components in the model of knowledge transfer proposed in this paper.

Keywords: knowledge transfer, misconceptions, specialized texts, translation theory, translation practice

Procedia PDF Downloads 381
16006 Synthesis and Characterization of the Carbon Spheres Built Up from Reduced Graphene Oxide

Authors: Takahiro Saida, Takahiro Kogiso, Takahiro Maruyama

Abstract:

Ordered structural carbon (OSC) materials are expected to be applied to secondary battery electrodes, catalyst supports, and biomaterials because their uniform pore size gives low mass-diffusion resistance. In general, OSC materials are synthesized using a template; changing the size and shape of the template tailors the pore size to the intended purpose. Depositing oxide nanosheets on a polymer sphere template by the layer-by-layer (LbL) method has been reported as one preparation route, and the LbL method allows the wall thickness to be controlled without surface modification. For uniform carbon spheres prepared by the LbL method, composed of a graphene oxide wall and a polymethyl methacrylate (PMMA) core, the reduction treatment is a key issue. Since graphene oxide has poor electron conductivity owing to the many functional groups on its surface, it is difficult to apply to secondary battery electrodes and fuel cell catalyst supports. In this study, the graphene oxide wall of the carbon spheres was reduced by thermal treatment under vacuum, and the crystalline structure and electronic state were characterized. Scanning electron microscope images showed that the spheres kept their shape after heat treatment at 300ºC but collapsed as the heating temperature increased; both the dissolution rate of the PMMA core and the reduction rate of the graphene oxide increased with heating temperature. In contrast, extending the heating time helped preserve the sphere shape. X-ray photoelectron spectroscopy indicated that the surface consisted mainly of sp² carbon. From these results, we succeeded in synthesizing sphere structures composed of reduced graphene oxide.

Keywords: carbon sphere, graphene oxide, reduction, layer by layer

Procedia PDF Downloads 130
16005 Effect of Hydrogen Content and Structure in Diamond-Like Carbon Coatings on Hydrogen Permeation Properties

Authors: Motonori Tamura

Abstract:

The hydrogen barrier properties of diamond-like carbon (DLC) coatings were evaluated. Using plasma chemical vapor deposition and sputtering, DLC coatings were deposited on Type 316L stainless steels. The hydrogen permeation rate was reduced to 1/1000 or lower by the DLC coatings, and coatings with high hydrogen content showed a strong hydrogen barrier function. For hydrogen diffusion in coatings, the movement of atoms through hydrogen trap sites, such as pores in the coating and crystal defects such as dislocations, is important. DLC coatings are amorphous, containing both sp³ and sp² bonds, and excess hydrogen can be found in the interstitial space and at hydrogen trap sites. In DLC coatings with high hydrogen content, these trap sites are likely already filled with hydrogen atoms, so the movement of new hydrogen atoms could be limited.

Keywords: hydrogen permeation, stainless steels, diamond-like carbon, hydrogen trap sites

Procedia PDF Downloads 323
16004 Predicting National Football League (NFL) Match with Score-Based System

Authors: Marcho Setiawan Handok, Samuel S. Lemma, Abdoulaye Fofana, Naseef Mansoor

Abstract:

This paper proposes a method to predict the outcome of National Football League matches using data from 2019 to 2022 and compares it with other popular models. The model uses open-source statistical data for each team, such as passing yards, rushing yards, fumbles lost, and scoring, where each statistic has an offensive and a defensive component. For instance, a data set of anticipated values for a specific matchup is created by comparing the offensive passing yards gained by one team to the defensive passing yards allowed by the opposition. We evaluated the model's performance by contrasting its results with those of established prediction algorithms. The method uses a neural network to predict the score of a National Football League match and then the winner of the game.
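
A hedged sketch of the matchup-feature idea (an offensive statistic compared with the opponent's corresponding defensive statistic, fed to a small neural network that regresses both scores) is shown below; the feature construction, network size, and random data are illustrative assumptions, not the paper's pipeline.

```python
# Matchup features -> small neural net that predicts both teams' scores.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 400  # placeholder number of matchups

# One row per matchup: e.g. home offense stat minus away defense stat (and the
# reverse) for passing yards, rushing yards, and turnovers.
X = rng.normal(0, 50, size=(n, 6))
y = np.clip(rng.normal(23, 10, size=(n, 2)), 0, None)  # [home_score, away_score]

net = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
net.fit(X[:300], y[:300])

pred = net.predict(X[300:])
winner_pred = pred[:, 0] > pred[:, 1]          # home team predicted to win?
winner_true = y[300:, 0] > y[300:, 1]
print("winner accuracy:", (winner_pred == winner_true).mean())
```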

Keywords: game prediction, NFL, football, artificial neural network

Procedia PDF Downloads 68
16003 Analysis and Design of Offshore Met Mast Supported on Jacket Substructure

Authors: Manu Manu, Pardha J. Saradhi, Ramana M. V. Murthy

Abstract:

Wind energy is accepted as one of the most developed, cost-effective, and proven renewable energy technologies for meeting increasing electricity demand in a sustainable manner. Preliminary assessment studies along the Indian coastline by the Ministry of New and Renewable Energy have indicated prospects for the development of offshore wind power along the Tamil Nadu coast, India. The commercial viability of a wind project mainly depends on the wind characteristics at the site; hence, it is internationally recommended to perform a site-specific wind resource assessment based on a two-year wind profile as part of the feasibility study. Conventionally, guyed met masts are used onshore to collect the wind profile; installing a similar structure offshore requires a complex marine spread and is very expensive. In the present study, an attempt is made to develop a 120 m tall lattice tower supported on a jacket piled to the seabed at Rameshwaram, Tamil Nadu, India. Offshore met masts are subjected to combined wind and hydrodynamic loads, and these lateral loads should be safely transferred to the soil. The wind loads are estimated using the gust factor method, and the hydrodynamic loads are estimated by Morison's equation together with a suitable wave theory. The soil is modeled as three nonlinear orthogonal springs based on API standards. The structural configuration and optimum member sizes are obtained for extreme cyclone events, and the dynamic behavior of the mast under coupled wind and wave loads is also studied. The static response of the mast with its jacket-type substructure has been studied using a frame model in SESAM. It is found that the maximum displacement at the top of the mast is 0.003 m for the random wave and 0.08 m for wind during the steady state. The dynamic analysis results indicate that the structure is safe against coupled wind and wave loading.
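
For reference, the Morison load model cited above is commonly written as below (per unit length of a slender member); treating the jacket members as slender cylinders and the choice of coefficients are assumptions, not values from the paper.

```latex
\[
f(t) \;=\; \underbrace{\rho\, C_m \frac{\pi D^2}{4}\, \dot{u}(t)}_{\text{inertia}}
\;+\; \underbrace{\tfrac{1}{2}\,\rho\, C_d\, D\, u(t)\,\lvert u(t)\rvert}_{\text{drag}},
\]
where $\rho$ is the water density, $D$ the member diameter, $u$ and $\dot{u}$ the water particle velocity and acceleration from the chosen wave theory, and $C_m$, $C_d$ the inertia and drag coefficients.
```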

Keywords: offshore wind, mast, static, aerodynamic load, hydrodynamic load

Procedia PDF Downloads 199
16002 The Data Quality Model for the IoT based Real-time Water Quality Monitoring Sensors

Authors: Rabbia Idrees, Ananda Maiti, Saurabh Garg, Muhammad Bilal Amin

Abstract:

IoT devices are the basic building blocks of an IoT network and generate enormous volumes of real-time, high-speed data that help organizations and companies take intelligent decisions. Integrating this enormous amount of data from multiple sources and transferring it to the appropriate client is fundamental to IoT development, and handling this huge number of devices together with the huge volume of data is very challenging. IoT devices are battery-powered and resource-constrained; to provide energy-efficient communication, they sleep and wake up periodically and aperiodically depending on the traffic load in order to reduce energy consumption, and sometimes they get disconnected due to battery depletion. If a node is not available in the network, the IoT network provides incomplete, missing, and inaccurate data. Moreover, many IoT applications, like vehicle tracking and patient tracking, require the IoT devices to be mobile; because of this mobility, if the distance of a device from the sink node becomes greater than required, the connection is lost, and other devices join the network to replace the broken-down and departed ones. This makes IoT devices dynamic in nature, which brings uncertainty and unreliability into the IoT network and hence produces poor-quality data, so the actual reason for abnormal data is not known. If data are of poor quality, decisions are likely to be unsound; it is therefore highly important to process data and estimate data quality before bringing it into use in IoT applications. In the past, many researchers tried to estimate data quality and provided several machine learning (ML), stochastic, and statistical methods to analyze stored data in the data-processing layer, without focusing on the challenges and issues arising from the dynamic nature of IoT devices and how it impacts data quality. This research provides a comprehensive review of the impact of the dynamic nature of IoT devices on data quality and presents a data quality model that can deal with this challenge and produce good-quality data. The model targets sensors monitoring water quality; DBSCAN clustering and weather sensors are used to build it. An extensive study was carried out on the relationship between the data of the weather sensors and the data of the sensors monitoring the water quality of lakes and beaches, and a detailed theoretical analysis of the correlation between the independent data streams of the two sets of sensors is presented. With the help of this analysis and DBSCAN, a data quality model is prepared. The model encompasses five dimensions of data quality: it detects and removes outliers, assesses completeness and patterns of missing values, and checks the accuracy of the data with the help of cluster positions. Finally, statistical analysis is performed on the clusters formed by DBSCAN, and consistency is evaluated through the coefficient of variation (CoV).
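
A minimal sketch of the two ingredients named above, DBSCAN-based outlier flagging and a per-cluster coefficient of variation, is given below in Python; the sensor readings and parameter values are synthetic placeholders, not the study's dataset.

```python
# Sketch: DBSCAN outlier flagging plus per-cluster consistency (CoV)
# for a water-quality sensor stream.
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
readings = np.column_stack([
    rng.normal(22.0, 0.5, 300),   # water temperature (°C), placeholder
    rng.normal(7.8, 0.1, 300),    # pH, placeholder
])
readings[::50] += rng.normal(0, 5, size=(6, 2))   # inject a few anomalies

X = StandardScaler().fit_transform(readings)
labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(X)   # -1 marks outliers

print("flagged outliers:", int((labels == -1).sum()))
for k in sorted(set(labels) - {-1}):
    cluster = readings[labels == k]
    cov = cluster.std(axis=0) / cluster.mean(axis=0)     # coefficient of variation
    print(f"cluster {k}: n={len(cluster)}, CoV per feature = {np.round(cov, 3)}")
```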

Keywords: clustering, data quality, DBSCAN, and Internet of things (IoT)

Procedia PDF Downloads 125
16001 The Effectiveness of First World Asylum Practices in Deterring Applications, Offering Bureaucratic Deniability, and Violating Human Rights: A Greek Case Study

Authors: Claudia Huerta, Pepijn Doornenbal, Walaa Elsiddig

Abstract:

Rising waves of nationalism around the world have led first-world migration-receiving countries to exploit the ambiguity of international refugee law and establish asylum application processes that deter applications, allow for bureaucratic deniability, and violate human rights. This case study of Greek asylum application practices argues that the 'pre-application' asylum process in Greece violates the spirit of international law by making it incredibly difficult for potential asylum seekers to apply for asylum, in essence violating the human rights of thousands of asylum seekers. The study focuses on the Greek mainland's 'pre-application' process, which in 2016 began to require those wishing to apply for asylum to do so during extremely restricted hours via a basic Skype line. The average wait simply to begin the registration process is 81 days, during which time applicants are forced to live illegally in Greece. The methodology for analyzing the 'pre-application' process consists of hours of interviews with asylum seekers, NGOs, and the Asylum Service office on the ground in Athens, as well as an analysis of the Greek Asylum Service's historical asylum registration statistics. The study presents three main findings: the delays associated with the Skype system are the result of system design, as shown by a statistical analysis of Greek asylum registrations; NGOs have been co-opted by the state to perform state functions during the process; and the government's use of technology is both purposefully lazy and discriminatory. In conclusion, the study argues that such asylum practices are part of a pattern of policies in first-world migration-receiving countries that discourage asylum seekers from applying and fall short of the standards of international law.

Keywords: asylum, European Union, governance, Greece, irregular, migration, policy, refugee, Skype

Procedia PDF Downloads 112
16000 A Model Suggestion on Competitiveness and Sustainability of SMEs in Developing Countries

Authors: Ahmet Diken, Tahsin Karabulut

Abstract:

What developing countries need most is capital. Such countries make an effort to increase their income in order to meet their expenses for employment, infrastructure and superstructure investments, education, health, and defense. The principal income of such countries is the taxes collected from businesses, and businesses must generate profit and returns in order to be able to pay those taxes. In a world where competition exists, businesses in developing countries may follow different strategies and must specify their target markets; in order to minimize cost and maximize profit, SMEs have to concentrate on target markets and select a cost-oriented strategy. In this study, a theoretical model is suggested in which SMEs act as clusters among themselves and also serve as optimal suppliers for large-scale firms, with SME policy supported by the public sector. This relationship can help large-scale firms build brands recognized around the world, and such organization increases the value added for developing countries.

Keywords: competitiveness, developing countries, SMEs, sustainability

Procedia PDF Downloads 300
15999 Finite Element Analysis of Cold Formed Steel Screwed Connections

Authors: Jikhil Joseph, S. R. Satish Kumar

Abstract:

Steel structures are commonly used for rapid erection and multistory construction due to their inherent advantages. However, the high accuracy required in detailing and the heavier sections make them difficult to transport and erect in place. Cold-formed steel, specially made by reducing carbon and other alloying elements, is nowadays used to make thin-walled structures. Various types of connections, such as bolting, riveting, welding, and other mechanical connections, are reported and practiced for thin-walled members, and self-drilling screw connections are commonly used for cold-formed purlin-to-sheeting connections. In this paper, an attempt is made to develop a moment-resisting frame that can be rapidly and remotely constructed with thin-walled sections and self-drilling screws. Semi-rigid moment connections are developed with rectangular thin-walled tubes and screws. The finite element analysis programme ABAQUS is used for modelling the screwed connections. The various modelling procedures for simulating the connection behavior, such as the tie-constraint model, the oriented spring model, and solid-interaction modelling, are compared and critically reviewed. From the experimental validations, solid-interaction modelling was identified as the most accurate and was used for predicting the connection behavior. From the finite element analysis, hysteresis curves and the modes of failure were identified. Parametric studies were carried out on the connection model to optimize the connection configurations and obtain the desired connection characteristics.

Keywords: buckling, cold formed steel, finite element analysis, screwed connections

Procedia PDF Downloads 172
15998 Deep Neural Network Approach for Navigation of Autonomous Vehicles

Authors: Mayank Raj, V. G. Narendra

Abstract:

Ever since the DARPA challenge on autonomous vehicles in 2005, there has been a lot of buzz about 'autonomous vehicles' amongst major tech giants such as Google, Uber, and Tesla. Numerous approaches have been adopted to solve this problem, which can have a long-lasting impact on mankind. In this paper, we used deep learning techniques and the TensorFlow framework to build a neural network model that predicts the features needed for the navigation of autonomous vehicles (speed, acceleration, steering angle, and brake). The deep neural network was trained on images and sensor data obtained from the comma.ai dataset. A heatmap was used to check for correlation among the features, and four important features were selected, making this a multivariate regression problem. The final model had five convolutional layers followed by five dense layers. Finally, the predicted values were tested against the labeled data, with the mean squared error used as the performance metric.
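
A hedged Keras sketch of the described architecture (five convolutional layers followed by five dense layers, four regression outputs, MSE loss) is shown below; the input image size, filter counts, and layer widths are assumptions for illustration, not the paper's exact model.

```python
# Five conv layers + five dense layers regressing four navigation targets.
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(160, 320, 3)),          # assumed camera frame size
    layers.Conv2D(24, 5, strides=2, activation="relu"),
    layers.Conv2D(36, 5, strides=2, activation="relu"),
    layers.Conv2D(48, 5, strides=2, activation="relu"),
    layers.Conv2D(64, 3, activation="relu"),
    layers.Conv2D(64, 3, activation="relu"),
    layers.Flatten(),
    layers.Dense(256, activation="relu"),
    layers.Dense(128, activation="relu"),
    layers.Dense(64, activation="relu"),
    layers.Dense(16, activation="relu"),
    layers.Dense(4),   # speed, acceleration, steering angle, brake
])
model.compile(optimizer="adam", loss="mse")
model.summary()
# model.fit(images, targets, epochs=10)   # images/targets from the chosen dataset
```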

Keywords: autonomous vehicles, deep learning, computer vision, artificial intelligence

Procedia PDF Downloads 143
15997 Statistical Comparison of Ensemble Based Storm Surge Forecasting Models

Authors: Amin Salighehdar, Ziwen Ye, Mingzhe Liu, Ionut Florescu, Alan F. Blumberg

Abstract:

Storm surge is an abnormal water level caused by a storm. Accurate prediction of a storm surge is a challenging problem. Researchers have developed various ensemble modeling techniques to combine several individual forecasts into an overall, presumably better forecast. Some simple ensemble modeling techniques exist in the literature; for instance, Model Output Statistics (MOS) and running mean-bias removal are widely used in the storm surge prediction domain. However, these methods have drawbacks: MOS, for example, is based on multiple linear regression and needs a long period of training data. To overcome the shortcomings of these simple methods, researchers have proposed more advanced ones. For instance, ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting that creates a better forecast of sea level using a combination of several instances of Bayesian Model Averaging (BMA), and ensemble dressing methods are based on identifying the best member forecast and using it for prediction. Our contribution in this paper can be summarized as follows. First, we investigate whether the ensemble models perform better than any single forecast; to do so, we need to identify the single best forecast, and we present a methodology based on a simple Bayesian selection method to select it. Second, we present several new and simple ways to construct ensemble models, using correlation and standard deviation as weights when combining different forecast models. Third, we use these ensembles and compare them with several existing models in the literature to forecast the storm surge level. We then investigate whether developing a complex ensemble model is indeed needed; to achieve this goal, we use a simple average (one of the simplest and most widely used ensemble models) as a benchmark. Predicting the peak surge level during a storm, as well as the precise time at which this peak occurs, is crucial, so we develop a statistical platform to compare the performance of the various ensemble methods. This statistical analysis is based on the root mean square error of the ensemble forecast during the testing period and on the magnitude and timing of the forecasted peak surge compared to the actual peak and its time. In this work, we analyze four hurricanes: hurricanes Irene and Lee in 2011, hurricane Sandy in 2012, and hurricane Joaquin in 2015. Since hurricane Irene developed at the end of August 2011 and hurricane Lee started just after Irene at the beginning of September 2011, we consider them a single contiguous hurricane event. The data set used for this study is generated by the New York Harbor Observing and Prediction System (NYHOPS). We find that even the simplest possible way of creating an ensemble produces results superior to any single forecast, and the ensemble models we propose generally perform better than the simple average ensemble technique.
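
A minimal sketch of the simple weighting schemes mentioned above (correlation-based and spread-based weights against a plain-average benchmark) is given below; the synthetic forecasts and the exact weighting formulas are placeholder assumptions, not the NYHOPS data or the paper's definitions.

```python
# Compare simple ensembles of surge forecasts against observations.
import numpy as np

rng = np.random.default_rng(7)
obs = np.sin(np.linspace(0, 6, 200)) + 0.1 * rng.normal(size=200)   # "observed" surge
forecasts = np.stack([obs + rng.normal(0, s, 200) for s in (0.1, 0.2, 0.4)])

def rmse(pred):
    return float(np.sqrt(np.mean((pred - obs) ** 2)))

# Benchmark: plain average of the member forecasts.
simple = forecasts.mean(axis=0)

# Weights from correlation with observations and from inverse error spread
# (in practice both would be estimated on a separate training period).
w_corr = np.array([np.corrcoef(f, obs)[0, 1] for f in forecasts])
corr_ens = (w_corr / w_corr.sum()) @ forecasts

w_std = 1.0 / np.std(forecasts - obs, axis=1)
std_ens = (w_std / w_std.sum()) @ forecasts

for name, pred in [("simple average", simple),
                   ("correlation-weighted", corr_ens),
                   ("std-weighted", std_ens)]:
    print(f"{name:22s} RMSE = {rmse(pred):.3f}")
```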

Keywords: Bayesian learning, ensemble model, statistical analysis, storm surge prediction

Procedia PDF Downloads 298
15996 Synthetic Method of Contextual Knowledge Extraction

Authors: Olga Kononova, Sergey Lyapin

Abstract:

Global information society requirements are the transparency and reliability of data, as well as the ability to manage information resources independently, in particular to search, analyze, and evaluate information and thereby obtain new expertise. Moreover, it is the satisfaction of society's information needs that increases the efficiency of enterprise management and public administration. The study of structurally organized thematic and semantic contexts of different types, automatically extracted from unstructured data, is one of the important tasks for the application of information technologies in education, science, culture, governance, and business. The objectives of this study are the typologization of contextual knowledge and the selection or creation of effective tools for extracting and analyzing it. The explication of various kinds and forms of contextual knowledge involves the development and use of full-text search information systems. For implementation, the authors use the services of the e-library 'Humanitariana', such as contextual search, different types of queries (paragraph-oriented and frequency-ranked queries), and automatic extraction of knowledge from scientific texts. The multifunctional e-library «Humanitariana» is realized in an Internet architecture in a WWS configuration (Web browser / Web server / SQL server). An advantage of using 'Humanitariana' is the possibility of combining the resources of several organizations: scholars and research groups may work in local network mode and in distributed IT environments with the ability to draw on resources on the servers of any participating organization. The paper discusses specific cases of contextual knowledge explication using the e-library services and focuses on the possibilities of new types of contextual knowledge. The experimental research base consists of scientific texts about 'e-government' and 'computer games'. An analysis of trends in these thematic texts allowed a content analysis methodology to be proposed that combines full-text search with the automatic construction of a 'terminogramma' and expert analysis of the selected contexts. A 'terminogramma' is laid out as a table containing a column with a frequency-ranked list of words (nouns), together with columns indicating each word's absolute frequency (count) and relative frequency of occurrence (in % or ppm). The analysis of the 'e-government' materials showed that the state takes a dominant position in the processes of electronic interaction between the authorities and society in modern Russia; the media credited the main role in these processes to the government, which provided public services through specialized portals. Factor analysis revealed two factors statistically describing the terms used: human interaction (the user) and the state (the government as organizer of the processes); interaction management (the public officer as performer of the processes) and technology (infrastructure). Isolation of these factors will lead to changes in the model of electronic interaction between government and society. The study also identified the dominant social problems and the prevalence of different categories of subjects of computer gaming in scientific papers from 2005 to 2015. Several types of contextual knowledge are thus identified: micro context; macro context; dynamic context; thematic collections of queries (interactive contextual knowledge expanding the composition of e-library information resources); and multimodal context (functional integration of iconographic and full-text resources through a hybrid quasi-semantic search algorithm). Further studies can be pursued both by expanding the resource base on which they are conducted and by developing appropriate tools.
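
The 'terminogramma' described above is essentially a frequency-ranked noun table with absolute and relative frequencies; a minimal sketch in Python is below. The crude token filter stands in for real noun extraction, which would require a part-of-speech tagger, and the sample text is a placeholder.

```python
# Build a tiny terminogramma-style table: word, absolute and relative frequency.
import re
from collections import Counter

text = "E-government services let the state provide services through portals."
tokens = re.findall(r"[a-zA-Z\-]{3,}", text.lower())   # crude stand-in for noun extraction
counts = Counter(tokens)
total = sum(counts.values())

print(f"{'word':15s} {'abs':>5s} {'rel %':>7s}")
for word, n in counts.most_common(10):
    print(f"{word:15s} {n:5d} {100 * n / total:7.2f}")
```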

Keywords: contextual knowledge, contextual search, e-library services, frequency-ranked query, paragraph-oriented query, technologies of the contextual knowledge extraction

Procedia PDF Downloads 342
15995 Real-World Economic Burden of Musculoskeletal Disorders in Nigeria

Authors: F. Fatoye, C. E. Mbada, T. Gebrye, A. O. Ogunsola, C. Fatoye, O. Oyewole

Abstract:

Musculoskeletal disorders (MSDs) such as low back pain (LBP), cervical spondylosis (CSPD), sprain, osteoarthritis (OA), and post-immobilization stiffness (PIS) have a major impact on individuals, health systems, and society in terms of morbidity, long-term disability, and economics. This study estimated the direct and indirect costs of common MSDs in Osun State, Nigeria. A review of medical charts of adult patients attending the Physiotherapy Outpatient Clinic at the Obafemi Awolowo University Teaching Hospitals Complex, Osun State, Nigeria, between 2009 and 2018 was carried out. The occupational class of the patients was determined using the International Labour Organization (ILO) classification. The direct and indirect costs were estimated using a cost-of-illness approach. Physiotherapy-related health resource use and the costs of the common MSDs, including the consultation fee, total fee charged per session, and cost of consumables, were estimated. Data were summarised using descriptive statistics (mean and standard deviation, SD). Overall, 1582 patients with MSDs (male = 47.5%, female = 52.5%) with a mean age of 47.8 ± 25.7 years were included in this study. The mean (SD) direct cost estimates for LBP, CSPD, PIS, sprain, OA, and other conditions were $18.35 ($17.33), $34.76 ($17.33), $32.13 ($28.37), $35.14 ($44.16), $37.19 ($41.68), and $15.74 ($13.96), respectively. The mean (SD) indirect cost estimates for LBP, CSPD, PIS, sprain, OA, and other MSD conditions were $73.42 ($43.54), $140.57 ($69.31), $128.52 ($113.46), $140.57 ($69.31), $148.77 ($166.71), and $62.98 ($55.84), respectively. Musculoskeletal disorders impose a substantial economic burden on individuals with these conditions and on society, and this unacceptable economic loss should be reduced using appropriate strategies. Further research is required to determine the clinical and cost effectiveness of strategies to improve the health outcomes of patients with MSDs. The findings of the present study may assist health policy and decision makers to understand the economic burden of MSDs and facilitate efficient allocation of healthcare resources to alleviate the burden associated with these conditions in Nigeria.

Keywords: economic burden, low back pain, musculoskeletal disorders, real-world

Procedia PDF Downloads 206
15994 Post-Soviet Georgia in Visual History Analysis

Authors: Ana Nemsadze

Abstract:

The contemporary era and society are called the post-industrial era and post-industrial society, or the informational era and informational society. Science today seeks to define the concept of information and to comprehend information's role and function in contemporary society. The organization of the social environment and the governance of public processes on the basis of information and communication tools are the main characteristics of the informational era, shaped by the technological changes that took place in culture in the second half of the twentieth century. Today, Georgia as an independent state needs to create an informational discourse of the country, and it is therefore very important to study the political and social events that occurred in the country after the collapse of the Soviet Union, because they define the present and future of the country. The purpose of this study is to analyze the political events of the latest history of Georgia in terms of culture and information: specifically, to elucidate which political events transformed the social life of post-Soviet Georgia most of all, who carried out these events, and which visual and verbal messages accompanied each of them. The research was conducted on the basis of interviews. The interview participants are people of various specializations whose professional activity relates to reflection on culture and the theme of visual communication: philosophers, sociologists, a journalist, a media researcher, a politologist, and a painter. Every expert considers the declaration of independence of Georgia the most important of all the events that took place in Georgia after the collapse of the Soviet Union. The research revealed important social and political events, most of which are related to the independence and territorial integrity of the state. Presidents of Georgia Zviad Gamsakhurdia, Eduard Shevardnadze, and Mikheil Saakashvili; Catholicos-Patriarch of All Georgia, Archbishop of Mtskheta-Tbilisi and Metropolitan Bishop of Bichvinta and Tskhum-Abkhazia Ilia II; and businessman Bidzina Ivanishvili assumed the dominant roles in these events. The verbal narrative of the events of Zviad Gamsakhurdia's presidential term expresses national freedom, while the visual part of the same period expresses the ruin of the social and political structure. The verbal narrative of the events of Eduard Shevardnadze's presidential term expresses the free state, stability, and the re-establishment of Georgia's political role in international relations, while the visual part depicts the most important moments of his term, in which Eduard Shevardnadze's face also appears. The verbal narrative of the events of Mikheil Saakashvili's presidential term expresses social renewal, while the visual part depicts the August war, in which Mikheil Saakashvili's face also appears. The results of the study also reveal other details of the visual and verbal narrative of the political and social events of post-Soviet Georgia, providing a basis for further reflection.

Keywords: culture, narrative, post soviet, visual communication

Procedia PDF Downloads 292
15993 Automated Weight Painting: Using Deep Neural Networks to Adjust 3D Mesh Skeletal Weights

Authors: John Gibbs, Benjamin Flanders, Dylan Pozorski, Weixuan Liu

Abstract:

Weight painting, adjusting the influence a skeletal joint has on a given vertex in a character mesh, is an arduous and time-consuming part of the 3D animation pipeline. This process generally requires a trained technical animator and many hours of work to complete. Our skiNNer plug-in, which works within Autodesk's Maya 3D animation software, uses machine learning and data processing techniques to create a deep neural network model that can accomplish the weight painting task in seconds rather than hours for bipedal quasi-humanoid character meshes. In order to create a properly trained network, a number of challenges were overcome, including curating an appropriately large data library, managing arbitrary 3D mesh sizes, handling arbitrary skeletal architectures, accounting for extreme numeric values (most data points are near 0 or 1 for weight maps), and constructing a neural network model that can properly capture the high-frequency alternation between high weight values (near 1.0) and low weight values (near 0.0). The resulting neural network model is a cross between a traditional CNN, a deep residual network, and a fully dense network. It captures the unusually hard-edged features of a weight map matrix and produces excellent results on many bipedal models.
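
Since the abstract outlines the model family only loosely, the sketch below shows one generic way such a per-vertex weight predictor could look in Python (PyTorch): a residual dense network that outputs per-joint weights normalized to sum to one. The input features, network size, and softmax normalization are assumptions for illustration, not the skiNNer architecture.

```python
# Generic sketch (not the skiNNer model): residual dense network mapping
# per-vertex features to per-joint skinning weights that sum to one.
import torch
import torch.nn as nn

class ResidualDense(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
    def forward(self, x):
        return torch.relu(x + self.net(x))        # skip connection

class WeightPainter(nn.Module):
    def __init__(self, in_dim=9, n_joints=32, dim=128, depth=4):
        super().__init__()
        self.inp = nn.Linear(in_dim, dim)
        self.blocks = nn.Sequential(*[ResidualDense(dim) for _ in range(depth)])
        self.out = nn.Linear(dim, n_joints)
    def forward(self, x):                          # x: (n_vertices, in_dim)
        h = self.blocks(torch.relu(self.inp(x)))
        return torch.softmax(self.out(h), dim=-1)  # per-joint weights, sum to 1

weights = WeightPainter()(torch.randn(1000, 9))    # e.g. position + normal + extras
print(weights.shape, float(weights.sum(dim=-1).mean()))
```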

Keywords: 3d animation, animation, character, rigging, skinning, weight painting, machine learning, artificial intelligence, neural network, deep neural network

Procedia PDF Downloads 251
15992 Two-Channels Thermal Energy Storage Tank: Experiments and Short-Cut Modelling

Authors: M. Capocelli, A. Caputo, M. De Falco, D. Mazzei, V. Piemonte

Abstract:

This paper presents the experimental results and the related modeling of a thermal energy storage (TES) facility, conceived and realized by ENEA, that achieves the thermocline with an innovative geometry. First, the thermal energy exchange model of an equivalent shell-and-tube heat exchanger is described and tested to reproduce the performance of the spiral exchanger installed in the TES. Through regression of the experimental data, a first-order thermocline model was also validated to provide an analytical function of the thermocline, useful for performance evaluation, for comparison with other systems, and for implementation in simulations of integrated systems (e.g., power plants). The experimental data obtained from the plant start-up and the short-cut modeling of the system can be useful for process analysis, for the scale-up of the thermal storage system, and for investigating the feasibility of its implementation in actual case studies.

Keywords: CSP plants, thermal energy storage, thermocline, mathematical modelling, experimental data

Procedia PDF Downloads 315
15991 Combining Multiscale Patterns of Weather and Sea States into a Machine Learning Classifier for Mid-Term Prediction of Extreme Rainfall in North-Western Mediterranean Sea

Authors: Pinel Sebastien, Bourrin François, De Madron Du Rieu Xavier, Ludwig Wolfgang, Arnau Pedro

Abstract:

Heavy precipitation constitutes a major meteorological threat in the western Mediterranean. Research has investigated the relationship between the states of the Mediterranean Sea and the atmosphere and precipitation over short temporal windows. However, at larger temporal scales, the precursor signals of heavy rainfall in the sea and atmosphere have drawn little attention. Moreover, despite ongoing improvements in numerical weather prediction, the medium-term forecasting of rainfall events remains a difficult task. Here, we aim to investigate the influence of early-spring environmental parameters on the following autumn's heavy precipitation. Hence, we develop a machine learning model to predict extreme autumnal rainfall with a 6-month lead time over the Spanish Catalan coastal area, based on (i) the sea pattern (main current, LPC, and sea surface temperature, SST) at the mesoscale, (ii) four European weather teleconnection patterns (NAO, WeMo, SCAND, MO) at the synoptic scale, and (iii) the hydrological regime of the main local river (the Rhône). The accuracy of the developed classifier is evaluated via statistical analysis based on classification accuracy, logarithmic loss, and the confusion matrix, by comparison with rainfall estimates from rain gauges and satellite observations (CHIRPS-2.0). Sensitivity tests are carried out by changing the model configuration, such as the sea SST, sea LPC, river regime, and synoptic atmosphere configuration. The sensitivity analysis suggests a negligible influence from the hydrological regime, unlike SST, LPC, and specific teleconnection weather patterns. Finally, this study illustrates how public datasets can be integrated into a machine learning model for heavy rainfall prediction and can be of interest to local policy for management purposes.
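
The evaluation metrics named above (classification accuracy, logarithmic loss, confusion matrix) can be sketched in Python as follows; the random predictors stand in for the sea, teleconnection, and river features, and the choice of a random forest classifier is an assumption, since the abstract does not name the algorithm.

```python
# Classifier evaluation sketch: accuracy, log loss, and confusion matrix.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, confusion_matrix, log_loss
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(300, 6))                 # placeholder spring predictors
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=300) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

proba = clf.predict_proba(X_te)[:, 1]
pred = (proba >= 0.5).astype(int)
print("accuracy:", accuracy_score(y_te, pred))
print("log loss:", log_loss(y_te, proba))
print("confusion matrix:\n", confusion_matrix(y_te, pred))
```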

Keywords: extreme hazards, sensitivity analysis, heavy rainfall, machine learning, sea-atmosphere modeling, precipitation forecasting

Procedia PDF Downloads 116