Search results for: neural network models
6907 Design and Construction of Models of Sun Tracker or Sun Tracking System for Light Transmission
Authors: Mohsen Azarmjoo, Yasaman Azarmjoo, Zahra Alikhani Koopaei
Abstract:
This article introduces devices that can transfer sunlight to buildings that do not have access to direct sunlight during the day. The transmission and reflection of sunlight are achieved through the movement of movable mirrors. The focus of this article is on two models of sun tracker systems designed and built by the Macad team. The article also reveals the distinctions between the two Macad devices and a previously built competitor device. What distinguishes the devices built by the Macad team from the competitor's device is the different mode of operation and the difference in the location of the sensors. Given that the devices produce the same results, the Macad team has tried to reduce the defects of the competitor's device as much as possible. The special feature of the second type of device built by the Macad team enables buildings with different construction positions to use sun tracking systems. This article will also discuss diagrams of the path of sunlight transmission and further details of the device. It is worth mentioning that fixed mirrors are also placed next to the main devices, so that the light striking the first device is reflected onto them; this light is guided within the light receiver space, transferred to the different surrounding areas by steel sheets built into the light receiver space, and finally these spaces benefit from sunlight.
Keywords: design, construction, mechatronic device, sun tracker system, sun tracker, sunlight
Procedia PDF Downloads 88
6906 A Bayesian Multivariate Microeconometric Model for Estimation of Price Elasticity of Demand
Authors: Jefferson Hernandez, Juan Padilla
Abstract:
Estimation of price elasticity of demand is a valuable tool for the task of price setting. Given its relevance, it is an active field of microeconomic and statistical research. Price elasticity in the oil and gas industry, in particular for fuels sold at gas stations, has shown to be a challenging topic given market and state restrictions and the underlying correlation structures between the types of fuels sold by the same gas station. This paper explores the Lotka-Volterra model for price elasticity estimation in the context of fuels; in addition, multivariate random effects are introduced to deal with errors, e.g., measurement or missing-data errors. In order to model the underlying correlation structures, the Inverse-Wishart, Hierarchical Half-t and LKJ distributions are studied. The Bayesian paradigm, through Markov Chain Monte Carlo (MCMC) algorithms, is adopted for model estimation. Simulation studies covering a wide range of situations were performed in order to evaluate parameter recovery for the proposed models and algorithms. Results revealed that the proposed algorithms recovered all model parameters quite well. A real data set analysis was also performed in order to illustrate the proposed approach.
Keywords: price elasticity, volume, correlation structures, Bayesian models
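As a rough illustration of the Bayesian estimation idea (not the authors' multivariate random-effects model), the sketch below runs a random-walk Metropolis sampler for a single log-log demand equation on synthetic data; the data-generating values and tuning constants are assumptions.

```python
import numpy as np

# Hypothetical weekly data for one gas station: log-prices and log-volumes of one fuel.
rng = np.random.default_rng(0)
log_p = np.log(rng.uniform(0.9, 1.3, size=104))
true_beta = -0.8                                   # assumed "true" elasticity
log_q = 5.0 + true_beta * log_p + rng.normal(0, 0.05, size=104)

def log_posterior(alpha, beta, sigma):
    """Flat priors on (alpha, beta) and a weak half-normal-style penalty on sigma."""
    if sigma <= 0:
        return -np.inf
    resid = log_q - (alpha + beta * log_p)
    return -len(log_q) * np.log(sigma) - np.sum(resid**2) / (2 * sigma**2) - sigma**2

# Random-walk Metropolis over (alpha, beta, sigma).
theta = np.array([0.0, 0.0, 1.0])
lp = log_posterior(*theta)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0, [0.05, 0.05, 0.01])
    lp_prop = log_posterior(*prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta.copy())

beta_draws = np.array(samples)[5000:, 1]           # discard burn-in
print("posterior mean elasticity:", beta_draws.mean())
print("95% credible interval:", np.percentile(beta_draws, [2.5, 97.5]))
```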
Procedia PDF Downloads 171
6905 Convergence Results of Two-Dimensional Homogeneous Elastic Plates from Truncation of Potential Energy
Authors: Erick Pruchnicki, Nikhil Padhye
Abstract:
Plates are important engineering structures which have attracted extensive research since the 19th century. The subject of this work is the static analysis of a linearly elastic, homogeneous plate under small deformations. A 'thin plate' is a three-dimensional structure comprising a small transverse dimension with respect to a flat mid-surface. The general aim of any plate theory is to deduce a two-dimensional model, in terms of mid-surface quantities, that approximately yet accurately describes the plate's deformation. In recent decades, a common starting point for this purpose has been to utilize a series expansion of the displacement field across the thickness dimension in terms of the thickness parameter (h). These attempts are mathematically consistent in deriving leading-order plate theories based on certain a priori scalings between the thickness and the applied loads; for example, asymptotic methods are aimed at generating leading-order two-dimensional variational problems by postulating a formal asymptotic expansion of the displacement fields. Such methods rigorously generate a hierarchy of two-dimensional models depending on the order of magnitude of the applied load with respect to the plate thickness. However, in practice, applied loads are external and thus not directly linked to or dependent on the geometry/thickness of the plate, rendering any such model (based on a priori scaling) of limited practical utility. In other words, the main limitation of these approaches is that they do not furnish a single plate model for all orders of applied loads. Following the analogy of recent efforts deploying Fourier-series expansions to study the convergence of reduced models, we propose two-dimensional model(s) resulting from truncation of the potential energy and rigorously prove the convergence of these two-dimensional plate models to the parent three-dimensional linear elasticity with increasing truncation order of the potential energy.
Keywords: plate theory, Fourier-series expansion, convergence result, Legendre polynomials
Procedia PDF Downloads 115
6904 PhenoScreen: Development of a Systems Biology Tool for Decision Making in Recurrent Urinary Tract Infections
Authors: Jonathan Josephs-Spaulding, Hannah Rettig, Simon Graspeunter, Jan Rupp, Christoph Kaleta
Abstract:
Background: Recurrent urinary tract infections (rUTIs) are a global cause of emergency room visits and represent a significant burden for public health systems. Therefore, metatranscriptomic approaches to investigate metabolic exchange and crosstalk between uropathogenic Escherichia coli (UPEC), which is responsible for 90% of UTIs, and collaborating pathogens of the urogenital microbiome are necessary to better understand the pathogenetic processes underlying rUTIs. Objectives: This study aims to determine the extent to which uropathogens exploit the host urinary metabolic environment to succeed during invasion. By developing patient-specific metabolic models of infection, these observations can be leveraged for the precision treatment of human disease. Methods: To date, we have set up an rUTI patient cohort and observed various urine-associated pathogens. From this cohort, we developed patient-specific metabolic models to predict bladder microbiome metabolism during rUTIs. This was done by creating an in silico metabolomic urine environment that is representative of human urine. Metabolic models of uptake and cross-feeding of rUTI pathogens were created from genomes in relation to the artificial urine environment. Finally, microbial interactions were constrained by metatranscriptomics to indicate patient-specific metabolic requirements of pathogenic communities. Results: Metabolite uptake and cross-feeding are essential for strain growth; therefore, we plan to design patient-specific treatments by adjusting urinary metabolites through nutritional regimens to counteract uropathogens by depleting essential growth metabolites. These methods will provide mechanistic insights into the metabolic components of rUTI pathogenesis and an evidence-based tool for infection treatment.
Keywords: recurrent urinary tract infections, human microbiome, uropathogenic Escherichia coli, UPEC, microbial ecology
Procedia PDF Downloads 138
6903 Facility Data Model as Integration and Interoperability Platform
Authors: Nikola Tomasevic, Marko Batic, Sanja Vranes
Abstract:
Emerging Semantic Web technologies can be seen as the next step in the evolution of intelligent facility management systems. In particular, this involves increased usage of open-source and/or standardized concepts for data classification and semantic interpretation. To deliver such facility management systems, providing a comprehensive integration and interoperability platform in the form of a facility data model is a prerequisite. In this paper, one possible modelling approach to providing such an integrative facility data model, based on the ontology modelling concept, is presented. The complete ontology development process, starting from input data acquisition, through ontology concept definition and finally ontology concept population, is described. At the beginning, the core facility ontology was developed, representing the generic facility infrastructure comprised of the common facility concepts relevant from the facility management perspective. To develop the data model of a specific facility infrastructure, first extension and then population of the core facility ontology were performed. For the development of the full-blown facility data models, Malpensa and Fiumicino airports in Italy, two major European air-traffic hubs, were chosen as a test-bed platform. Furthermore, the way these ontology models supported the integration and interoperability of the overall airport energy management system is analyzed as well.
Keywords: airport ontology, energy management, facility data model, ontology modeling
Procedia PDF Downloads 453
6902 Open Innovation in SMEs: A Multiple Case Study of Collaboration between Start-ups and Craft Enterprises
Authors: Carl-Philipp Valentin Beichert, Marcel Seger
Abstract:
Digital transformation and climate change require small and medium-sized enterprises (SMEs) to rethink their way of doing business. Inter-firm collaboration is recognized as a helpful means of promoting innovation and competitiveness. In this context, collaborations with start-ups offer valuable opportunities through their innovative products, services, and business models. SMEs, and in particular German craft enterprises, play an important role in the country's society and economy. Companies in this heterogeneous economic sector have unique characteristics and are limited in their ability to innovate due to their small size and lack of resources. Collaborating with start-ups could help to overcome these shortcomings. To investigate how collaborations emerge and which factors are decisive in successfully driving collaboration, we apply an explorative, qualitative research design. A sample of ten case studies was selected, with the collaboration between a start-up and a craft enterprise forming the unit of analysis. Semi-structured interviews with 20 company representatives allow for a two-sided perspective on the respective collaboration. The interview data are enriched by publicly available data and three expert interviews. As a result, objectives, initiation practices, applied collaboration types, barriers, as well as key success factors could be identified. The results indicate a three-phase collaboration process comprising an initiation, concept, and partner phase (ICP). The proposed ICP framework accordingly highlights the success factors (personal fit, communication, expertise, structure, network) for craft enterprises and start-ups in each collaboration phase. The role of a mediator in the start-up company, with strong expertise in the respective craft sector, is considered an important lever for overcoming barriers such as cultural and communication differences. The ICP framework thus provides promising directions for further research and can help practitioners establish successful collaborations.
Keywords: open innovation, SME, craft businesses, startup collaboration, qualitative research
Procedia PDF Downloads 100
6901 A Location-based Authentication and Key Management Scheme for Border Surveillance Wireless Sensor Networks
Authors: Walid Abdallah, Noureddine Boudriga
Abstract:
Wireless sensor networks have shown their effectiveness in the deployment of many critical applications, especially in the military domain. Border surveillance is one of these applications, where a set of wireless sensors is deployed along a country's border line to detect illegal intrusion attempts into the national territory and report them to a control center so that the necessary measures can be taken. Given its nature, this wireless sensor network can be the target of many security attacks trying to compromise its normal operation. In particular, in this application the deployment and location of sensor nodes are of great importance for detecting and tracking intruders. This paper proposes a location-based authentication and key distribution mechanism to secure wireless sensor networks intended for border surveillance, where key establishment is performed using elliptic curve cryptography and an identity-based public key scheme. In this scheme, the public key of each sensor node is authenticated by keys that depend on its position in the monitored area. Before establishing a pairwise key between two nodes, each of them must verify the neighborhood location of the other node using a message authentication code (MAC) calculated on the corresponding public key and on keys derived from encrypted beacon messages broadcast by anchor nodes. We show that our proposed public key authentication and key distribution scheme is more resilient to node capture and node replication attacks than currently available schemes. Moreover, key distribution between nodes in our scheme generates less communication overhead and hence improves network performance.
Keywords: wireless sensor networks, border surveillance, security, key distribution, location-based
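The location-binding idea can be illustrated with a small, hypothetical sketch: a node's public key is authenticated with a MAC keyed by material derived from an anchor's beacon for a given location cell, and a neighbor accepts the key only if the MAC verifies for the claimed cell. The key derivation, names and use of a shared anchor secret below are simplifying assumptions; the paper's construction relies on ECC-based identity keys.

```python
import hashlib
import hmac
import os

def beacon_key(anchor_secret: bytes, cell_id: str) -> bytes:
    """Key derived from an anchor's secret and the grid cell it covers (assumed KDF)."""
    return hashlib.sha256(anchor_secret + cell_id.encode()).digest()

def authenticate_public_key(pub_key: bytes, anchor_secret: bytes, cell_id: str) -> bytes:
    """MAC over the node's public key, bound to its claimed location cell."""
    return hmac.new(beacon_key(anchor_secret, cell_id), pub_key, hashlib.sha256).digest()

def verify_neighbor(pub_key: bytes, tag: bytes, anchor_secret: bytes, claimed_cell: str) -> bool:
    """A neighbor recomputes the MAC for the claimed cell and compares in constant time."""
    expected = authenticate_public_key(pub_key, anchor_secret, claimed_cell)
    return hmac.compare_digest(expected, tag)

anchor_secret = os.urandom(32)
node_pub = os.urandom(64)                      # stand-in for an ECC public key
tag = authenticate_public_key(node_pub, anchor_secret, "cell_B7")

print(verify_neighbor(node_pub, tag, anchor_secret, "cell_B7"))   # True
print(verify_neighbor(node_pub, tag, anchor_secret, "cell_C2"))   # False: wrong location
```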
Procedia PDF Downloads 664
6900 Exploring Socio-Economic Barriers of Green Entrepreneurship in Iran and Their Interactions Using Interpretive Structural Modeling
Authors: Younis Jabarzadeh, Rahim Sarvari, Negar Ahmadi Alghalandis
Abstract:
Entrepreneurship at both the individual and organizational levels is one of the main driving forces in economic development and leads to growth and competition, job generation and social development. Especially in developing countries, the role of entrepreneurship in economic and social prosperity is strongly emphasized. But the effect of global economic development on the environment is undeniable, especially in negative ways, and there is a need to rethink current business models and the way entrepreneurs act when introducing new businesses, so as to address and embed environmental issues in order to achieve sustainable development. In this paper, green or sustainable entrepreneurship is addressed in Iran to identify the challenges and barriers entrepreneurs in the economic and social sectors face in developing green business solutions. Sustainable or green entrepreneurship has been gaining interest among scholars in recent years, and addressing its challenges and barriers needs much more attention to fill the gap in the literature and to facilitate the path those entrepreneurs are pursuing. This research comprises two main phases: qualitative and quantitative. In the qualitative phase, after a thorough literature review, the fuzzy Delphi method is utilized to verify those challenges and barriers by gathering a panel of experts and surveying them. In this phase, several other contextually related factors were added to the list of barriers and challenges identified in the literature. Then, in the quantitative phase, Interpretive Structural Modeling is applied to construct a network of interactions among the barriers identified in the previous phase. Again, a panel of subject matter experts comprising academic and industry experts was surveyed. The results of this study can be used by policymakers in both the public and industry sectors to introduce more systematic solutions to eliminate those barriers and help entrepreneurs overcome the challenges of sustainable entrepreneurship. It also contributes to the literature as the first research of its type that deals with the barriers to sustainable entrepreneurship and explores their interactions.
Keywords: green entrepreneurship, barriers, fuzzy Delphi method, interpretive structural modeling
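To make the Interpretive Structural Modeling step concrete, the sketch below works through the standard ISM mechanics on a hypothetical five-barrier example: a transitive closure of an expert-judged initial matrix yields the final reachability matrix, which is then partitioned into levels. The barrier names and the matrix entries are invented for illustration and are not the study's results.

```python
import numpy as np

barriers = ["financing", "regulation", "awareness", "technology", "market demand"]

# Initial reachability matrix (1 = experts judge that barrier i influences barrier j).
A = np.array([
    [1, 0, 0, 1, 0],
    [1, 1, 0, 1, 1],
    [0, 0, 1, 0, 1],
    [0, 0, 0, 1, 0],
    [0, 0, 0, 0, 1],
], dtype=bool)

# Transitive closure (Warshall's algorithm) gives the final reachability matrix.
R = A.copy()
n = len(barriers)
for k in range(n):
    for i in range(n):
        for j in range(n):
            R[i, j] = R[i, j] or (R[i, k] and R[k, j])

# Level partitioning: a barrier sits on the current level when its reachability set
# equals the intersection of its reachability and antecedent sets.
remaining, level = set(range(n)), 1
while remaining:
    reach = {i: {j for j in remaining if R[i, j]} for i in remaining}
    ante = {i: {j for j in remaining if R[j, i]} for i in remaining}
    top = [i for i in remaining if reach[i] == (reach[i] & ante[i])]
    print(f"Level {level}:", [barriers[i] for i in top])
    remaining -= set(top)
    level += 1
```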
Procedia PDF Downloads 171
6899 An Insight into the Conformational Dynamics of Glycan through Molecular Dynamics Simulation
Authors: K. Veluraja
Abstract:
Glycans of glycolipids and glycoproteins play a significant role in living systems, particularly in molecular recognition processes. Molecular recognition processes are attributed to their occurrence on the surface of the cell, to the sequential arrangement and type of sugar molecules present in the oligosaccharide structure, and to glycosidic linkage diversity (glycoinformatics) and conformational diversity (glycoconformatics). Molecular dynamics simulation is a theoretical-cum-computational tool successfully utilized to establish the glycoconformatics of glycans. Studies on various oligosaccharides clearly indicate that oligosaccharides exist in multiple conformational states, and these conformational states arise due to the flexibility associated with the glycosidic torsional angles (φ,ψ). As an example, the single disaccharide structure NeuNAcα(2-3)Gal exists in three different conformational states due to differences in the preferential values of the glycosidic torsional angles (φ,ψ). Hence, establishing three-dimensional structural and conformational models for glycans (Cartesian coordinates of every individual atom of an oligosaccharide structure in a preferred conformation) is quite crucial to understanding various molecular recognition processes such as glycan-toxin interaction and glycan-virus interaction. The glycoconformatics models obtained for various glycans through molecular dynamics simulation are stored in 3DSDSCAR (3DSDSCAR.ORG), a public-domain database, and its utility value in understanding molecular recognition processes and in drug design ventures will be discussed.
Keywords: glycan, glycoconformatics, molecular dynamics simulation, oligosaccharide
Procedia PDF Downloads 140
6898 Continuous Differential Evolution Based Parameter Estimation Framework for Signal Models
Authors: Ammara Mehmood, Aneela Zameer, Muhammad Asif Zahoor Raja, Muhammad Faisal Fateh
Abstract:
In this work, the strength of a bio-inspired computational intelligence technique is exploited for parameter estimation of periodic signals using Continuous Differential Evolution (CDE), with an error function defined in the mean-square sense. The multidimensional and nonlinear nature of the problem arising in sinusoidal signal models, together with noise, makes it a challenging optimization task, which is handled by the robustness and effectiveness of CDE to ensure convergence and avoid trapping in local minima. In the proposed scheme of Continuous Differential Evolution based Signal Parameter Estimation (CDESPE), the unknown adjustable weights of the signal system identification model are optimized using the CDE algorithm. The performance of the CDESPE model is validated through various statistics-based performance indices over a sufficiently large number of runs, in terms of estimation error, mean squared error and Theil's inequality coefficient. The efficacy of CDESPE is examined by comparison with the actual parameters of the system, with Genetic Algorithm based outcomes, and with various deterministic approaches at different signal-to-noise ratio (SNR) levels.
Keywords: parameter estimation, bio-inspired computing, continuous differential evolution (CDE), periodic signals
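As a minimal sketch of the underlying idea (standard differential evolution with a mean-square error cost, rather than the paper's CDE variant), the example below recovers the amplitude, frequency and phase of a noisy sinusoid; the signal settings and bounds are assumed values.

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 500)
true = dict(amp=1.5, freq=7.0, phase=0.8)                    # assumed "unknown" parameters
signal = true["amp"] * np.sin(2 * np.pi * true["freq"] * t + true["phase"])
observed = signal + rng.normal(0.0, 0.2, t.size)             # noisy measurement

def mse(params):
    """Mean-square error between the candidate sinusoid and the observed signal."""
    amp, freq, phase = params
    model = amp * np.sin(2 * np.pi * freq * t + phase)
    return np.mean((observed - model) ** 2)

bounds = [(0.1, 5.0), (0.5, 20.0), (0.0, 2 * np.pi)]         # amplitude, frequency, phase
result = differential_evolution(mse, bounds, seed=1, tol=1e-8)

print("estimated (amp, freq, phase):", np.round(result.x, 3))
print("mean squared error at optimum:", result.fun)
```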
Procedia PDF Downloads 304
6897 Checking Energy Efficiency by Simulation Tools: The Case of Algerian Ksourian Models
Authors: Khadidja Rahmani, Nahla Bouaziz
Abstract:
Algeria is known for its rich heritage. It owns an immense historical heritage with a universal reputation. Unfortunately, this wealth is withering because of abandonment. This research focuses on the Ksourian model, which constitutes a large portion of this wealth. In fact, the Ksourian model is not just a witness to a great part of history or to a vernacular culture; it also includes a panoply of assets in terms of energy efficiency. In this context, the purpose of our work is to evaluate the performance of the old techniques derived from the Ksourian model, using simulation tools. The proposed method is decomposed into two steps. The first consists of isolating each device, reintroducing it into a basic model, and then running a series of simulations on the resulting models, in order to test the contribution of each of these dialectal processes. At another scale of development, the second step consists of aggregating all these processes in an aboriginal model and restarting the simulation, to see what this mosaic yields in environmental and energy terms. The model chosen for this study is one of the ksar units of Knadsa city of Bechar (Algeria). This study not only shows the ingenuity of our ancestors' know-how and their ability to adapt to the aridity of the climate, but also proves that their designs align with current concerns of energy efficiency and respond to the requirements of sustainable development.
Keywords: dialectal processes, energy efficiency, evaluation, Ksourian model, simulation tools
Procedia PDF Downloads 200
6896 Reverse Supply Chain Analysis of Lithium-Ion Batteries Considering Economic and Environmental Aspects
Authors: Aravind G., Arshinder Kaur, Pushpavanam S.
Abstract:
There is a strong emphasis on shifting to electric vehicles (EVs) throughout the globe to reduce the impact on global warming, following the Paris climate accord. Lithium-ion batteries (LIBs) are predominantly used in EVs, and these can be a significant threat to the environment if not disposed of safely. Lithium is also a valuable resource that is not widely available. Several research groups are working on developing an efficient recycling process for LIBs. Two routes, pyrometallurgical and hydrometallurgical processes, have been proposed for recycling LIBs. In this paper, we focus on life cycle assessment (LCA) as a tool to quantify the environmental impact of these recycling processes. We have defined the boundary of the LCA to include only the recycling phase of the end-of-life (EoL) of the battery life cycle. The analysis is done assuming ideal conditions for the hydrometallurgical and a combined hydrometallurgical and pyrometallurgical process in the inventory analysis. The CML-IA method is used to quantify the impact assessment across eleven indicators. Our results show that the cathode, anode, and foil contribute significantly to the impact. The environmental impacts of both the hydrometallurgical and the combined recycling processes are similar across all the indicators. Further, the results of the LCA are used in developing a multi-objective optimization model for the design of a lithium-ion battery recycling network. Greenhouse gas emissions and cost are the two objectives minimized in the optimization study.
Keywords: life cycle assessment, lithium-ion battery recycling, multi-objective optimization, network design, reverse supply chain
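A toy sketch of the weighted-sum approach to such a two-objective (cost versus greenhouse gas) network-design problem is given below; the two candidate plants, their cost and emission coefficients, and the demand figure are invented for illustration and do not come from the paper.

```python
from scipy.optimize import linprog

cost_per_t = [420.0, 510.0]        # USD/tonne at plant A and plant B (assumed)
ghg_per_t = [1.8, 1.2]             # t CO2-eq per tonne processed (assumed)
capacity = [600.0, 900.0]          # tonnes/year per plant (assumed)
demand = 1000.0                    # tonnes/year of end-of-life batteries to treat

def solve(weight_cost: float, weight_ghg: float):
    """Minimise weight_cost*cost + weight_ghg*GHG subject to capacity and demand."""
    c = [weight_cost * c_ + weight_ghg * g_ for c_, g_ in zip(cost_per_t, ghg_per_t)]
    res = linprog(
        c,
        A_eq=[[1.0, 1.0]], b_eq=[demand],                  # treat everything collected
        bounds=[(0.0, capacity[0]), (0.0, capacity[1])],
        method="highs",
    )
    return res.x

for w in (1.0, 0.5, 0.0):          # sweep the trade-off between cost and emissions
    x = solve(w, 1.0 - w)
    cost = sum(c * xi for c, xi in zip(cost_per_t, x))
    ghg = sum(g * xi for g, xi in zip(ghg_per_t, x))
    print(f"w_cost={w:.1f}: allocation={x.round(1)}, cost={cost:.0f} USD, GHG={ghg:.0f} t")
```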
Procedia PDF Downloads 162
6895 Disaggregation the Daily Rainfall Dataset into Sub-Daily Resolution in the Temperate Oceanic Climate Region
Authors: Mohammad Bakhshi, Firas Al Janabi
Abstract:
High-resolution rainfall data are very important as input for hydrological models. Among the models for generating high-resolution rainfall data, temporal disaggregation was chosen for this study. The paper attempts to generate rainfall at three different resolutions (4-hourly, hourly and 10-minute) from daily data for a record period of around 20 years. The process was carried out with the DiMoN tool, which is based on the random cascade model and the method of fragments. Differences between the observed and simulated rainfall datasets are evaluated with a variety of statistical and empirical methods: the Kolmogorov-Smirnov (K-S) test, usual statistics, and exceedance probability. The tool worked well at preserving the daily rainfall values on wet days; however, the generated rainfall is concentrated in shorter time periods and produces stronger storms. It is demonstrated that the difference between the generated and observed cumulative distribution function curves of the 4-hourly datasets passes the K-S test criteria, while for the hourly and 10-minute datasets the p-value has to be employed to show that their differences are reasonable. The results are encouraging considering the overestimation of the generated high-resolution rainfall data.
Keywords: DiMoN Tool, disaggregation, exceedance probability, Kolmogorov-Smirnov test, rainfall
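A small sketch of how such a comparison can be run is shown below: a two-sample Kolmogorov-Smirnov test and empirical exceedance probabilities are computed for observed versus disaggregated 4-hourly depths. The two series are synthetic stand-ins, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
observed_4h = rng.gamma(shape=0.6, scale=3.0, size=2000)     # 4-hourly wet-period depths (mm)
simulated_4h = rng.gamma(shape=0.55, scale=3.3, size=2000)   # output of a disaggregation tool

ks_stat, p_value = stats.ks_2samp(observed_4h, simulated_4h)
print(f"K-S statistic = {ks_stat:.3f}, p-value = {p_value:.3f}")

def exceedance_probability(series, threshold):
    """Empirical probability that a rainfall depth exceeds the given threshold."""
    return np.mean(np.asarray(series) > threshold)

for thr in (5.0, 10.0, 20.0):
    print(f"P(depth > {thr} mm): observed {exceedance_probability(observed_4h, thr):.3f}, "
          f"simulated {exceedance_probability(simulated_4h, thr):.3f}")
```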
Procedia PDF Downloads 205
6894 Listening to the Voices of Syrian Refugee Women in Canada: An Ethnographic Insight into the Journey from Trauma to Adaptation
Authors: Areej Al-Hamad, Cheryl Forchuk, Abe Oudshoorn, Gerald Patrick Mckinley
Abstract:
Syrian refugee women face many obstacles, influenced by various cultural, structural, and practical factors, when accessing health services in host countries. This paper is based on critical ethnographic research undertaken in Canada to explore Syrian refugee women's migration experiences. We also aim to critically examine how the intersection of gender, trauma, violence and the political and economic conditions of Syrian refugee women shapes their everyday lives and health. The study also investigates the strategies and practices by which Syrian refugee women are currently addressing their healthcare needs and the models of care that are suggested for meeting their physical and mental health needs. Findings show that these women experienced constant worries, hardship, vulnerability, and intrusions on their dignity. These experiences and challenges were aggravated by the structure of the Canadian social and health care system. This study offers a better understanding of the impact of migration and trauma on Syrian refugee women's roles, responsibilities, gender dynamics, and interactions with Ontario's healthcare system, with the aim of improving those interactions and their outcomes. Health care models should address these challenges among Syrian refugee families in Canada.
Keywords: Syrian refugee women, intersectionality, critical ethnography, migration
Procedia PDF Downloads 97
6893 Predisposition of Small Scale Businesses in Fagge, Kano State, Nigeria, Towards Profit and Loss Sharing Mode of Finance
Authors: Farida, M. Shehu, Shehu U. R. Aliyu
Abstract:
Access to finance has been recognized in the literature as one of the major impediments confronting small scale businesses (SSBs). This largely arises due to high lending rates, religious inclinations, collateral requirements, etc. The Islamic mode of finance operates under a Profit and Loss Sharing (PLS) arrangement between a borrower (business owner) and a lender (Islamic bank). This paper empirically assesses the determinants of the predisposition of small scale business operators in Fagge local government area, Kano State, Nigeria, towards PLS. Cross-sectional data from a sample of 291 small scale business operators were analyzed using logit and probit regression models. Empirical results reveal that while awareness and religious inclination positively drive interest towards PLS, lending rates and collateral work against it. The paper, therefore, strongly recommends more advocacy campaigns and the setting up of more Islamic banks in the country to cater for the financing and religious needs of SSBs in the study area.
Keywords: Islamic finance, logit and probit models, profit and loss sharing, small scale businesses, finance, commerce
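A brief sketch of fitting logit and probit models of this kind is given below, using synthetic survey data; the variable names, sample construction and coefficients are assumptions made purely for illustration.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 291
awareness = rng.integers(0, 2, n)            # aware of PLS products (0/1)
religiosity = rng.integers(0, 2, n)          # strong religious inclination (0/1)
lending_rate = rng.normal(20, 5, n)          # perceived conventional lending rate (%)
collateral = rng.integers(0, 2, n)           # collateral demanded by banks (0/1)

# Assumed data-generating rule for the binary outcome "predisposed towards PLS".
latent = -1.0 + 1.2 * awareness + 0.9 * religiosity - 0.06 * lending_rate - 0.7 * collateral
predisposed = (latent + rng.logistic(size=n) > 0).astype(int)

X = sm.add_constant(np.column_stack([awareness, religiosity, lending_rate, collateral]))
names = ["const", "awareness", "religiosity", "lending_rate", "collateral"]

logit_res = sm.Logit(predisposed, X).fit(disp=False)
probit_res = sm.Probit(predisposed, X).fit(disp=False)

for name, b_logit, b_probit in zip(names, logit_res.params, probit_res.params):
    print(f"{name:>12}: logit {b_logit:+.3f}   probit {b_probit:+.3f}")
```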
Procedia PDF Downloads 376
6892 Identification of Switched Reluctance Motor Parameters Using Exponential Swept-Sine Signal
Authors: Abdelmalek Ouannou, Adil Brouri, Laila Kadi, Tarik
Abstract:
The switched reluctance motor (SRM) is of major interest in many domains, such as electric vehicle drives, because of its wide speed range, high performance, low cost, and robustness when running under degraded conditions. The purpose of the paper is to develop a new analytical approach for modeling SRM parameters. An identification scheme is then proposed to obtain the SRM parameters. Since the SRM exhibits highly nonlinear behavior, modeling these devices is difficult, so it is important to develop an accurate model describing the SRM. Furthermore, the SRM is always operated in the magnetically saturated mode to maximize the energy transfer. Accordingly, it is shown that the SRM can be accurately described by a generalized polynomial Hammerstein model, i.e., the parallel connection of several Hammerstein models having polynomial nonlinearity. An analytical identification method is then developed using a chirp excitation signal. Afterward, the parameters of the obtained model have been determined using Finite Element Method analysis. Finally, in order to show the effectiveness of the proposed method, a comparison between the true and estimated models has been performed. The obtained results show that the output responses are very close.
Keywords: switched reluctance motor, swept-sine signal, generalized Hammerstein model, nonlinear system
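The sketch below illustrates the general flavour of this kind of identification: a logarithmic (exponential) swept-sine excites a simple polynomial Hammerstein model, and the polynomial coefficients are recovered by least squares. The "true" coefficients and the linear dynamics are assumptions, and the linear filter is treated as known purely to keep the example short, which the paper does not assume.

```python
import numpy as np
from scipy import signal

fs = 10_000.0
t = np.arange(0, 2.0, 1.0 / fs)
u = signal.chirp(t, f0=10.0, t1=2.0, f1=1_000.0, method="logarithmic")  # swept-sine input

# Hammerstein structure: static polynomial nonlinearity followed by a linear filter.
true_coeffs = [0.0, 1.0, 0.4, -0.2]                      # c0 + c1*u + c2*u^2 + c3*u^3
b, a = signal.butter(2, 0.1)                             # assumed linear dynamics
y = signal.lfilter(b, a, np.polynomial.polynomial.polyval(u, true_coeffs))
y += np.random.default_rng(0).normal(0, 1e-3, y.size)    # measurement noise

# Identification: regress the output on filtered powers of the input.
regressors = np.column_stack([signal.lfilter(b, a, u**k) for k in range(4)])
est, *_ = np.linalg.lstsq(regressors, y, rcond=None)
print("true polynomial coefficients:     ", true_coeffs)
print("estimated polynomial coefficients:", np.round(est, 3))
```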
Procedia PDF Downloads 241
6891 Convergence and Stability in Federated Learning with Adaptive Differential Privacy Preservation
Authors: Rizwan Rizwan
Abstract:
This paper provides an overview of Federated Learning (FL) and its application in enhancing data security, privacy, and efficiency. FL utilizes three distinct architectures to ensure privacy is never compromised. It involves training models on individual edge devices and aggregating them on a server without sharing raw data. This approach not only provides secure models without data sharing but also offers a highly efficient privacy-preserving solution with improved security and data access. We also discuss various frameworks used in FL and its integration with machine learning, deep learning, and data mining. In order to address the challenges of multi-party collaborative modeling scenarios, we briefly review an FL scheme combined with an adaptive gradient descent strategy and a differential privacy mechanism. The adaptive learning rate algorithm adjusts the gradient descent process to avoid issues such as model overfitting and fluctuations, thereby enhancing modeling efficiency and performance in multi-party computation scenarios. Additionally, to cater to ultra-large-scale distributed secure computing, the research introduces a differential privacy mechanism that defends against various background knowledge attacks.
Keywords: federated learning, differential privacy, gradient descent strategy, convergence, stability, threats
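A minimal sketch of the combination described above, assuming a toy linear-regression task, is given below: each client clips its update, Gaussian noise is added for differential privacy, the server performs federated averaging, and the learning rate decays adaptively across rounds. The noise scale, clipping bound and client data are invented values, not the paper's scheme.

```python
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(5):                                          # 5 edge devices with local data
    X = rng.normal(size=(200, 2))
    y = X @ true_w + rng.normal(0, 0.1, 200)
    clients.append((X, y))

clip, noise_sigma = 1.0, 0.05                               # DP clipping bound and noise scale
w = np.zeros(2)
lr = 0.5

for rnd in range(30):
    updates = []
    for X, y in clients:
        grad = 2 * X.T @ (X @ w - y) / len(y)               # local gradient of squared loss
        update = -lr * grad
        norm = np.linalg.norm(update)
        if norm > clip:                                      # clip each client's contribution
            update *= clip / norm
        updates.append(update + rng.normal(0, noise_sigma, 2))  # add Gaussian DP noise
    w = w + np.mean(updates, axis=0)                         # server-side federated averaging
    lr = 0.5 / (1 + 0.1 * rnd)                               # adaptive (decaying) learning rate

print("global model after federated rounds:", np.round(w, 3))
```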
Procedia PDF Downloads 38
6890 Determinants of International Volatility Passthroughs of Agricultural Commodities: A Panel Analysis of Developing Countries
Authors: Tetsuji Tanaka, Jin Guo
Abstract:
The extant literature has not succeeded in uncovering the common determinants of price volatility transmissions of agricultural commodities from international to local markets and, further, has rarely investigated the role of self-sufficiency measures in the context of national food security. We analyzed various factors to determine the degree of price volatility transmission of wheat, rice, and maize between world and domestic markets, using GARCH models with dynamic conditional correlation (DCC) specifications and panel feasible generalized least squares models. We found that the grain autarky system has the potential to diminish volatility pass-throughs for the three grain commodities. Furthermore, it was discovered that substitutive consumption behavior between maize and wheat buffers the volatility transmissions of both, but rice does not function as a transmission-relieving element for the volatility of either wheat or maize. The effectiveness of grain consumption substitution in insulating against pass-throughs from global markets is greater than that of cereal self-sufficiency. These implications are extremely beneficial for the governments of developing countries seeking to protect their domestic food markets from uncertainty abroad and, as such, to improve food security.
Keywords: food security, GARCH, grain self-sufficiency, volatility transmission
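As a rough, simplified illustration of the volatility-transmission idea (univariate GARCH(1,1) fits and a plain correlation of standardized residuals, rather than the DCC-GARCH and panel FGLS estimation used in the paper), a sketch using the arch package on simulated return series might look as follows; all series and parameters are synthetic.

```python
import numpy as np
from arch import arch_model

rng = np.random.default_rng(0)
n = 1000
world = rng.normal(0, 1.0, n)                         # world wheat returns (%, synthetic)
domestic = 0.6 * world + rng.normal(0, 0.8, n)        # domestic returns partly driven by world

def standardized_residuals(returns):
    """Fit GARCH(1,1) and return residuals scaled by the conditional volatility."""
    res = arch_model(returns, vol="Garch", p=1, q=1, mean="Constant").fit(disp="off")
    return res.resid / res.conditional_volatility

z_world = standardized_residuals(world)
z_domestic = standardized_residuals(domestic)

corr = np.corrcoef(z_world, z_domestic)[0, 1]
print(f"correlation of standardized residuals (transmission proxy): {corr:.3f}")
```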
Procedia PDF Downloads 160
6889 Evaluation of UI for 3D Visualization-Based Building Information Applications
Authors: Monisha Pattanaik
Abstract:
In scenarios where users have to work with large hierarchical data structures combined with visualizations (for example, construction 3D models, manufacturing equipment models, Gantt charts, building plans), the data structures are dense, consisting of multiple parent nodes up to 50 levels deep together with their siblings and descendants, and therefore convey an immediate feeling of complexity. With customers moving to consumer-grade enterprise software, it is crucial to make sophisticated features available on touch devices and smaller screen sizes. This paper evaluates a UI component that allows users to scroll through all of the deeply nested levels using a slider overlay on top of the hierarchy table, performing several actions to focus on one set of objects at any point in time. This overlay component also solves the problem of excessive horizontal scrolling of the entire hierarchical table within a fixed pane. The component can be customized to navigate through parents only, through siblings only, or through a specific part of the hierarchy. The evaluation of the UI component was carried out by end users of the application and Human-Computer Interaction (HCI) experts to test the component's usability, yielding statistical results and recommendations for handling complex hierarchical data visualizations.
Keywords: building information modeling, digital twin, navigation, UI component, user interface, usability, visualization
Procedia PDF Downloads 142
6888 Advances in Mathematical Sciences: Unveiling the Power of Data Analytics
Authors: Zahid Ullah, Atlas Khan
Abstract:
The rapid advancements in data collection, storage, and processing capabilities have led to an explosion of data in various domains. In this era of big data, mathematical sciences play a crucial role in uncovering valuable insights and driving informed decision-making through data analytics. The purpose of this abstract is to present the latest advances in mathematical sciences and their application in harnessing the power of data analytics. This abstract highlights the interdisciplinary nature of data analytics, showcasing how mathematics intersects with statistics, computer science, and other related fields to develop cutting-edge methodologies. It explores key mathematical techniques such as optimization, mathematical modeling, network analysis, and computational algorithms that underpin effective data analysis and interpretation. The abstract emphasizes the role of mathematical sciences in addressing real-world challenges across different sectors, including finance, healthcare, engineering, social sciences, and beyond. It showcases how mathematical models and statistical methods extract meaningful insights from complex datasets, facilitating evidence-based decision-making and driving innovation. Furthermore, the abstract emphasizes the importance of collaboration and knowledge exchange among researchers, practitioners, and industry professionals. It recognizes the value of interdisciplinary collaborations and the need to bridge the gap between academia and industry to ensure the practical application of mathematical advancements in data analytics. The abstract highlights the significance of ongoing research in mathematical sciences and its impact on data analytics. It emphasizes the need for continued exploration and innovation in mathematical methodologies to tackle emerging challenges in the era of big data and digital transformation. In summary, this abstract sheds light on the advances in mathematical sciences and their pivotal role in unveiling the power of data analytics. It calls for interdisciplinary collaboration, knowledge exchange, and ongoing research to further unlock the potential of mathematical methodologies in addressing complex problems and driving data-driven decision-making in various domains.
Keywords: mathematical sciences, data analytics, advances, unveiling
Procedia PDF Downloads 97
6887 Study of the Influence of Eccentricity Due to Configuration and Materials on Seismic Response of a Typical Building
Authors: A. Latif Karimi, M. K. Shrimali
Abstract:
Seismic design is a critical stage in the process of design and construction of a building. It includes strategies for designing earthquake-resistant buildings to ensure the health, safety, and security of the building occupants and assets. Hence, it becomes very important to understand the behavior of structural members precisely in order to construct buildings that can yield a better response to seismic forces. This paper investigates the behavior of a typical structure when subjected to ground motion. The corresponding mode shapes and modal frequencies are studied to interpret the response of an actual structure using different fabricated models and 3D visual models. In this study, three different structural configurations are subjected to horizontal ground motion, and the effects of "stiffness eccentricity" and the placement of infill walls are checked to determine how each parameter contributes to a building's response to dynamic forces. The deformation data from lab experiments and the analysis in the SAP2000 software are reviewed to obtain the results. This study revealed that the seismic response of a building can be improved by introducing higher deformation capacity in the building. Also, proper design of infill walls and maintaining a symmetrical configuration in a building are the key factors for building stability during an earthquake.
Keywords: eccentricity, seismic response, mode shape, building configuration, building dynamics
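For readers unfamiliar with how mode shapes and modal frequencies are obtained, the sketch below solves the generalized eigenvalue problem for an idealized three-storey shear building; the storey masses and stiffnesses are assumed values, and a stiffness eccentricity would enter as asymmetry in the stiffness terms of a full 3D model.

```python
import numpy as np
from scipy.linalg import eigh

m = 2.0e5                                  # storey mass (kg), assumed
k = 2.5e7                                  # storey lateral stiffness (N/m), assumed

M = np.diag([m, m, m])
K = np.array([[2 * k, -k,     0.0],
              [-k,     2 * k, -k],
              [0.0,   -k,      k]])

# Solve K·phi = omega^2 · M·phi for natural frequencies and mode shapes.
eigvals, eigvecs = eigh(K, M)
omegas = np.sqrt(eigvals)                  # natural circular frequencies (rad/s)
freqs_hz = omegas / (2 * np.pi)

for i, (f, phi) in enumerate(zip(freqs_hz, eigvecs.T), start=1):
    phi = phi / np.max(np.abs(phi))        # normalize mode shape to unit peak
    print(f"Mode {i}: f = {f:.2f} Hz, shape = {np.round(phi, 2)}")
```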
Procedia PDF Downloads 202
6886 Loan Repayment Prediction Using Machine Learning: Model Development, Django Web Integration and Cloud Deployment
Authors: Seun Mayowa Sunday
Abstract:
Loan prediction is one of the most significant and recognised fields of research in the banking, insurance, and financial security industries. Some prediction systems on the market are built as static software. However, because static software only operates with strictly regulated rules, it cannot aid customers beyond these limitations. The application of machine learning (ML) techniques is required for loan prediction. Four separate machine learning models, random forest (RF), decision tree (DT), k-nearest neighbour (KNN), and logistic regression, are used to create the loan prediction model. Using the Anaconda Navigator and the required machine learning (ML) libraries, the models are created and evaluated using the appropriate measuring metrics. From the findings, the random forest performs with the highest accuracy of 80.17%, and it was later implemented into the Django framework. For real-time testing, the web application is deployed on Alibaba Cloud, which is among the four biggest cloud computing providers. Hence, to the best of our knowledge, this research serves as the first academic paper that combines the model development with the Django framework and deployment on the Alibaba cloud computing platform.
Keywords: k-nearest neighbor, random forest, logistic regression, decision tree, django, cloud computing, alibaba cloud
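A compact sketch of the model-development step is shown below on synthetic applicant data; the feature names, labelling rule and any accuracy obtained are assumptions for illustration and are unrelated to the paper's dataset or its reported 80.17% accuracy.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 1000
income = rng.normal(50_000, 15_000, n)
loan_amount = rng.normal(120_000, 40_000, n)
credit_history = rng.integers(0, 2, n)             # 1 = good prior repayment record
dependents = rng.integers(0, 4, n)

# Assumed rule generating repayment outcomes, plus 10% label noise.
repaid = ((income / loan_amount > 0.35) & (credit_history == 1)).astype(int)
repaid ^= (rng.random(n) < 0.1).astype(int)

X = np.column_stack([income, loan_amount, credit_history, dependents])
X_train, X_test, y_train, y_test = train_test_split(X, repaid, test_size=0.25, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("test accuracy:", round(accuracy_score(y_test, model.predict(X_test)), 4))
```

A model trained this way could then be serialized (for example with joblib) and loaded inside a Django view for real-time scoring, which corresponds to the integration step described above.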
Procedia PDF Downloads 144
6885 Price Prediction Line, Investment Signals and Limit Conditions Applied for the German Financial Market
Authors: Cristian Păuna
Abstract:
In the first decades of the 21st century, in the electronic trading environment, algorithmic capital investments became the primary tool to make a profit through speculation in financial markets. A significant number of traders, private and institutional investors, participate in the capital markets every day using automated algorithms. Autonomous trading software is today a considerable part of the business intelligence system of any modern financial activity. Trading decisions and orders are made automatically by computers using different mathematical models. This paper presents one of these models, called the Price Prediction Line. A mathematical algorithm is revealed to build a reliable trend line, which is the basis for limit conditions and automated investment signals, the core of a computerized investment system. The paper shows how to apply these tools to generate entry and exit investment signals, how to use limit conditions to build a mathematical filter for investment opportunities, and the methodology to integrate all of these into automated investment software. The paper also presents trading results obtained for the leading German financial market index with the presented methods, in order to analyze and compare different automated investment algorithms. It was found that a specific mathematical algorithm can be optimized and integrated into an automated trading system with good and sustained results for the leading German market. Investment results are compared in order to qualify the presented model. In conclusion, a 1:6.12 risk-to-reward ratio was obtained by applying the trigonometric method to the DAX Deutscher Aktienindex over a 24-month investment period. These results are superior to those obtained with other similar models, as this paper reveals. The general idea sustained by this paper is that the Price Prediction Line model presented is a reliable capital investment methodology that can be successfully applied to build an automated investment system with excellent results.
Keywords: algorithmic trading, automated trading systems, high-frequency trading, DAX Deutscher Aktienindex
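A toy sketch of the general idea of a prediction/trend line with limit conditions is given below: a least-squares line over a rolling window plus fixed entry and exit gaps. The window length, thresholds and the simulated price series are assumptions and not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(7)
closes = 13_000 + np.cumsum(rng.normal(5, 40, 300))      # synthetic DAX-like daily closes

window = 50
entry_gap = 80.0                                         # points below the trend line to buy
exit_gap = 120.0                                         # points above the trend line to sell

signals = []
for t in range(window, len(closes)):
    x = np.arange(window)
    slope, intercept = np.polyfit(x, closes[t - window:t], 1)
    trend_today = slope * window + intercept             # trend line extrapolated to today
    price = closes[t]
    if slope > 0 and price < trend_today - entry_gap:    # limit condition for entries
        signals.append((t, "BUY", round(price, 1)))
    elif price > trend_today + exit_gap:                 # limit condition for exits
        signals.append((t, "SELL", round(price, 1)))

print(f"{len(signals)} signals generated; first few:", signals[:5])
```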
Procedia PDF Downloads 134
6884 Large Scale Method to Assess the Seismic Vulnerability of Heritage Buildings: Modal Updating of Numerical Models and Vulnerability Curves
Authors: Claire Limoge Schraen, Philippe Gueguen, Cedric Giry, Cedric Desprez, Frédéric Ragueneau
Abstract:
The Mediterranean area is characterized by numerous monumental or vernacular masonry structures illustrating old ways of building and living. Those precious buildings are often poorly documented, present complex shapes and loadings, and are protected by the states, leading to legal constraints. This area also presents moderate to high seismic activity. Even moderate earthquakes can be magnified by local site effects and cause collapse or significant damage. Moreover, the structural resistance of masonry buildings, especially the less famous ones or those located in rural zones, has generally been lowered by many factors: poor maintenance, unsuitable restoration, ambient pollution, previous earthquakes. Recent earthquakes prove that any damage to these architectural witnesses to our past is irreversible, leading to the necessity of acting preventively. This means providing preventive assessments for hundreds of structures with no or few documents. In this context, we propose a general method, based on hierarchized numerical models, to provide preliminary structural diagnoses at a regional scale, indicating whether more precise investigations and models are necessary for each building. To this aim, we adapt different tools, some being developed, such as photogrammetry, and some to be created, such as a preprocessor that builds meshes for FEM software starting from pictures, in order to allow dynamic studies of the buildings in the panel. We made an inventory of 198 baroque chapels and churches situated in the French Alps. Their structural characteristics were then determined through field surveys and the MicMac photogrammetric software. Using structural criteria, we determined eight types of churches and seven types of chapels. We studied their dynamic behavior with CAST3M, using the EC8 spectrum and accelerograms of the studied zone. This allowed us to quantify the effect of the needed simplifications in the most sensitive zones and to choose the most effective ones. We also proposed threshold criteria based on the damage observed in the in situ surveys, old pictures and the Italian code; these criteria are relevant for linear models. To validate the structural types, we carried out a vibration measurement campaign using ambient vibration noise and velocimeters. It also allowed us to validate this method on old masonry and to identify the modal characteristics of 20 churches. We then proceeded to a dynamic identification between numerical and experimental modes, and we updated the linear models through material and geometrical parameters, which are often unknown because of the complexity of the structures and materials. The numerically optimized values were verified against the measurements we made on the masonry components in situ and in the laboratory. We are now working on non-linear models that redistribute the strains, in order to validate the damage threshold criteria used to compute the vulnerability curves of each defined structural type. Our current results show a good correlation between experimental and numerical data, validating the final modeling simplifications and the global method. We now plan to use non-linear analysis in the critical zones in order to test reinforcement solutions.
Keywords: heritage structures, masonry numerical modeling, seismic vulnerability assessment, vibratory measure
Procedia PDF Downloads 497
6883 Client Hacked Server
Authors: Bagul Abhijeet
Abstract:
Background: The client-server model is the backbone of today's internet communication, in which a normal user cannot have control over a particular website or server. By using the same processing model, however, one can gain unauthorized access to a particular server. In this paper, we discuss an application scenario of hacking a simple website or server, consisting of an unauthorized way to access the server database. This application autonomously takes direct access to a simple website or server and retrieves all essential information maintained by the administrator. In this system, the IP address of the server is given as input to retrieve the user-id and password of the server. This breaks the administrative security of the server and acquires control of the server database, whereas a virus helps to escape from server security by crashing the whole server. Objective: To control malicious attacks, protect government websites, and uncover illegal hacker activity. Results: After implementing different hacking as well as non-hacking techniques, this system hacks simple websites with normal security credentials. It provides access to the server database and allows the attacker to perform database operations from the client machine. The figure above shows the experimental results of this application on different servers and provides satisfactory results as required. Conclusion: In this paper, we have presented a view of hacking the server that includes both hacking and non-hacking methods. These algorithms and methods provide an efficient way to hack a server database. Breaking network security in this way allows the introduction of new and better security frameworks. The term "hacking" should not only be considered in terms of its illegal activities but should also be used to strengthen our global network.
Keywords: Hacking, Vulnerabilities, Dummy request, Virus, Server monitoring
Procedia PDF Downloads 256
6882 Solid Particles Transport and Deposition Prediction in a Turbulent Impinging Jet Using the Lattice Boltzmann Method and a Probabilistic Model on GPU
Authors: Ali Abdul Kadhim, Fue Lien
Abstract:
Solid particle distribution on an impingement surface has been simulated utilizing a graphics processing unit (GPU). An in-house computational fluid dynamics (CFD) code has been developed to investigate a 3D turbulent impinging jet using the lattice Boltzmann method (LBM) in conjunction with large eddy simulation (LES) and the multiple relaxation time (MRT) models. This paper proposes an improvement to the LBM-cellular automata (LBM-CA) probabilistic method. In the current model, the fluid flow utilizes the D3Q19 lattice, while the particle model employs the D3Q27 lattice. The particle numbers are defined at the same regular LBM nodes, and the transport of particles from one node to its neighboring nodes is determined in accordance with the particle bulk density and velocity, taking into account all the external forces. The previous models distribute particles at each time step without considering the local velocity and the number of particles at each node. The present model overcomes the deficiencies of the previous LBM-CA models and, therefore, can better capture the dynamic interaction between particles and the surrounding turbulent flow field. Despite the increasing popularity of the LBM-MRT-CA model in simulating complex multiphase fluid flows, this approach is still expensive in terms of the memory size and computational time required to perform 3D simulations. To improve the throughput of each simulation, a single GeForce GTX TITAN X GPU is used in the present work. The CUDA parallel programming platform and the CuRAND library are utilized to form an efficient LBM-CA algorithm. The methodology was first validated against a benchmark test case involving particle deposition on a square cylinder confined in a duct. The flow was unsteady and laminar at Re=200 (Re is the Reynolds number), and simulations were conducted for different Stokes numbers. The present LBM solutions agree well with other results available in the open literature. The GPU code was then used to simulate the particle transport and deposition in a turbulent impinging jet at Re=10,000. The simulations were conducted for L/D=2, 4 and 6, where L is the nozzle-to-surface distance and D is the jet diameter. The effect of changing the Stokes number on the particle deposition profile was studied at different L/D ratios. For comparative studies, another in-house serial CPU code was also developed, coupling LBM with the classical Lagrangian particle dispersion model. Agreement between the results obtained with the LBM-CA and LBM-Lagrangian models and the experimental data is generally good. The present GPU approach achieves a speedup ratio of about 350 against the serial code running on a single CPU.
Keywords: CUDA, GPU parallel programming, LES, lattice Boltzmann method, MRT, multi-phase flow, probabilistic model
Procedia PDF Downloads 210
6881 Improved Elastoplastic Bounding Surface Model for the Mathematical Modeling of Geomaterials
Authors: Andres Nieto-Leal, Victor N. Kaliakin, Tania P. Molina
Abstract:
The nature of most engineering materials is quite complex. It is, therefore, difficult to devise a general mathematical model that will cover all possible ranges and types of excitation and behavior of a given material. As a result, the development of mathematical models is based upon simplifying assumptions regarding material behavior. Such simplifications result in some material idealization; for example, one of the simplest material idealizations is to assume that the material behavior obeys elasticity. However, soils are nonhomogeneous, anisotropic, path-dependent materials that exhibit nonlinear stress-strain relationships, changes in volume under shear, dilatancy, as well as time-, rate- and temperature-dependent behavior. Over the years, many constitutive models, possessing different levels of sophistication, have been developed to simulate the behavior of geomaterials, particularly cohesive soils. Early in the development of constitutive models, it became evident that elastic or standard elastoplastic formulations, employing purely isotropic hardening and predicated on the existence of a yield surface surrounding a purely elastic domain, were incapable of realistically simulating the behavior of geomaterials. Accordingly, more sophisticated constitutive models have been developed; for example, bounding surface elastoplasticity. The essence of the bounding surface concept is the hypothesis that plastic deformations can occur for stress states either within or on the bounding surface. Thus, unlike classical yield surface elastoplasticity, the plastic states are not restricted only to those lying on a surface. Elastoplastic bounding surface models have been improved; however, there is still a need to improve their capabilities in simulating the response of anisotropically consolidated cohesive soils, especially the response in extension tests. Thus, in this work an improved constitutive model that can more accurately predict diverse stress-strain phenomena exhibited by cohesive soils was developed; in particular, an improved rotational hardening rule that better simulates the response of cohesive soils in extension. The generalized definition of the bounding surface model provides a convenient and elegant framework for unifying various previous versions of the model for anisotropically consolidated cohesive soils. The Generalized Bounding Surface Model for cohesive soils is a fully three-dimensional, time-dependent model that accounts for both inherent and stress-induced anisotropy employing a non-associative flow rule. The model's numerical implementation in a computer code followed an adaptive multistep integration scheme in conjunction with local iteration and radial return. The one-step trapezoidal rule was used to obtain the stiffness matrix that defines the relationship between the stress increment and the strain increment. After testing the model in simulating the response of cohesive soils through extensive comparisons of model simulations to experimental data, it has been shown to give quite good simulations. The new model successfully simulates the response of different cohesive soils, for example, Cardiff Kaolin, Spestone Kaolin, and Lower Cromer Till. The simulated undrained stress paths, stress-strain response, and excess pore pressures are in very good agreement with the experimental values, especially in extension.
Keywords: bounding surface elastoplasticity, cohesive soils, constitutive model, modeling of geomaterials
Procedia PDF Downloads 317
6880 Multi-Agent System Based Distributed Voltage Control in Distribution Systems
Authors: A. Arshad, M. Lehtonen. M. Humayun
Abstract:
With increasing Distributed Generation (DG) penetration, distribution systems are advancing towards smart grid technology to tackle the voltage control problem in a distributed manner with the least latency. This paper proposes a multi-agent based distributed voltage level control. In this method, a flat agent architecture is used, and the agents involved in the whole control procedure are the On-Load Tap Changer Agent (OLTCA), the Static VAR Compensator Agent (SVCA), and the agents associated with DGs and loads at their locations. The objectives of the proposed voltage control model are to minimize network losses and DG curtailments while maintaining the voltage within statutory limits, as close as possible to the nominal value. The total loss cost is the sum of the network loss cost, DG curtailment costs, and voltage damage cost (which is based on a penalty function implementation). The total cost is iteratively calculated for various stricter limits by plotting the voltage damage cost and loss cost against a varying voltage limit band. The method provides the optimal limits, closer to the nominal value, with minimum total loss cost. In order to achieve the objective of voltage control, the whole network is divided into multiple control regions, each downstream from a controlling device. The OLTCA behaves as a supervisory agent and performs all the optimizations. At each time step, a token is generated by the OLTCA and transferred from node to node until a node with a voltage violation is detected. Upon detection of such a node, the token grants permission to the Load Agent (LA) to initiate possible remedial actions. The LA contacts the respective controlling devices depending on the vicinity of the violated node. If the violated node does not lie in the vicinity of a controller, or the controlling capabilities of all the downstream control devices are at their limits, then the OLTC is considered as a last resort. For a realistic study, simulations are performed for a typical Finnish residential medium-voltage distribution system using MATLAB®. These simulations are executed for two cases: simple Distributed Voltage Control (DVC) and DVC with optimized loss cost (DVC + Penalty Function). A sensitivity analysis is performed based on DG penetration. The results indicate that the costs of losses and DG curtailments are directly proportional to DG penetration, while in case 2 there is a significant reduction in total loss. For lower DG penetration, losses are reduced by roughly 50%, while for higher DG penetration, the loss reduction is not very significant. Another observation is that the new, stricter limits calculated by cost optimization move towards the statutory limits of ±10% of the nominal value as DG penetration increases: for 25, 45 and 65% penetration, the calculated limits are ±5, ±6.25 and ±8.75%, respectively. The results show that the voltage control algorithm proposed in case 1 is able to deal with the voltage control problem instantly but with higher losses. In contrast, case 2 gradually reduces the network losses over time through the proposed iterative loss cost optimization performed by the OLTCA.
Keywords: distributed voltage control, distribution system, multi-agent systems, smart grids
Procedia PDF Downloads 315
6879 Environmental Effects on Energy Consumption of Smart Grid Consumers
Authors: S. M. Ali, A. Salam Khan, A. U. Khan, M. Tariq, M. S. Hussain, B. A. Abbasi, I. Hussain, U. Farid
Abstract:
The environment and surroundings play a pivotal role in structuring the lifestyle of consumers. Living standards, in turn, affect the energy consumption of consumers. In the smart grid paradigm, climatic drifts, weather parameters and the green environment relate directly to the energy profiles of the various consumers, such as residential, commercial and industrial. Considering the above factors helps policymakers shape utility load curves and optimally manage demand and supply. Thus, there is a pressing need to develop correlation models of load and weather parameters and to critically analyze the factors affecting the energy profiles of smart grid consumers. In this paper, we elaborate the various environmental and weather factors affecting consumer demand. Moreover, we develop correlation models, such as Pearson, Spearman, and Kendall, to capture the interrelation between the dependent (load) parameter and the independent (weather) parameters. Furthermore, we validate our discussion with real-time data from the state of Texas. The numerical simulations confirm the significant relationship between climatic drifts and the energy consumption of smart grid consumers.
Keywords: climatic drifts, correlation analysis, energy consumption, smart grid, weather parameter
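The correlation step can be sketched as follows, computing Pearson, Spearman and Kendall coefficients between a weather parameter (ambient temperature) and system load; the hourly series below are synthetic stand-ins for the real Texas data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
hours = 24 * 365
temperature = 20 + 12 * np.sin(2 * np.pi * np.arange(hours) / 24) + rng.normal(0, 2, hours)
# Cooling-dominated load: rises with temperature above a comfort band, plus noise.
load = 900 + 35 * np.clip(temperature - 22, 0, None) + rng.normal(0, 40, hours)

pearson_r, pearson_p = stats.pearsonr(temperature, load)
spearman_r, spearman_p = stats.spearmanr(temperature, load)
kendall_t, kendall_p = stats.kendalltau(temperature, load)

print(f"Pearson  r   = {pearson_r:.3f} (p = {pearson_p:.1e})")
print(f"Spearman rho = {spearman_r:.3f} (p = {spearman_p:.1e})")
print(f"Kendall  tau = {kendall_t:.3f} (p = {kendall_p:.1e})")
```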
Procedia PDF Downloads 379
6878 Analyzing the Commentator Network Within the French YouTube Environment
Authors: Kurt Maxwell Kusterer, Sylvain Mignot, Annick Vignes
Abstract:
To the best of our knowledge, YouTube is the largest video hosting platform in the world. A high number of creators, viewers, subscribers and commentators act in this specific ecosystem, which generates huge sums of money. Views, subscribers, and comments help to increase the popularity of content creators. The most popular creators are sponsored by brands and participate in marketing campaigns. For a few of them, this becomes a financially rewarding profession. This is made possible through the YouTube Partner Program, which shares revenue among creators based on their popularity. We believe that the role of comments in increasing popularity deserves particular emphasis. In what follows, YouTube is considered as a bipartite network between videos and commentators. Analyzing a detailed data set focused on French YouTubers, we consider each comment as a link between a commentator and a video. Our research question asks which features of a video give it the highest probability of being commented on. Following on from this question, how can we use these features to predict an agent's choice to comment on one video instead of another, considering the characteristics of the commentators, videos, topics, channels, and recommendations? We expect to see that the videos of more popular channels generate higher viewer engagement and are thus more frequently commented on. The interest lies in discovering features which have not classically been considered as markers of popularity on the platform. A quick view of our data set shows that 96% of the commentators comment only once on a given video. Thus, we study a non-weighted bipartite network between commentators and videos built on the sub-sample of 96% of unique comments. A link exists between two nodes when a commentator makes a comment on a video. We run an Exponential Random Graph Model (ERGM) approach to evaluate which characteristics influence the probability of commenting on a video. The creation of a link is explained in terms of common video features, such as duration, quality, number of likes, number of views, etc. Our data cover the period 2020-2021 and focus on the French YouTube environment. From this set of 391 588 videos, we extract the channels which can be monetized according to YouTube regulations (channels with at least 1000 subscribers and more than 4000 hours of viewing time during the last twelve months). In the end, we have a data set of 128 462 videos belonging to 4093 channels. Based on these videos, we have a data set of 1 032 771 unique commentators, with a mean of 2 comments per commentator, a minimum of 1 comment each, and a maximum of 584 comments.
Keywords: YouTube, social networks, economics, consumer behaviour
Procedia PDF Downloads 72