Search results for: information systems success model
31051 Prediction of Compressive Strength of Self-Compacting Concrete Containing Fly Ash Using a Fuzzy Logic Inference System
Authors: Belalia Douma Omar, Bakhta Boukhatem, Mohamed Ghrici
Abstract:
Self-compacting concrete (SCC), developed in Japan in the late 1980s, has enabled the construction industry to reduce demand on resources, improve working conditions and reduce environmental impact by eliminating the need for compaction. Fuzzy logic (FL) approaches have recently been used to model human expertise in many areas of civil engineering, and such models of experimental studies have produced very good results. In the present study, a model for predicting the compressive strength of SCC containing various proportions of fly ash, as partial replacement of cement, has been developed using an Adaptive Neuro-Fuzzy Inference System (ANFIS). To build this model, a database of experimental data was gathered from the literature and used for training and testing. The inputs of the fuzzy logic model are arranged in a format of five parameters: total binder content, fly ash replacement percentage, water content, superplasticizer and age of specimens. The training and testing results show that the fuzzy logic model has strong potential for predicting the compressive strength of SCC containing fly ash within the considered range.
Keywords: self-compacting concrete, fly ash, strength prediction, fuzzy logic
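The fuzzy inference idea behind ANFIS can be illustrated with a minimal Sugeno-style sketch. The membership functions, rule base and output strengths below are illustrative assumptions for two of the five inputs, not values from the paper's trained model:

```python
def tri(x, a, b, c):
    """Triangular membership: rises from a to b, falls from b to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def predict_strength(fly_ash_pct, age_days):
    """Zero-order Sugeno inference: weighted average of rule outputs (MPa)."""
    low_fa = tri(fly_ash_pct, -1, 0, 40)     # low fly-ash replacement
    high_fa = tri(fly_ash_pct, 20, 60, 100)  # high fly-ash replacement
    young = tri(age_days, -1, 3, 28)         # early-age specimen
    mature = tri(age_days, 14, 90, 400)      # well-cured specimen
    # Rule base (illustrative): (firing strength, consequent strength in MPa)
    rules = [
        (min(low_fa, mature), 60.0),
        (min(low_fa, young), 30.0),
        (min(high_fa, mature), 45.0),
        (min(high_fa, young), 15.0),
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

blended = predict_strength(30, 28)   # two rules fire equally -> midway output
```

In ANFIS the membership parameters and consequents would be tuned from the training database rather than fixed by hand as here.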
Procedia PDF Downloads 335
31050 Prediction of Gully Erosion with Stochastic Modeling by Using Geographic Information System and Remote Sensing Data in North of Iran
Authors: Reza Zakerinejad
Abstract:
Gully erosion is a serious problem that threatens the sustainability of agricultural areas, rangelands and water resources in a large part of Iran. This type of water erosion is the main source of sedimentation in many catchment areas in the north of the country. Since many national assessment approaches have applied only qualitative models, the aim of this study is to predict the spatial distribution of gully erosion processes by means of detailed terrain analysis and GIS-based logistic regression in the loess deposits of a case study area in the Golestan Province. In this study, a DEM with 25-meter resolution derived from ASTER data has been used, and Landsat ETM data have been used to map land use. The TreeNet model, a stochastic modeling approach, was applied to predict the areas susceptible to gully erosion, with the data split into separate training and validation sets. GIS and satellite image analysis techniques were used to derive the input information for the stochastic model. The result of this study is a highly accurate map of gully erosion potential.
Keywords: TreeNet model, terrain analysis, Golestan Province, Iran
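The GIS-based logistic-regression step can be sketched as a per-cell susceptibility score. The terrain attributes and coefficients below are illustrative assumptions, not fitted values from the study:

```python
import math

def susceptibility(features, weights, bias):
    """Logistic-regression gully-erosion susceptibility score in [0, 1]."""
    z = bias + sum(w * f for w, f in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))

# Illustrative per-cell attributes: [slope (deg), upslope area (log10 m^2), NDVI]
# Coefficients are assumptions; in the study they would be fitted to mapped gullies.
weights = [0.12, 0.8, -2.5]
bias = -4.0
steep_bare = susceptibility([18.0, 3.5, 0.1], weights, bias)
flat_vegetated = susceptibility([2.0, 2.0, 0.6], weights, bias)
```

Applied to every DEM cell, such scores yield the susceptibility map described in the abstract.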
Procedia PDF Downloads 535
31049 Economic Development Process: A Compartmental Analysis of a Model with Two Delays
Authors: Amadou Banda Ndione, Charles Awono Onana
Abstract:
In this paper, the compartmental approach is applied to build a macroeconomic model of interacting countries. We consider a total of N countries subdivided into three compartments according to their economic status: D(t) denotes the compartment of developing countries at time t, E(t) stands for the compartment of emerging countries at time t, while A(t) represents advanced countries at time t. The model describes the process of economic development and includes the notion of openness through collaborations between countries. Two delays appear in this model to describe the average time necessary for collaborations between countries to become effective for their development process. Our model represents the different stages of development. It further gives the conditions under which a country can change its economic status and demonstrates the short-term positive effect of openness on economic growth. In addition, we investigate bifurcation by considering the delay as a bifurcation parameter and examine the onset and termination of Hopf bifurcations from a positive equilibrium. Numerical simulations are provided to illustrate the theoretical results and to support the discussion.
Keywords: compartmental systems, delayed dynamical system, economic development, fiscal policy, Hopf bifurcation
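A delayed compartmental model of this D/E/A form can be simulated with forward Euler and a history buffer for the two lags. The transfer rates and delayed collaboration terms below are illustrative assumptions (the abstract does not list the equations):

```python
def simulate(D0=150.0, E0=40.0, A0=10.0, alpha=0.8, beta=0.6,
             tau1=2.0, tau2=5.0, dt=0.01, t_end=60.0):
    """D -> E -> A transitions driven by delayed collaboration terms.

    Assumed dynamics (a sketch, not the paper's system):
        D' = -alpha * E(t - tau1)/N * D
        E' =  alpha * E(t - tau1)/N * D - beta * A(t - tau2)/N * E
        A' =  beta * A(t - tau2)/N * E
    """
    n = int(t_end / dt)
    d1, d2 = int(tau1 / dt), int(tau2 / dt)
    N = D0 + E0 + A0
    D, E, A = [D0], [E0], [A0]
    for k in range(n):
        E_lag = E[k - d1] if k >= d1 else E0   # constant history before t = 0
        A_lag = A[k - d2] if k >= d2 else A0
        up1 = alpha * E_lag / N * D[k]         # developing -> emerging
        up2 = beta * A_lag / N * E[k]          # emerging -> advanced
        D.append(D[k] - dt * up1)
        E.append(E[k] + dt * (up1 - up2))
        A.append(A[k] + dt * up2)
    return D[-1], E[-1], A[-1]

D_end, E_end, A_end = simulate()
```

The compartmental structure conserves the total number of countries, which the test below checks; a bifurcation study would sweep tau1/tau2 and watch for sustained oscillations.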
Procedia PDF Downloads 137
31048 Agile Software Effort Estimation Using Regression Techniques
Authors: Mikiyas Adugna
Abstract:
Effort estimation is among the activities carried out in software development processes, and an accurate estimation model leads to project success. Agile effort estimation is a complex task because of the dynamic nature of software development, and researchers are still conducting studies on it to enhance prediction accuracy. For these reasons, we investigated and propose a model based on LASSO and Elastic Net regression to enhance estimation accuracy. The proposed model has four major components: preprocessing, train-test split, training with default parameters, and cross-validation. During the preprocessing phase, the entire dataset is normalized. After normalization, a train-test split is performed, with 80% of the dataset used for training and 20% for testing. Following the train-test split, the two algorithms (Elastic Net and LASSO regression) are trained in two phases. In the first phase, the two algorithms are trained using their default parameters and evaluated on the testing data. In the second phase, the grid search technique (used to tune and select optimum parameters) and 5-fold cross-validation are applied to obtain the final trained model, which is then evaluated on the testing set. The experimental work uses the agile story point dataset of 21 software projects collected from six firms. The results show that both Elastic Net and LASSO regression outperformed the compared models. Of the two, LASSO regression achieved better predictive performance, acquiring PRED(8%) and PRED(25%) results of 100.0, an MMRE of 0.0491, MMER of 0.0551, MdMRE of 0.0593, MdMER of 0.063, and MSE of 0.0007. The results imply that the LASSO-trained model is the most acceptable and offers higher estimation performance than models reported in the literature.
Keywords: agile software development, effort estimation, elastic net regression, LASSO
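LASSO's feature-selecting behavior can be sketched with a plain coordinate-descent implementation, a minimal stand-in for the library solvers presumably used in the study; the toy story-point data are assumptions:

```python
def soft_threshold(z, g):
    """Shrinkage operator at the heart of LASSO coordinate descent."""
    if z > g:
        return z - g
    if z < -g:
        return z + g
    return 0.0

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent LASSO: minimize 0.5*||y - Xw||^2 + lam*||w||_1."""
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # residual with feature j excluded
            r = [y[i] - sum(w[k] * X[i][k] for k in range(p) if k != j)
                 for i in range(n)]
            rho = sum(X[i][j] * r[i] for i in range(n))
            z = sum(X[i][j] ** 2 for i in range(n))
            w[j] = soft_threshold(rho, lam) / z
    return w

# Toy data: y depends on feature 0 only; feature 1 is irrelevant noise
X = [[1, 1], [2, -1], [3, 1], [4, -1]]
y = [2.0, 4.0, 6.0, 8.0]
w = lasso_cd(X, y, lam=0.1)       # recovers slope ~2, zeroes the noise feature
w_strong = lasso_cd(X, y, lam=100.0)  # penalty large enough to zero everything
```

The irrelevant coefficient is driven exactly to zero rather than merely shrunk, which is the property that distinguishes LASSO (and the L1 part of Elastic Net) from ridge regression.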
Procedia PDF Downloads 71
31047 Identifying Model to Predict Deterioration of Water Mains Using Robust Analysis
Authors: Go Bong Choi, Shin Je Lee, Sung Jin Yoo, Gibaek Lee, Jong Min Lee
Abstract:
In South Korea, it is difficult to obtain data for statistical pipe assessment. To address this issue, we examine how the various statistical models presented in earlier work respond when data are mixed with noise, and whether they are applicable in South Korea. Three major types of model are studied; where data are presented in the original papers, we add noise to the data and observe how the model response changes. Moreover, we generate data from the models in those papers and analyse the effect of noise. From this, we assess the robustness of each model and its applicability in Korea.
Keywords: proportional hazard model, survival model, water main deterioration, ecological sciences
Procedia PDF Downloads 743
31046 An Interactive Institutional Framework for Evolution of Enterprise Technological Innovation Capabilities System: A Complex Adaptive Systems Approach
Authors: Sohail Ahmed, Ke Xing
Abstract:
This research theoretically explores the evolution mechanism of the enterprise technological innovation capability system (ETICS) from the perspective of complex adaptive systems (CAS). It proposes an analytical framework for ETICS, its concepts and theory by integrating the CAS methodology into the management of enterprises' technological innovation capability, and discusses how to use the principles of complexity to analyze the composition, evolution and realization of technological innovation capabilities in complex dynamic environments. The paper introduces the concept and interaction of multi-agents, presents the theoretical background of CAS, and summarizes the sources of technological innovation, the elements of each subject, and the main clusters of adaptive interactions and innovation activities. The multi-agent concept is applied through the linkages of enterprises, research institutions and government agencies with leading enterprises in industrial settings. The study is exploratory and based on CAS theory. A theoretical model is built from the technological and innovation literature, from foundational work to state-of-the-art projects of technological enterprises, and is then developed to measure the evolution mechanism of the enterprise's technological innovation capability system. The paper concludes that the enterprise's research and development personnel, investments in technological processes, and innovation resources are the main drivers of the evolution of enterprise technological innovation performance. The research specifically enriches the application process of technological innovation in institutional networks related to enterprises.
Keywords: complex adaptive system, echo model, enterprise technological innovation capability system, research institutions, multi-agents
Procedia PDF Downloads 137
31045 Structural Invertibility and Optimal Sensor Node Placement for Error and Input Reconstruction in Dynamic Systems
Authors: Maik Kschischo, Dominik Kahl, Philipp Wendland, Andreas Weber
Abstract:
Understanding and modelling of real-world complex dynamic systems in biology, engineering and other fields is often made difficult by incomplete knowledge about the interactions between systems states and by unknown disturbances to the system. In fact, most real-world dynamic networks are open systems receiving unknown inputs from their environment. To understand a system and to estimate the state dynamics, these inputs need to be reconstructed from output measurements. Reconstructing the input of a dynamic system from its measured outputs is an ill-posed problem if only a limited number of states is directly measurable. A first requirement for solving this problem is the invertibility of the input-output map. In our work, we exploit the fact that invertibility of a dynamic system is a structural property, which depends only on the network topology. Therefore, it is possible to check for invertibility using a structural invertibility algorithm which counts the number of node disjoint paths linking inputs and outputs. The algorithm is efficient enough, even for large networks up to a million nodes. To understand structural features influencing the invertibility of a complex dynamic network, we analyze synthetic and real networks using the structural invertibility algorithm. We find that invertibility largely depends on the degree distribution and that dense random networks are easier to invert than sparse inhomogeneous networks. We show that real networks are often very difficult to invert unless the sensor nodes are carefully chosen. To overcome this problem, we present a sensor node placement algorithm to achieve invertibility with a minimum set of measured states. This greedy algorithm is very fast and also guaranteed to find an optimal sensor node-set if it exists. Our results provide a practical approach to experimental design for open, dynamic systems. 
Since invertibility is a necessary condition for unknown input observers and data assimilation filters to work, it can be used as a preprocessing step to check whether these input reconstruction algorithms can be successful. If not, we can suggest additional measurements providing sufficient information for input reconstruction. Invertibility is also important for systems design and model building. Dynamic models are always incomplete, and synthetic systems act in an environment where they receive inputs or even attack signals from their exterior. Being able to monitor these inputs is an important design requirement, which can be achieved by our algorithms for invertibility analysis and sensor node placement.
Keywords: data-driven dynamic systems, inversion of dynamic systems, observability, experimental design, sensor node placement
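The path-counting step can be sketched as a unit-capacity max-flow after node splitting, which is the standard reduction for vertex-disjoint paths; this is a simplified stand-in for the authors' structural invertibility algorithm, and the example graph in the test is an assumption:

```python
from collections import defaultdict, deque

def disjoint_paths(edges, sources, sinks):
    """Maximum number of vertex-disjoint paths from input nodes to output nodes.

    Node-splitting reduction: each node v becomes (v,'in') -> (v,'out') with a
    capacity-1 internal arc, so no vertex is used by two paths; then a
    unit-capacity Edmonds-Karp (BFS) max-flow counts the disjoint paths.
    """
    cap = defaultdict(int)
    adj = defaultdict(set)

    def arc(u, v, c):
        cap[(u, v)] += c
        adj[u].add(v)
        adj[v].add(u)  # residual arc

    nodes = {u for u, v in edges} | {v for u, v in edges} | set(sources) | set(sinks)
    for v in nodes:
        arc((v, 'in'), (v, 'out'), 1)
    for u, v in edges:
        arc((u, 'out'), (v, 'in'), 1)
    S, T = 'S', 'T'
    for s in sources:
        arc(S, (s, 'in'), 1)   # each unknown input enters at one node
    for t in sinks:
        arc((t, 'out'), T, 1)  # each sensor reads one node
    flow = 0
    while True:
        parent = {S: None}
        q = deque([S])
        while q and T not in parent:
            u = q.popleft()
            for v in adj[u]:
                if v not in parent and cap[(u, v)] > 0:
                    parent[v] = u
                    q.append(v)
        if T not in parent:
            return flow
        v = T
        while parent[v] is not None:   # augment along the BFS path
            u = parent[v]
            cap[(u, v)] -= 1
            cap[(v, u)] += 1
            v = u
        flow += 1
```

If the count equals the number of unknown inputs, the disjoint-path condition for structural invertibility is met; a greedy sensor placement can repeatedly add the sensor node that raises this count.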
Procedia PDF Downloads 150
31044 Patient Tracking Challenges During Disasters and Emergencies
Authors: Mohammad H. Yarmohammadian, Reza Safdari, Mahmoud Keyvanara, Nahid Tavakoli
Abstract:
One of the greatest challenges in disasters and emergencies is patient tracking. The concept of tracking has different denotations: one meaning refers to tracking patients' physical locations, and the other refers to tracking patients' medical needs during emergency services. The main goal of patient tracking is to ensure patient safety during disasters and emergencies and to manage the flow of patients and information across different locations. In most cases, there are no sufficient and accurate data regarding the number of injuries, patients' medical conditions, or their accommodation and transfer. The objective of the present study is to survey patient tracking in natural disasters and emergencies. Methods: This was a narrative study drawing on e-journals and electronic databases such as PubMed, ProQuest, ScienceDirect, Elsevier, etc. Data were gathered with an extraction form and analyzed via content analysis. Results: In many countries there is no appropriate and rapid method for tracking patients and transferring victims after an incident. The absence of reliable data on patients' transfer and accommodation, even in the initial hours and days after a disaster, and poor coordination of resource allocation make it difficult to evaluate needs and deliver services. Currently, most emergency services are based on paper systems, which do not perform adequately in large disasters and incidents, causing information loss. Conclusion: A patient tracking system should update the location of patients or evacuees and the information related to their state. Patients' information should be accessible to authorized users to continue their treatment, accommodation and transfer. It should also include timely information on patients' locations as soon as they arrive at or leave a site, so that health care professionals are able to provide proper medical treatment.
Keywords: patient tracking, challenges, disaster, emergency
Procedia PDF Downloads 304
31043 Mapping the Relationship between Elements of Urban Morphology and Density of Crime
Authors: Fabio Salvador Aparecido Santos, Spencer Chainey, Richard Wortley
Abstract:
Urban morphology can be understood as the study of the physical form of cities through its elements. Crime, in turn, can be broadly defined as an action that breaks the rules established in a given society. This study connects these two subjects through the relationship between elements of urban morphology and the density of crime occurrences. We consider that there is a research gap concerning the influence of urban features on crime occurrences using statistical methods and mapping techniques in Geographic Information Systems. The investigation comprises three main phases. The first phase examines how theoretical principles associated with urban morphology can be viewed in terms of their influence on crime patterns. The second phase develops tools to model elements of urban morphology and to measure the relationship between these elements and patterns of crime. The third phase determines the extent to which elements of the urban environment can contribute to crime reduction. Understanding the relationship between urban morphology and crime patterns in a Latin American context will help highlight the influence urban planning has on the crime problems that emerge in these settings, and how effectively urban planning can contribute to reducing crime.
Keywords: agent-based modelling, environmental criminology, geographic information system, urban morphology
Procedia PDF Downloads 136
31042 Strategies for Success: Strategic Thinking’s Critical Role in Entrepreneurial
Authors: Silvia Rahmita
Abstract:
Entrepreneurial success is crucial for economic growth, competitiveness, and job creation, yet many entrepreneurs face failure due to various challenges. This paper explores the critical role of strategic thinking in mitigating entrepreneurial failure. Entrepreneurial competencies—encompassing knowledge, skills, and traits—are essential for creating and growing ventures. Despite these competencies, numerous entrepreneurs fail due to poor management, inadequate support, and ineffective policies. The paper categorizes entrepreneurial failures into financial, operational, market, product or service, strategic, leadership, legal, human capital, technological, and environmental failures. Each failure type can be addressed through strategic thinking, which involves foresight, balancing short-term and long-term goals, and hypothesis-driven processes. By integrating strategic thinking into their approach, entrepreneurs can enhance risk management, adapt to market changes, and sustain growth. This process involves setting clear goals, innovating products, and maintaining a competitive edge. Ultimately, strategic thinking provides a framework for proactive planning, adaptation, and continuous improvement, reducing the likelihood of failure and ensuring long-term success. Entrepreneurs who prioritize strategic thinking are better equipped to navigate the complexities of the business environment and achieve sustainable growth.
Keywords: entrepreneurial failure, strategic thinking, risk management, business failure
Procedia PDF Downloads 40
31041 Satellite LiDAR-Based Digital Terrain Model Correction Using Gaussian Process Regression
Authors: Keisuke Takahata, Hiroshi Suetsugu
Abstract:
Forest height is an important parameter for forest biomass estimation, and precise elevation data are essential for accurate forest height estimation. There are several globally or nationally available digital elevation models (DEMs), such as SRTM and ASTER. However, their accuracy is reported to be low, particularly in mountainous areas with closed canopies or steep slopes. Recently, space-borne LiDAR missions, such as the Global Ecosystem Dynamics Investigation (GEDI), have started to provide sparse but accurate ground elevation and canopy height estimates. Several studies have reported a high degree of accuracy in these elevation products on their exact footprints, while it is not clear how this sparse information can be used over wider areas. In this study, we developed a digital terrain model correction algorithm that spatially interpolates the difference between existing DEMs and GEDI elevation products using a Gaussian process (GP) regression model. The results show that our GP-based methodology can reduce the mean bias of the elevation data from 3.7 m to 0.3 m when airborne LiDAR-derived elevation information is used as ground truth. Our algorithm is also capable of quantifying the elevation data uncertainty, which is a critical requirement for biomass inventory. Upcoming satellite LiDAR missions, like MOLI (Multi-footprint Observation Lidar and Imager), are expected to contribute to more accurate digital terrain model generation.
Keywords: digital terrain model, satellite LiDAR, Gaussian processes, uncertainty quantification
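The interpolation step can be sketched as one-dimensional GP regression on DEM-minus-GEDI differences. The kernel, length scale and toy footprint data below are illustrative assumptions, not the study's configuration:

```python
import math

def rbf(x1, x2, ls=2.0, var=1.0):
    """Squared-exponential covariance between two footprint locations."""
    return var * math.exp(-0.5 * ((x1 - x2) / ls) ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting (small systems only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def gp_predict(xs, errs, xq, noise=1e-4):
    """Posterior mean and variance of the DEM error at query location xq."""
    K = [[rbf(xi, xj) + (noise if i == j else 0.0)
          for j, xj in enumerate(xs)] for i, xi in enumerate(xs)]
    alpha = solve(K, errs)
    kstar = [rbf(xq, xi) for xi in xs]
    mean = sum(ks * a for ks, a in zip(kstar, alpha))
    v = solve(K, kstar)
    var = rbf(xq, xq) + noise - sum(ks * vi for ks, vi in zip(kstar, v))
    return mean, var

# Toy data: GEDI footprints reveal a constant +3 m DEM bias near the footprints
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
errs = [3.0] * 5
mean_near, var_near = gp_predict(xs, errs, 2.0)   # at a footprint
mean_far, var_far = gp_predict(xs, errs, 20.0)    # far from any footprint
```

The posterior variance is the uncertainty quantification mentioned in the abstract: small near footprints and reverting to the prior far from them, flagging where the corrected DTM should not be trusted.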
Procedia PDF Downloads 183
31040 eTransformation Framework for the Cognitive Systems
Authors: Ana Hol
Abstract:
Digital systems are in the cognitive wave of eTransformation and are now extensively aimed at meeting individuals’ demands, both those of customers requiring services and those of service providers. It is also apparent that successful future systems will not simply open doors for traditional owners/users to offer and receive services, as Uber does today, but will require more customized, cognitively enabled infrastructures that are responsive to the system user’s needs. To identify what such systems require, this research reviews the historical and current effects of the eTransformation process by studying: 1. eTransitions of company websites and mobile applications; 2. the emergence of new shared-economy business models such as Uber; and 3. new requirements for demand-driven, cognitive systems capable of learning and just-in-time decision making. Based on the analysis, this study proposes a Cognitive eTransformation Framework capable of guiding implementations of new responsive and user-aware systems.
Keywords: system implementations, AI supported systems, cognitive systems, eTransformation
Procedia PDF Downloads 238
31039 A Safety Analysis Method for Multi-Agent Systems
Authors: Ching Louis Liu, Edmund Kazmierczak, Tim Miller
Abstract:
Safety analysis for multi-agent systems is complicated by the potentially nonlinear interactions between agents. This paper proposes a method for analyzing the safety of multi-agent systems that explicitly focuses on interactions and on the accident data of systems similar in structure and function to the system being analyzed. The method creates a Bayesian network using the accident data from similar systems; a feature of our method is that the events in the accident data are labeled with HAZOP guide words. Our method uses an ontology to abstract away from the details of a multi-agent implementation. Using the ontology, our method then constructs an “interaction map”, a graphical representation of the patterns of interactions between agents and other artifacts. Interaction maps, combined with statistical data from accidents and the HAZOP classifications of events, can be converted into a Bayesian network. Bayesian networks allow designers to explore “what if” scenarios and make design trade-offs that maintain safety. We show how to use the Bayesian networks and the interaction maps to improve multi-agent system designs.
Keywords: multi-agent system, safety analysis, safety model, interaction map
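The "what if" reasoning a Bayesian network enables can be sketched with exact inference by enumeration on a toy two-cause network. The variables (a proximity interaction, a communication-loss event, a collision accident) and all probabilities are illustrative assumptions, not the paper's model:

```python
# Priors for the two parent events (illustrative accident statistics)
P_prox = 0.3   # agents in close proximity
P_loss = 0.1   # communication loss (HAZOP guide word "NO" on the comms flow)
# Conditional probability table P(collision | proximity, comm_loss)
P_col = {(True, True): 0.6, (True, False): 0.1,
         (False, True): 0.05, (False, False): 0.01}

def joint(prox, loss, col):
    """Full joint probability of one assignment of the three variables."""
    p = (P_prox if prox else 1 - P_prox) * (P_loss if loss else 1 - P_loss)
    pc = P_col[(prox, loss)]
    return p * (pc if col else 1 - pc)

def prob(col=None, loss=None):
    """Marginal probability by enumerating all assignments consistent with the evidence."""
    return sum(joint(a, b, c)
               for a in (True, False) for b in (True, False) for c in (True, False)
               if (col is None or c == col) and (loss is None or b == loss))

p_collision = prob(col=True)
p_loss_given_col = prob(col=True, loss=True) / p_collision
```

Diagnostic queries like `p_loss_given_col` show how much an observed accident raises the belief in each interaction failure, which is what supports the design trade-offs described above.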
Procedia PDF Downloads 417
31038 Radar Fault Diagnosis Strategy Based on Deep Learning
Authors: Bin Feng, Zhulin Zong
Abstract:
Radar systems are critical to modern military, aviation and maritime operations, and their proper functioning is essential for the success of these operations. However, due to the complexity and sensitivity of radar systems, they are susceptible to various faults that can significantly affect their performance. Traditional radar fault diagnosis strategies rely on expert knowledge and rule-based approaches, which are often limited in effectiveness and require considerable time and resources. Deep learning has recently emerged as a promising approach for fault diagnosis due to its ability to learn features and patterns from large amounts of data automatically. In this paper, we propose a radar fault diagnosis strategy based on deep learning that can accurately identify and classify faults in radar systems. Our approach uses convolutional neural networks (CNNs) to extract features from radar signals and to classify those features into fault types. The proposed strategy is trained and validated on a dataset of measured radar signals containing various types of faults, and the results show that it achieves high accuracy in fault diagnosis. To further evaluate its effectiveness, we compare it with traditional rule-based approaches and other machine learning methods, including decision trees, support vector machines (SVMs) and random forests. The results demonstrate that our deep learning-based approach outperforms the traditional approaches in terms of accuracy and efficiency. Finally, we discuss the potential applications and limitations of the proposed strategy, as well as future research directions. Our study highlights the importance and potential of deep learning for radar fault diagnosis and suggests that it can be a valuable tool for improving the performance and reliability of radar systems. In summary, this paper presents a radar fault diagnosis strategy based on deep learning that achieves high accuracy and efficiency in identifying and classifying faults in radar systems. The proposed strategy has significant potential for practical applications and can pave the way for further research.
Keywords: radar system, fault diagnosis, deep learning, radar fault
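The CNN feature-extraction idea can be sketched at its smallest scale: one 1-D convolution, a ReLU, and a global max pool acting as a single learned fault detector. The kernel and signals below are illustrative assumptions, far simpler than the trained network the paper describes:

```python
def conv1d(signal, kernel):
    """Valid-mode 1-D convolution (cross-correlation form, as in CNN layers)."""
    n, k = len(signal), len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(n - k + 1)]

def relu(xs):
    return [x if x > 0 else 0.0 for x in xs]

def feature(signal, kernel):
    """One conv + ReLU + global max pool: the response of one feature detector."""
    return max(relu(conv1d(signal, kernel)))

# An edge-detecting kernel fires on an abrupt jump in the pulse
# (a crude stand-in for a learned fault-signature filter)
kernel = [-1.0, 1.0]
healthy = [1.0] * 8
faulty = [1.0] * 4 + [5.0] * 4   # step change mid-pulse
```

In a real CNN the kernels are learned from the labeled fault dataset, and a final dense layer maps many such feature responses to fault classes.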
Procedia PDF Downloads 90
31037 Online Battery Equivalent Circuit Model Estimation on Continuous-Time Domain Using Linear Integral Filter Method
Authors: Cheng Zhang, James Marco, Walid Allafi, Truong Q. Dinh, W. D. Widanage
Abstract:
Equivalent circuit models (ECMs) are widely used in battery management systems in electric vehicles and other battery energy storage systems. The battery dynamics and the model parameters vary under different working conditions, such as different temperature and state of charge (SOC) levels, and therefore online parameter identification can improve the modelling accuracy. This paper presents a method for online ECM parameter identification using a continuous-time (CT) estimation approach. The CT estimation method has several advantages over discrete-time (DT) estimation methods for ECM parameter identification due to the widely separated battery dynamic modes and fast sampling. The presented method can also be used for online SOC estimation. Test data were collected using a lithium-ion cell, and the experimental results show that the presented CT method achieves better modelling accuracy than the conventional DT recursive least squares method. The effectiveness of the presented method for online SOC estimation is also verified on test data.
Keywords: electric circuit model, continuous time domain estimation, linear integral filter method, parameter and SOC estimation, recursive least square
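The DT baseline the paper compares against, recursive least squares, can be sketched for a deliberately simplified ECM V = OCV − R0·I (ohmic resistance only; the paper's model has additional RC dynamics, and the data here are synthetic assumptions):

```python
import random

def rls(samples, lam=0.99, delta=1000.0):
    """Recursive least squares for V = OCV - R0*I.

    Parameter vector theta = [OCV, R0]; regressor phi = [1, -I].
    lam is the forgetting factor; delta initializes the covariance P.
    """
    theta = [0.0, 0.0]
    P = [[delta, 0.0], [0.0, delta]]
    for I, V in samples:
        phi = [1.0, -I]
        Pphi = [P[0][0] * phi[0] + P[0][1] * phi[1],
                P[1][0] * phi[0] + P[1][1] * phi[1]]
        denom = lam + phi[0] * Pphi[0] + phi[1] * Pphi[1]
        k = [Pphi[0] / denom, Pphi[1] / denom]          # gain vector
        err = V - (phi[0] * theta[0] + phi[1] * theta[1])
        theta = [theta[0] + k[0] * err, theta[1] + k[1] * err]
        # Covariance update: P = (P - k * phi^T P) / lam (P stays symmetric)
        P = [[(P[i][j] - k[i] * Pphi[j]) / lam for j in range(2)]
             for i in range(2)]
    return theta

# Synthetic cell data: OCV = 3.7 V, R0 = 50 mOhm, small measurement noise
random.seed(1)
samples = [(I, 3.7 - 0.05 * I + random.gauss(0, 0.001))
           for I in (random.uniform(-2, 2) for _ in range(300))]
ocv, r0 = rls(samples)
```

The forgetting factor lets the estimate track slow parameter drift with temperature and SOC; the CT method in the paper addresses the numerical difficulties this DT formulation has with fast sampling.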
Procedia PDF Downloads 383
31036 Field-observed Thermal Fractures during Reinjection and Its Numerical Simulation
Authors: Wen Luo, Phil J. Vardon, Anne-Catherine Dieudonne
Abstract:
One key process that partly controls the success of geothermal projects is fluid reinjection, which helps in dealing with waste water, maintaining reservoir pressure, and supplying the heat-exchange medium. Thus, sustaining injectivity is of great importance for the efficiency and sustainability of geothermal production. However, injectivity is sensitive to the reinjection process; field experience has shown that it can be either damaged or improved. In this paper, the focus is on how injectivity is improved. Since the injection pressure is far below the formation fracture pressure, hydraulic fracturing cannot be the mechanism contributing to the increase in injectivity; instead, thermal stimulation has been identified as the main contributor. For low-enthalpy geothermal reservoirs, which are not fracture-controlled, thermal fracturing, rather than thermal shearing, is expected to be the mechanism for increasing injectivity. In this paper, field data from sedimentary low-enthalpy geothermal reservoirs in the Netherlands were analysed to show the occurrence of thermal fracturing due to the cooling shock during reinjection. Injection data were collected and compared to show the effects of thermal fractures on injectivity. A thermo-hydro-mechanical (THM) model for the near-field formation was then developed and solved by the finite element method to simulate the observed thermal fractures, and it was compared with the HM model, decomposed from the THM model, to illustrate the thermal effects on fracturing. Finally, the effects of the operational parameters, i.e. injection temperature and pressure, on the changes in injectivity were studied on the basis of the THM model. The field data analysis and simulation results illustrate that thermal fracturing occurred during reinjection and contributed to the increase in injectivity, and that the injection temperature is a key parameter contributing to thermal fracturing.
Keywords: injectivity, reinjection, thermal fracturing, thermo-hydro-mechanical model
Procedia PDF Downloads 217
31035 The Economic Implications of Cryptocurrency and Its Potential to Disrupt Traditional Financial Systems as a Store of Value
Authors: G. L. Rithika, Arvind B. S., Akash R., Ananda Vinayak, Hema M. S.
Abstract:
The first cryptocurrency was launched in 2009, and cryptocurrencies have since become significant assets. They represent a completely distinct, decentralized model for money and contribute to eliminating currency monopolies and freeing money from centralized control. The fact that no government agency can determine a coin's value or flow is what cryptocurrency advocates believe makes them safe and secure. The aim of this paper is to analyze the economic implications of cryptocurrency and how it could disrupt traditional financial systems. The paper analyses the growth of cryptocurrency over the years and the potential threats it poses to financial systems. Our analysis shows that although the DeFi design, like the traditional financial system, may have the ability to lower transaction costs, there are multiple layers where rents might build up because of endogenous competition limitations. The permissionless and anonymous design of DeFi also poses issues for ensuring tax compliance, for anti-money laundering laws and regulations, and for preventing financial misconduct.
Keywords: cryptocurrencies, bitcoin, blockchain technology, traditional financial systems, decentralisation, regulatory framework
Procedia PDF Downloads 50
31034 Secure E-Voting Using Blockchain Technology
Authors: Barkha Ramteke, Sonali Ridhorkar
Abstract:
An election is an important event in every country. Traditional voting has several drawbacks, including the time and effort required for tallying and counting results, and the cost of paper, arrangements, and everything else required to complete a voting process. Many countries are now considering online e-voting systems, but traditional e-voting systems suffer from a lack of trust: it is not known whether a vote has been counted correctly or tampered with. This lack of transparency means that the voter has no assurance that his or her vote will be counted as cast. Electronic voting systems are increasingly using blockchain technology as an underlying storage mechanism to make the voting process more transparent and to assure data immutability as blockchain technology grows in popularity. The transparency feature, on the other hand, may reveal critical information about applicants because all system users have the same entitlement to the data. Furthermore, because of blockchain's pseudo-anonymity, voters' privacy may be revealed, and third parties involved in the voting process, such as registration institutions, may be able to tamper with data. To overcome these difficulties, we incorporate Ethereum smart contracts into a blockchain-based voting system.
Keywords: blockchain, AMV chain, electronic voting, decentralized
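The immutability argument can be sketched with a minimal hash-chained vote ledger, an illustrative stand-in far simpler than the Ethereum smart contracts the paper applies:

```python
import hashlib
import json

def block_hash(block):
    """Deterministic SHA-256 over the block's canonical JSON form."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

class VoteChain:
    """Append-only ledger: each block commits to the hash of its predecessor."""

    def __init__(self):
        self.chain = [{"index": 0, "vote": None, "prev": "0" * 64}]  # genesis

    def cast(self, ballot_id, candidate):
        self.chain.append({"index": len(self.chain),
                           "vote": {"ballot": ballot_id, "candidate": candidate},
                           "prev": block_hash(self.chain[-1])})

    def verify(self):
        """Any edit to an earlier block breaks every later prev-hash link."""
        return all(self.chain[i]["prev"] == block_hash(self.chain[i - 1])
                   for i in range(1, len(self.chain)))

    def tally(self):
        counts = {}
        for b in self.chain[1:]:
            c = b["vote"]["candidate"]
            counts[c] = counts.get(c, 0) + 1
        return counts

ledger = VoteChain()
ledger.cast("b1", "alice")
ledger.cast("b2", "bob")
ledger.cast("b3", "alice")
```

A real deployment adds consensus among many nodes and voter authentication; the hash chaining alone only makes tampering detectable, not impossible.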
Procedia PDF Downloads 138
31033 Fault Diagnosis in Confined Systems
Authors: Nesrine Berber, Hafid Haffaf, Abdel Madjid Meghabar
Abstract:
In the last decade, technology has continued to grow and has changed the structure of our society. New technologies, including information and communication technology (ICT), play a major role whose importance continues to grow; ICT has become indispensable to economic, social and cultural life. ICT has also proven to be a promising intervention in the area of road transport. A supervision model for a class of trains of intelligent and autonomous vehicles leads us to give some definitions concerning IAVs and the different technologies used for communication between them. Our aim in this work is to present a hypergraph modeling a class of trains of Intelligent and Autonomous Vehicles (IAVs).
Keywords: intelligent transportation system, intelligent autonomous vehicles, ad hoc network, wireless technologies, hypergraph modeling, supervision
Procedia PDF Downloads 546
31032 Physical Theory for One-Dimensional Correlated Electron Systems
Authors: Nelson Nenuwe
Abstract:
The behavior of interacting electrons in one dimension was studied by calculating correlation functions and critical exponents at zero and finite external magnetic fields for arbitrary band filling. The technique employed in this study is based on conformal field theory (CFT). The charge and spin degrees of freedom are separated and described by two independent conformal theories. A detailed comparison of the t-J model with the repulsive Hubbard model was then undertaken, with emphasis on their Tomonaga-Luttinger (TL) liquid properties. Near half-filling the exponents of the t-J model take the values of the strong-correlation limit of the Hubbard model, and in the low-density limit the exponents are those of a non-interacting system. The critical exponents obtained in this study belong to the repulsive TL liquid (conducting phase) and the attractive TL liquid (superconducting phase). The theoretical results find applications in one-dimensional organic conductors (TTF-TCNQ), organic superconductors (Bechgaard salts) and carbon nanotubes (SWCNTs, DWCNTs and MWCNTs). For instance, the critical exponent obtained in this study is consistent with the experimental result from optical and photoemission evidence of TL liquid behavior in the one-dimensional metallic Bechgaard salt (TMTSF)2PF6. Keywords: critical exponents, conformal field theory, Hubbard model, t-J model
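For reference, the standard TL-liquid expressions that such calculations build on tie the critical exponents to the single charge-sector parameter $K_\rho$ (these are textbook forms, not the paper's specific results): the single-particle density of states vanishes at the Fermi level with exponent $\alpha$, and the $2k_F$ density correlation decays algebraically,

```latex
N(\omega) \propto |\omega|^{\alpha},
\qquad
\alpha = \tfrac{1}{4}\left(K_\rho + K_\rho^{-1} - 2\right),
\qquad
\langle n(x)\, n(0)\rangle \sim \cos(2 k_F x)\, x^{-(1+K_\rho)} .
```

Here $K_\rho < 1$ corresponds to repulsive interactions (the conducting phase above) and $K_\rho > 1$ to attractive interactions (the superconducting phase).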
Procedia PDF Downloads 343
31031 The Use of Learning Management Systems during Emerging the Tacit Knowledge
Authors: Ercan Eker, Muhammer Karaman, Akif Aslan, Hakan Tanrikuluoglu
Abstract:
A deficient institutional memory and poor knowledge management can result in information security breaches, loss of prestige and trustworthiness and, worst of all, the loss of know-how and institutional knowledge. Traditional learning management within organizations is generally handled through personal effort, and that kind of effort mostly depends on personal desire, motivation and a sense of institutional belonging. Even if an organization has highly motivated employees at a certain time, the institutional knowledge and memory life cycle will generally remain limited to the time these employees spend in the organization. Having a learning management system can sustain an organization's institutional memory, knowledge and know-how. Learning management systems are needed all the more in public organizations, where job rotation is frequent and managers are appointed periodically. However, a learning management system should not be seen as an organization's website; it is a more comprehensive, interactive and user-friendly knowledge management tool. In this study, the importance of using learning management systems in the process of eliciting tacit knowledge is underlined. Keywords: knowledge management, learning management systems, tacit knowledge, institutional memory
Procedia PDF Downloads 380
31030 Method of Successive Approximations for Modeling of Distributed Systems
Authors: A. Torokhti
Abstract:
A new method for the mathematical modeling of distributed nonlinear systems is developed. The system is represented by a combination of a set of spatially distributed sensors and a fusion center. Its mathematical model is obtained from an iterative procedure that converges to the model that is optimal in the sense of minimizing an associated cost function. Keywords: mathematical modeling, non-linear system, spatially distributed sensors, fusion center
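The general shape of such a method of successive approximations is the classic fixed-point iteration x ← g(x), stopped when an associated cost (here the residual of the fixed-point equation) falls below a tolerance. A generic sketch, not the paper's distributed-sensor operator:

```python
import math

def successive_approximations(g, x0, tol=1e-10, max_iter=200):
    """Iterate x <- g(x) until the residual cost |g(x) - x| is below tol."""
    x = x0
    for _ in range(max_iter):
        x_next = g(x)
        if abs(x_next - x) < tol:   # cost: residual of the fixed-point equation
            return x_next
        x = x_next
    raise RuntimeError("iteration did not converge")

# Example: solve x = cos(x). Near the fixed point, cos is a contraction,
# so the iterates converge (Banach fixed-point theorem).
x_star = successive_approximations(math.cos, x0=1.0)
print(round(x_star, 6))  # ≈ 0.739085
```

Convergence of such a scheme hinges on the update map being a contraction; in the paper's setting, that role is played by showing the cost function decreases along the iterates.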
Procedia PDF Downloads 382
31029 Attractiveness of Cafeteria Systems as Viewed by Generation Z
Authors: Joanna Nieżurawska, Hanna Karaszewska, Anna Dziadkiewicz
Abstract:
Contemporary conditions force companies to constantly implement changes and improvements, which entails making their activity more flexible in all spheres. Cafeteria systems are a good example of flexible remuneration systems. They are well known and often used in the United States, Great Britain and Western Europe. In Poland they are hardly ever used, and greater flexibility in remuneration packages applies mainly to senior managers and executives. The main aim of this article is to research the attractiveness of cafeteria systems as viewed by generation Z. An additional aim is to rank, using an importance index, particular types of cafeteria systems from generation Z's perspective, as well as to identify the factors which determine the development of cafeteria systems in Poland. The research was conducted in June 2015 among 185 young employees (generation Z). The paper presents some of the results. Keywords: cafeteria, generation X, generation Y, generation Z, flexible remuneration systems, plasticization of remuneration
Procedia PDF Downloads 408
31028 Forecasting the Influences of Information and Communication Technology on the Structural Changes of Japanese Industrial Sectors: A Study Using Statistical Analysis
Authors: Ubaidillah Zuhdi, Shunsuke Mori, Kazuhisa Kamegai
Abstract:
The purpose of this study is to forecast the influences of information and communication technology (ICT) on the structural changes of the Japanese economy based on Leontief input-output (IO) coefficients. This study establishes a statistical analysis to predict the future interrelationships among industries. We employ the Constrained Multivariate Regression (CMR) model to analyze the historical changes of input-output coefficients. Statistical significance of the model is then tested with the likelihood ratio test (LRT). In our model, ICT is represented by two explanatory variables, i.e. computers (including main parts and accessories) and telecommunications equipment. A previous study, which analyzed the influences of these variables on the structural changes of Japanese industrial sectors from 1985 to 2005, concluded that these variables had significant influences on the changes in the business circumstances of the Japanese commerce, business services and office supplies, and personal services sectors. The future Japanese economic structure projected from this forecast shows the differentiated direct and indirect outcomes of ICT penetration. Keywords: forecast, ICT, industrial structural changes, statistical analysis
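A constrained regression of the kind described can be sketched with projected gradient descent: after each least-squares gradient step, the coefficients are projected back onto the feasible set (here non-negativity, a common constraint on IO coefficients; the data and the constraint are illustrative, not the paper's CMR specification):

```python
def constrained_fit(x, y, steps=5000, lr=0.01):
    """Fit y ≈ a*x + b subject to a >= 0, by projected gradient descent."""
    n = len(x)
    a, b = 0.0, 0.0
    for _ in range(steps):
        # Gradient of the mean squared error with respect to a and b.
        grad_a = sum((a * xi + b - yi) * xi for xi, yi in zip(x, y)) * 2 / n
        grad_b = sum((a * xi + b - yi) for xi, yi in zip(x, y)) * 2 / n
        a -= lr * grad_a
        b -= lr * grad_b
        a = max(a, 0.0)  # projection onto the constraint set {a >= 0}
    return a, b

# Years (centered) and a synthetic upward-trending IO coefficient.
x = [-2, -1, 0, 1, 2]
y = [0.10, 0.12, 0.15, 0.17, 0.20]
a, b = constrained_fit(x, y)
print(round(a, 3), round(b, 3))  # the fitted slope stays non-negative
```

On this toy data the constraint is inactive, so the result coincides with ordinary least squares (a = 0.025, b = 0.148); the projection only binds when the unconstrained fit would violate the sign restriction.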
Procedia PDF Downloads 375
31027 Study on Accurate Calculation Method of Model Attitude on Wind Tunnel Test
Authors: Jinjun Jiang, Lianzhong Chen, Rui Xu
Abstract:
The accuracy of the model attitude angle plays an important role in the aerodynamic results of a wind tunnel test. The original method applies a spherical coordinate system transformation: the model attitude angle is obtained by coordinate transformation and spherical surface mapping from the nominal attitude angle (the balance attitude angle in the wind tunnel coordinate system) indicated by the mechanism. First, the coordinate transformation in this method is not only complex but also makes it difficult to establish the transformation relationships between the spatial coordinate systems, especially after many steps of coordinate transformation; moreover, it cannot realize an iterative calculation of the interference between attitude angles. Second, during the calculation the arc is approximated by a straight line and the angle by its tangent value, and inverse trigonometric functions are applied. The calculation of the attitude angle is therefore complex and inaccurate, and the approximation is acceptable only for small angles of attack.
However, with the development of modern unsteady aerodynamic research, aircraft tend toward high or very large angles of attack and unsteady flight regimes. Based on engineering practice and vector theory, the concept of a vector angle coordinate system is proposed for the first time, and the vector angle coordinate system for attitude angles is established. With iterative correction, and by avoiding approximations and inverse trigonometric function solutions, the model attitude calculation is carried out in detail, which validates that the accuracy of the calculated model attitude angles is improved. The vector angle coordinate system gives the transformation and angle definition relations between different flight attitude coordinate systems, so that the attitude angle of the corresponding coordinate system can be calculated accurately and its direction determined. In particular, in channel coupling calculations, the attitude angle between coordinate systems is related only to the angle itself and is independent of the order in which the coordinate systems are changed, which simplifies the calculation process. Keywords: attitude angle, vector angle coordinate system, iterative calculation, spherical coordinate system, wind tunnel test
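The order dependence that conventional Euler-angle chains suffer from, and that the vector approach is designed to avoid, can be seen directly by composing two elementary rotations in both orders (a generic rotation-matrix sketch, not the paper's vector angle coordinate system):

```python
import math

def rot_y(t):  # pitch rotation about the y-axis
    c, s = math.cos(t), math.sin(t)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(t):  # yaw rotation about the z-axis
    c, s = math.cos(t), math.sin(t)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def apply(M, v):
    return [sum(M[i][k] * v[k] for k in range(3)) for i in range(3)]

pitch, yaw = math.radians(20.0), math.radians(30.0)
x_axis = [1.0, 0.0, 0.0]

# The same two rotations applied in different orders send the body axis
# to different directions: finite rotations do not commute.
v1 = apply(matmul(rot_z(yaw), rot_y(pitch)), x_axis)  # pitch first, then yaw
v2 = apply(matmul(rot_y(pitch), rot_z(yaw)), x_axis)  # yaw first, then pitch
print(v1 == v2)  # False
```

This non-commutativity is why an attitude defined through a chain of elementary rotations depends on the chosen rotation order, whereas a definition tied directly to the transformed vector does not.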
Procedia PDF Downloads 146
31026 Political Views and Information and Communication Technology (ICT) in Tertiary Institutions in Achieving the Millennium Development Goals (MDGS)
Authors: Perpetual Nwakaego Ibe
Abstract:
The Millennium Development Goals (MDGs) were an integrated project formed to eradicate many of the adverse situations in which citizens of third-world countries may find themselves. For the MDGs to be a sustainable project for the future, they depend entirely on the actions of governments, multilateral institutions and civil society. This paper first looks at political views on the MDGs and relates them to the current electoral situation around the country, underlining the drastic changes over the past few months. The second part of the paper presents ICT in tertiary institutions as one of the keys to the success of the MDGs. ICT is vital in all phases of the educational process, and the development of cloud connectivity is an added advantage of ICT for sharing a common data bank for research purposes among UNICEF, the Red Cross, the NPS, INEC, the NMIC, and the WHO. Finally, the paper concludes with areas that need improvement and with recommendations for tertiary institutions committed to delivering this ambitious set of goals. A combination of observation and document materials for data gathering was employed as the methodology for this research. Keywords: MDG, ICT, data bank, database
Procedia PDF Downloads 200
31025 Building Information Modelling: A Solution to the Limitations of Prefabricated Construction
Authors: Lucas Peries, Rolla Monib
Abstract:
The construction industry plays a vital role in the global economy, contributing billions of dollars annually. However, the industry has been struggling with persistently low productivity levels for years, unlike other sectors that have shown significant improvements. Modular and prefabricated construction methods have been identified as potential solutions to boost productivity in the construction industry. These methods offer time advantages over traditional construction methods. Despite their potential benefits, modular and prefabricated construction face hindrances and limitations that are not present in traditional building systems. Building information modelling (BIM) has the potential to address some of these hindrances, but barriers are preventing its widespread adoption in the construction industry. This research aims to enhance understanding of the shortcomings of modular and prefabricated building systems and develop BIM-based solutions to alleviate or eliminate these hindrances. The research objectives include identifying and analysing key issues hindering the use of modular and prefabricated building systems, investigating the current state of BIM adoption in the construction industry and factors affecting its successful implementation, proposing BIM-based solutions to address the issues associated with modular and prefabricated building systems, and assessing the effectiveness of the developed solutions in removing barriers to their use. The research methodology involves conducting a critical literature review to identify the key issues and challenges in modular and prefabricated construction and BIM adoption. Additionally, an online questionnaire was used to collect primary data from construction industry professionals, allowing for feedback and evaluation of the proposed BIM-based solutions.
The data collected were analysed to evaluate the effectiveness of the solutions and their potential impact on the adoption of modular and prefabricated building systems. The main findings of the research indicate that the identified issues from the literature review align with the opinions of industry professionals, and the proposed BIM-based solutions are considered effective in addressing the challenges associated with modular and prefabricated construction. However, the research has limitations, such as a small sample size and the need to assess the feasibility of implementing the proposed solutions. In conclusion, this research contributes to enhancing the understanding of modular and prefabricated building systems' limitations and proposes BIM-based solutions to overcome these limitations. The findings are valuable to construction industry professionals and BIM software developers, providing insights into the challenges and potential solutions for implementing modular and prefabricated construction systems in future projects. Further research should focus on addressing the limitations and assessing the feasibility of implementing the proposed solutions from technical and legal perspectives. Keywords: building information modelling, modularisation, prefabrication, technology
Procedia PDF Downloads 98
31024 Measuring the Embodied Energy of Construction Materials and Their Associated Cost Through Building Information Modelling
Authors: Ahmad Odeh, Ahmad Jrade
Abstract:
Energy assessment is an evidently significant factor when evaluating the sustainability of structures, especially at the early design stage. Today's design practices revolve around the selection of materials that reduce operational energy and yet meet disciplinary needs. Operational energy represents a substantial part of a building's lifecycle energy usage, but embodied energy remains an important aspect unaccounted for in the carbon footprint. At the moment, little or no consideration is given to embodied energy, mainly due to the complexity of its calculation and the various factors involved. The equipment used, the fuel needed, and the electricity required for each material vary with location, and thus the embodied energy will differ for each project. Moreover, the methods and techniques used in manufacturing, transporting and placing materials have a significant influence on their embodied energy. This has made it difficult to calculate or even benchmark the usage of such energies. This paper presents a model aimed at helping designers select construction materials based on their embodied energy. Moreover, it presents a systematic approach that uses an efficient method of calculation and ultimately provides new insight into construction material selection. The model is developed in a BIM environment and targets the quantification of embodied energy for construction materials through the three main stages of their life: manufacturing, transportation and placement. The model contains three major databases, each of which covers a set of the most commonly used construction materials. The first dataset holds information about the energy required to manufacture each type of material, the second includes information about the energy required to transport the materials, while the third stores information about the energy required by the tools and cranes needed to place an item in its intended location.
The model provides designers with sets of all available construction materials and their associated embodied energies to use during the design process. Through geospatial data and dimensional material analysis, the model is also able to automatically calculate the distance between the factories and the construction site. To remain within the sustainability criteria set by LEED, a final database is created and used to calculate the overall construction cost based on RSMeans cost data and then automatically recalculate the costs for any modifications. Design criteria covering both operational and embodied energy will lead designers to re-evaluate current material selections for cost, energy and, most importantly, sustainability. Keywords: building information modelling, energy, life cycle analysis, sustainability
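The three-database structure described above reduces, per material, to a sum of manufacturing, transport, and placement terms scaled by quantity and haul distance. A minimal sketch of that tally (all energy intensities and the distance are hypothetical placeholder values, not RSMeans or LEED data):

```python
# Hypothetical energy intensities per material (illustrative numbers only).
MANUFACTURING = {"concrete": 1.1, "steel": 20.1}   # MJ per kg
TRANSPORT = {"concrete": 0.002, "steel": 0.002}    # MJ per kg per km
PLACEMENT = {"concrete": 0.05, "steel": 0.30}      # MJ per kg (cranes/tools)

def embodied_energy(material: str, mass_kg: float, distance_km: float) -> float:
    """Total embodied energy over manufacturing, transport and placement."""
    return mass_kg * (MANUFACTURING[material]
                      + TRANSPORT[material] * distance_km
                      + PLACEMENT[material])

bill_of_materials = [("concrete", 10000.0), ("steel", 1500.0)]
factory_distance_km = 120.0  # would come from geospatial data in the model

total = sum(embodied_energy(m, kg, factory_distance_km)
            for m, kg in bill_of_materials)
print(round(total, 1))  # total embodied energy in MJ
```

Because the distance term is a per-material multiplier, changing the supplier location (the geospatial step the abstract mentions) recomputes the totals without touching the manufacturing or placement databases.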
Procedia PDF Downloads 269
31023 3D Object Retrieval Based on Similarity Calculation in 3D Computer Aided Design Systems
Authors: Ahmed Fradi
Abstract:
Nowadays, recent technological advances in the acquisition, modeling, and processing of three-dimensional (3D) object data have led to the creation of models stored in huge databases, which are used in various domains such as computer vision, augmented reality, the game industry, medicine, CAD (computer-aided design), 3D printing, etc. On the other hand, industry currently benefits from powerful modeling tools enabling designers to produce 3D models easily and quickly. The great ease of acquisition and modeling of 3D objects makes it possible to create large databases of 3D models, which then become difficult to navigate. The indexing of 3D objects therefore appears as a necessary and promising solution for managing this type of data, extracting model information, retrieving an existing model, or calculating the similarity between 3D objects. The objective of the proposed research is to develop a framework allowing easy and fast access to 3D objects in a CAD model database, with a specific indexing algorithm to find objects similar to a reference model. Our main objectives are to study existing methods for calculating the similarity of 3D objects (essentially shape-based methods), specifying the characteristics of each method as well as the differences between them, and then to propose a new approach for indexing and comparing 3D models that is suitable for our case study and based on some of the previously studied methods. Our proposed approach is finally illustrated by an implementation and evaluated in a professional context. Keywords: CAD, 3D object retrieval, shape-based retrieval, similarity calculation
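A common shape-based similarity measure of the kind such studies survey is the D2 shape distribution: histogram the pairwise point distances of each model and compare the histograms. A small sketch on toy point clouds (the point sets, bin count and range are illustrative, not the paper's approach):

```python
import itertools
import math

def d2_histogram(points, bins=8, max_dist=4.0):
    """Normalized histogram of pairwise distances (the D2 shape signature)."""
    dists = [math.dist(p, q) for p, q in itertools.combinations(points, 2)]
    hist = [0] * bins
    for d in dists:
        idx = min(int(d / max_dist * bins), bins - 1)
        hist[idx] += 1
    return [h / len(dists) for h in hist]

def dissimilarity(a, b):
    """L1 distance between two D2 signatures: 0 means identical signatures."""
    return sum(abs(x - y) for x, y in zip(d2_histogram(a), d2_histogram(b)))

# Corners of a unit cube versus a flattened, elongated slab.
cube = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]
slab = [(2 * x, y, 0.2 * z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]

print(dissimilarity(cube, cube))        # 0.0 -- identical shapes
print(dissimilarity(cube, slab) > 0.0)  # True -- signatures differ
```

The appeal of signature methods like this for indexing is that each model reduces to a small fixed-length vector, so a database query compares vectors rather than full geometry.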
Procedia PDF Downloads 262
31022 A Method for Reduction of Association Rules in Data Mining
Authors: Diego De Castro Rodrigues, Marcelo Lisboa Rocha, Daniela M. De Q. Trevisan, Marcos Dias Da Conceicao, Gabriel Rosa, Rommel M. Barbosa
Abstract:
The use of association rule algorithms within data mining is recognized as being of great value in knowledge discovery in databases. Very often the number of rules generated is high, sometimes even in databases of small volume, so the success of the analysis of results can be hampered by this quantity. The purpose of this research is to present a method for reducing the quantity of rules generated by association algorithms. To this end, a computational algorithm was developed using the Weka Application Programming Interface, which allows the method to be executed on different types of databases. After development, tests were carried out on three types of databases: synthetic, model, and real. Efficient results were obtained in reducing the number of rules, where the worst case presented a gain of more than 50%, considering support, confidence, and lift as measures. This study concluded that the proposed model is feasible and quite interesting, contributing to the analysis of the results of association rules generated by these algorithms. Keywords: data mining, association rules, rule reduction, artificial intelligence
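The three reduction measures named above can be sketched directly: compute support, confidence and lift for each candidate rule over a toy transaction set and keep only the rules that clear all three thresholds (the transactions and cutoffs are illustrative, not the paper's Weka-based implementation):

```python
from itertools import combinations

transactions = [
    {"bread", "milk"},
    {"bread", "milk"},
    {"bread", "milk", "butter"},
    {"butter"},
    {"bread", "milk"},
]

def support(itemset):
    """Fraction of transactions containing every item in the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def rule_metrics(antecedent, consequent):
    sup = support(antecedent | consequent)
    conf = sup / support(antecedent)
    lift = conf / support(consequent)
    return sup, conf, lift

# Candidate one-to-one rules in both directions.
items = sorted(set().union(*transactions))
rules = [({a}, {b}) for a, b in combinations(items, 2)] + \
        [({b}, {a}) for a, b in combinations(items, 2)]

# Reduction step: keep only rules that clear all three thresholds.
kept = [(a, c) for a, c in rules
        if (m := rule_metrics(a, c))[0] >= 0.4 and m[1] >= 0.6 and m[2] > 1.0]

print(len(rules), "->", len(kept))  # 6 -> 2: the rule set shrinks
```

Here only bread→milk and milk→bread survive: the butter rules fail the support cutoff, and the lift > 1 condition additionally discards rules whose antecedent and consequent are not positively correlated.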
Procedia PDF Downloads 161