Search results for: predictive models
5407 Emulation of a Wind Turbine Using Induction Motor Driven by Field Oriented Control
Authors: L. Benaaouinate, M. Khafallah, A. Martinez, A. Mesbahi, T. Bouragba
Abstract:
This paper concerns the modeling, simulation, and emulation of a wind turbine emulator for standalone wind energy conversion systems. By using an emulation system, we aim to reproduce the dynamic behavior of the wind turbine torque on the generator shaft: it provides the testing facilities to optimize generator control strategies in a controlled environment, without reliance on natural resources. The aerodynamic, mechanical, and electrical models are detailed, as well as the control of the pitch angle using fuzzy logic for horizontal-axis wind turbines. The wind turbine emulator consists mainly of an induction motor with an AC power drive with torque control. The control of the induction motor and the mathematical models of the wind turbine are designed in the MATLAB/Simulink environment. The simulation results confirm the effectiveness of the induction motor control system and the functionality of the wind turbine emulator in providing all necessary parameters of the wind turbine system, such as wind speed, output torque, power coefficient, and tip speed ratio. The findings are of direct practical relevance.
Keywords: electrical generator, induction motor drive, modeling, pitch angle control, real time control, renewable energy, wind turbine, wind turbine emulator
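The tip speed ratio and shaft torque the emulator reproduces follow from standard aerodynamic relations; the sketch below illustrates them, using a generic textbook Cp(lambda, beta) approximation whose coefficients are illustrative and not taken from the paper's turbine model.

```python
import math

def tip_speed_ratio(rotor_speed_rad_s, rotor_radius_m, wind_speed_m_s):
    """Tip speed ratio: lambda = omega * R / v."""
    return rotor_speed_rad_s * rotor_radius_m / wind_speed_m_s

def power_coefficient(tsr, pitch_deg=0.0):
    """Generic textbook Cp(lambda, beta) approximation (illustrative only;
    coefficients are NOT from the paper's turbine model)."""
    inv = 1.0 / (tsr + 0.08 * pitch_deg) - 0.035 / (1.0 + pitch_deg ** 3)
    cp = (0.5176 * (116.0 * inv - 0.4 * pitch_deg - 5.0) * math.exp(-21.0 * inv)
          + 0.0068 * tsr)
    return max(cp, 0.0)

def aerodynamic_torque(wind_speed_m_s, rotor_speed_rad_s, rotor_radius_m,
                       air_density=1.225):
    """Torque the emulator must reproduce on the generator shaft:
    T = P / omega, with P = 0.5 * rho * A * v^3 * Cp(lambda)."""
    tsr = tip_speed_ratio(rotor_speed_rad_s, rotor_radius_m, wind_speed_m_s)
    area = math.pi * rotor_radius_m ** 2
    power = 0.5 * air_density * area * wind_speed_m_s ** 3 * power_coefficient(tsr)
    return power / rotor_speed_rad_s
```

In an emulator, a torque reference of this form drives the torque-controlled induction motor so the generator sees the same shaft dynamics as behind a real rotor.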
Procedia PDF Downloads 237
5406 Service Business Model Canvas: A Boundary Object Operating as a Business Development Tool
Authors: Taru Hakanen, Mervi Murtonen
Abstract:
This study aims to increase understanding of the transition of business models in servitization. The significance of service in all business has increased dramatically during the past decades. Service-dominant logic (SDL) describes this change in the economy and questions the goods-dominant logic on which business has primarily been based in the past. The business model canvas is one of the most cited and used tools for defining and developing business models. The starting point of this paper lies in the notion that the traditional business model canvas is inherently goods-oriented and best suited to product-based business. However, the basic differences between goods and services necessitate changes in business model representations when proceeding with servitization. Therefore, new knowledge is needed on how the conception of the business model, and the business model canvas as its representation, should be altered in servitized firms in order to better serve business developers and inter-firm co-creation. Compared to products, services are intangible and are co-produced between the supplier and the customer. Value is always co-created in interaction between a supplier and a customer, and customer experience primarily depends on how well that interaction succeeds; the role of service experience is therefore even stronger in service business than in product business. This paper provides business model developers with a service business model canvas, which takes into account the intangible, interactive, and relational nature of service. The study employs a design science approach that contributes to theory development via design artifacts, utilizing qualitative data gathered in workshops with ten companies from various industries.
In particular, key differences between goods-dominant logic (GDL) and SDL-based business models are identified when an industrial firm proceeds with servitization. As a result of the study, an updated version of the business model canvas is provided, based on service-dominant logic. The service business model canvas ensures a stronger customer focus and includes aspects salient for services, such as interaction between companies, service co-production, and customer experience. It can be used for the analysis and development of a company's current service business model or for designing a new one. It facilitates customer-focused new service design and service development, aids in the identification of development needs, and facilitates the creation of a common view of the business model. Therefore, the service business model canvas can be regarded as a boundary object that facilitates a common understanding of the business model among the several actors involved. The study contributes to the business model and service business development disciplines by providing a managerial tool for practitioners in service development. It also provides research insight into how servitization challenges companies’ business models.
Keywords: boundary object, business model canvas, managerial tool, service-dominant logic
Procedia PDF Downloads 372
5405 Mathematical Modeling of Thin Layer Drying Behavior of Bhimkol (Musa balbisiana) Pulp
Authors: Ritesh Watharkar, Sourabh Chakraborty, Brijesh Srivastava
Abstract:
Reduction of water from fruits and vegetables using different drying techniques is widely employed to prolong the shelf life of these food commodities. During drying, heat transfer occurs inside the sample by conduction, and mass transfer takes place by diffusion, in accordance with the temperature and moisture concentration gradients, respectively. This study was undertaken to model the thin-layer drying behavior of Bhimkol pulp. The drying was conducted in a tray drier at 50°C with 5, 10, and 15% concentrations of added maltodextrin. The drying experiments were performed at a thin-layer thickness of 5 mm and a constant air velocity of 0.5 m/s. Drying data were fitted to different thin-layer drying models found in the literature. Comparison of the fitted models, based on the highest R² (0.9917), lowest RMSE (0.03201), and lowest SSE (0.01537), revealed the Midilli equation as the best-fitted model for thin-layer drying with 10% concentration of maltodextrin. The effective diffusivity was estimated based on the solution of Fick’s law of diffusion and was found to be in the range of 3.0396 × 10⁻⁹ to 5.0661 × 10⁻⁹. There was a reduction in drying time with the addition of maltodextrin compared to the raw pulp.
Keywords: Bhimkol, diffusivity, maltodextrin, Midilli model
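The Midilli model and the Fick's-law diffusivity estimate mentioned above can be sketched as follows; the slope value in the example is hypothetical, chosen only to fall in the reported order of magnitude.

```python
import math

def midilli_mr(t, a, k, n, b):
    """Midilli et al. thin-layer model: MR = a*exp(-k*t^n) + b*t."""
    return a * math.exp(-k * t ** n) + b * t

def effective_diffusivity(slope, half_thickness_m):
    """D_eff from the slope of ln(MR) versus t, using the first term of
    Fick's solution for an infinite slab:
        ln(MR) = ln(8/pi^2) - (pi^2 * D_eff / (4 * L^2)) * t
    so D_eff = -slope * 4 * L^2 / pi^2, with L the slab half-thickness."""
    return -slope * 4.0 * half_thickness_m ** 2 / math.pi ** 2

# A 5 mm layer has a half-thickness of 2.5e-3 m; a hypothetical slope of
# -8e-4 1/s gives a diffusivity of the same order as the reported range.
d_eff = effective_diffusivity(-8e-4, 2.5e-3)
```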
Procedia PDF Downloads 215
5404 A Conv-Long Short-Term Memory Deep Learning Model for Traffic Flow Prediction
Authors: Ali Reza Sattarzadeh, Ronny J. Kutadinata, Pubudu N. Pathirana, Van Thanh Huynh
Abstract:
Traffic congestion has become a severe worldwide problem, affecting everyday life, fuel consumption, time, and air pollution. The primary causes of these issues are inadequate transportation infrastructure, poor traffic signal management, and rising population. Traffic flow forecasting is one of the essential and effective methods in urban congestion and traffic management and has attracted the attention of researchers. With the development of technology, undeniable progress has been achieved in existing methods. However, there remains room for improvement in the extraction of temporal and spatial features and in determining the importance of traffic flow sequences. In the proposed model, we implement convolutional neural network (CNN) and long short-term memory (LSTM) deep learning models to mine nonlinear correlations and increase the accuracy of traffic flow prediction on a real dataset. According to the experiments, the results indicate that implementing Conv-LSTM networks increases the productivity and accuracy of deep learning models for traffic flow prediction.
Keywords: deep learning algorithms, intelligent transportation systems, spatiotemporal features, traffic flow prediction
Procedia PDF Downloads 178
5403 Identifying and Quantifying Factors Affecting Traffic Crash Severity under Heterogeneous Traffic Flow
Authors: Praveen Vayalamkuzhi, Veeraragavan Amirthalingam
Abstract:
Studies on highway safety are becoming the need of the hour, as over 400 lives are lost every day in India due to road crashes. In order to evaluate the factors that lead to different levels of crash severity, it is necessary to investigate the level of safety of highways and their relation to crashes. In the present study, an attempt is made to identify the factors that contribute to road crashes and to quantify their effect on crash severity. The study was carried out on a four-lane divided rural highway in India. The variables considered in the analysis include components of the horizontal alignment of the highway, viz., straight or curved section; time of day; driveway density; presence of median; median opening; gradient; operating speed; and annual average daily traffic. These variables were selected after a preliminary analysis. The major complexities in the study are the heterogeneous traffic and the speed variation between different classes of vehicles along the highway. To quantify the impact of each of these factors, statistical analyses were carried out using a logit model and negative binomial regression. The output from the statistical models proved that the horizontal alignment components, driveway density, time of day, operating speed, and annual average daily traffic show a significant relation with the severity of crashes, i.e., fatal as well as injury crashes. Further, annual average daily traffic has a more significant effect on severity than the other variables. The contribution of the horizontal alignment components to crash severity is also significant. Logit models can predict crashes better than the negative binomial regression models.
The results of the study will help transport planners to consider these aspects at the planning stage itself for highways operated under heterogeneous traffic flow conditions.
Keywords: geometric design, heterogeneous traffic, road crash, statistical analysis, level of safety
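A binary logit model of the kind used above computes the probability of a severe crash from a linear combination of predictors; the predictors and coefficients below are purely illustrative, not the fitted values from the study.

```python
import math

def logit_severity_probability(x, beta, intercept):
    """Binary logit: P(severe crash) = 1 / (1 + exp(-(b0 + sum(bi * xi))))."""
    z = intercept + sum(b * v for b, v in zip(beta, x))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical predictors and coefficients (for illustration only):
# [curved section (0/1), driveway density (per km), night-time (0/1),
#  operating speed (km/h), AADT (thousands of vehicles/day)]
coeffs = [0.6, 0.05, 0.4, 0.02, 0.03]
p_severe = logit_severity_probability([1, 8, 1, 90, 25], coeffs, intercept=-4.0)
```

With such a model, the sign and magnitude of each coefficient quantify how a factor (e.g. operating speed) shifts the odds of a fatal or injury outcome.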
Procedia PDF Downloads 308
5402 The Advancements of Transformer Models in Part-of-Speech Tagging System for Low-Resource Tigrinya Language
Authors: Shamm Kidane, Ibrahim Abdella, Fitsum Gaim, Simon Mulugeta, Sirak Asmerom, Natnael Ambasager, Yoel Ghebrihiwot
Abstract:
The call for natural language processing (NLP) systems for low-resource languages has become more apparent than ever in the past few years, with arduous challenges still present in preparing such systems. This paper presents an improved version of the Nagaoka Tigrinya Corpus for a part-of-speech (POS) classification system in the Tigrinya language. The size of the initial Nagaoka dataset was increased, bringing the new tagged corpus to 118K tokens, comprising the 12 basic POS annotations used previously. The additional content was annotated manually in a stringent manner, following rules similar to those of the former dataset, and was formatted in CoNLL format. The system makes use of the monolingually pre-trained TiELECTRA, TiBERT, and TiRoBERTa transformer models. The highest achieved score is an impressive weighted F1-score of 94.2%, which surpasses the previous systems by a significant measure. The system will prove useful in the progress of NLP-related tasks for Tigrinya and similar low-resource languages, with room for cross-referencing higher-resource languages.
Keywords: Tigrinya POS corpus, TiBERT, TiRoBERTa, conditional random fields
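The weighted F1-score reported above is the support-weighted average of per-class F1 values; a minimal sketch:

```python
def class_f1(precision, recall):
    """Per-class F1: harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0

def weighted_f1(per_class_f1, support):
    """Support-weighted F1, the metric reported for the POS tagger:
    sum over classes of f1_c * n_c, divided by total token count."""
    total = sum(support)
    return sum(f * n for f, n in zip(per_class_f1, support)) / total
```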
Procedia PDF Downloads 108
5401 Landslide Hazard Assessment Using Physically Based Mathematical Models in Agricultural Terraces at Douro Valley in North of Portugal
Authors: C. Bateira, J. Fernandes, A. Costa
Abstract:
The Douro Demarcated Region (DDR), in the NE of Portugal, is a Port wine production region. The strong incision of the Douro valley created very steep slopes, organized into agricultural terraces, which have undergone an intense and deep transformation in order to allow the mechanization of the work. The old terrace system, supported by vertical stone walls, has been replaced by terraces with earth embankments, which have experienced considerable instability. This terrace instability has important economic and financial consequences for the agricultural enterprises. This paper presents and develops cartographic tools to assess embankment instability and identify the areas prone to instability. The priority in this evaluation is the use of physically based mathematical models, together with a validation process based on an inventory of past embankment instability. We used the shallow landslide stability model (SHALSTAB), based on physical parameters such as cohesion (c′), friction angle (φ), hydraulic conductivity, soil depth, soil specific weight (ρ), and slope angle (α), with contributing areas computed by the Multiple Flow Direction (MFD) method. A terraced area can be analysed by these models only if very detailed information representative of the terrain morphology is available, since the slope angle and the contributing areas depend on it. We achieved this using digital elevation models (DEM) of great resolution (pixels of 40 cm side), derived from a set of photographs taken at 100 m altitude with a pixel resolution of 12 cm. The slope angle results from this DEM. On the other hand, the MFD contributing area models the internal flow and is an important element in defining the spatial variation of soil saturation; that internal flow is based on the DEM. This is supported by the statement that the interflow, although not coincident with the superficial flow, has an important similarity to it.
Electrical resistivity monitoring values were related to the MFD contributing areas built from a DEM of 1 m resolution and revealed a consistent correlation. That analysis, performed on the area, showed a good correlation, with R² of 0.72 and 0.76 at 1.5 m and 2 m depth, respectively. Considering that, a DEM with 1 m resolution was the basis for modelling the real internal flow; thus, we assumed that the contributing area at 1 m resolution modelled by MFD is representative of the internal flow of the area. To solve this problem, we used a set of generalized DEMs to build the contributing areas used in SHALSTAB. Those DEMs, with several resolutions (1 m and 5 m), were built from a set of photographs with 50 cm resolution taken at 5 km altitude. Using this combination of maps, we modelled several final maps of terrace instability and performed a validation process with the contingency matrix. The best final instability map combines the slope map from a DEM of 40 cm resolution and an MFD map from a DEM of 1 m resolution, with a True Positive Rate (TPR) of 0.97, a False Positive Rate (FPR) of 0.47, Accuracy (ACC) of 0.53, Precision (PVC) of 0.0004, and a TPR/FPR ratio of 2.06.
Keywords: agricultural terraces, cartography, landslides, SHALSTAB, vineyards
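The infinite-slope stability analysis underlying SHALSTAB, with relative saturation derived from the contributing area, can be sketched as follows; the parameter values in the comments and tests are illustrative, not the Douro field data.

```python
import math

def wetness_index(q, a, b, transmissivity, slope_rad):
    """Steady-state relative saturation w = q*a / (b*T*sin(theta)), capped at 1,
    where q is recharge, a the contributing area, and b the flow width."""
    return min(1.0, q * a / (b * transmissivity * math.sin(slope_rad)))

def factor_of_safety(c, phi_rad, slope_rad, soil_depth, wetness,
                     gamma_soil=18000.0, gamma_water=9810.0):
    """Infinite-slope factor of safety (the stability model behind SHALSTAB):
    FS = (c' + (gamma_s*z - gamma_w*h) * cos^2(theta) * tan(phi))
         / (gamma_s * z * sin(theta) * cos(theta)),
    with h = wetness * z the saturated depth. Units: N/m^2 and N/m^3."""
    h = wetness * soil_depth
    resisting = (c + (gamma_soil * soil_depth - gamma_water * h)
                 * math.cos(slope_rad) ** 2 * math.tan(phi_rad))
    driving = gamma_soil * soil_depth * math.sin(slope_rad) * math.cos(slope_rad)
    return resisting / driving
```

Cells where FS drops below 1 at the modelled wetness are flagged as prone to instability, which is what the final instability maps classify.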
Procedia PDF Downloads 181
5400 A Business Model Design Process for Social Enterprises: The Critical Role of the Environment
Authors: Hadia Abdel Aziz, Raghda El Ebrashi
Abstract:
Business models are shaped by their design space, i.e., the environment they are designed to be implemented in. The rapidly changing economic, technological, political, regulatory, and market environment severely affects business logic. This is particularly true for social enterprises, whose core mission is to transform their environments; thus, their whole business logic revolves around the interchange between the enterprise and the environment. The context in which a social business operates imposes business design constraints while, at the same time, opening up new design opportunities. It is also affected to a great extent by the impact that successful enterprises generate: a continuous loop of interaction that needs to be managed through a dynamic capability in order to generate a lasting, powerful impact. This conceptual research synthesizes and analyzes literature on social enterprise, social enterprise business models, business model innovation, business model design, and open system theory to propose a new business model design process for social enterprises that takes into account the critical role of environmental factors. This process would help the social enterprise develop a dynamic capability that ensures the alignment of its business model with its environmental context, thus maximizing its probability of success.
Keywords: social enterprise, business model, business model design, business model environment
Procedia PDF Downloads 377
5399 Artificial Neural Network Regression Modelling of GC/MS Retention of Terpenes Present in Satureja montana Extracts Obtained by Supercritical Carbon Dioxide
Authors: Strahinja Kovačević, Jelena Vladić, Senka Vidović, Zoran Zeković, Lidija Jevrić, Sanja Podunavac Kuzmanović
Abstract:
Supercritical extracts of the highly valued medicinal plant Satureja montana were prepared by supercritical carbon dioxide extraction in the carbon dioxide pressure range from 125 to 350 bar and the temperature range from 40 to 60°C. Chemical profiles (aromatic constituents) of the S. montana extracts were obtained using the GC/MS method of analysis. Self-training artificial neural networks were applied to predict the retention times of the analyzed terpenes in the GC/MS system. The best ANN model obtained was a multilayer perceptron (MLP 11-11-1), with tanh hidden activation and identity output activation, trained with the Broyden–Fletcher–Goldfarb–Shanno algorithm. Correlation measures of the obtained network were the following: R(training) = 0.9975, R(test) = 0.9971, and R(validation) = 0.9999. The comparison of the experimental and predicted retention times of the analyzed compounds showed very high correlation (R = 0.9913) and significant predictive power of the established neural network.
Keywords: ANN regression, GC/MS, Satureja montana, terpenes
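The forward pass of the reported MLP 11-11-1 topology (tanh hidden activation, identity output) can be sketched as follows; the weights passed in are placeholders, not the trained retention-time model.

```python
import math

def mlp_predict(x, w_hidden, b_hidden, w_out, b_out):
    """Forward pass of an MLP with tanh hidden units and an identity output,
    matching the reported 11-11-1 topology. w_hidden is a list of weight rows
    (one per hidden unit); the weights here are placeholders, not the
    trained retention-time model."""
    hidden = [math.tanh(sum(wi * xi for wi, xi in zip(row, x)) + b)
              for row, b in zip(w_hidden, b_hidden)]
    return sum(wo * h for wo, h in zip(w_out, hidden)) + b_out
```

The identity output leaves the predicted retention time unbounded, which is appropriate for a regression target, while tanh keeps the hidden representation bounded.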
Procedia PDF Downloads 457
5398 Perceptions of Educators on the Learners’ Youngest Age for the Introduction of ICTs in Schools: A Personality Theory Approach
Authors: Kayode E. Oyetade, Seraphin D. Eyono Obono
Abstract:
Age ratings are very helpful in providing parents with relevant information for the purchase and use of digital technologies by children; this is why the absence of defined age ratings for the use of ICTs by children in schools is a major concern. This problem serves as the motivation for this study, whose aim is to examine the factors affecting the perceptions of educators on the learners’ youngest age for the introduction of ICTs in schools. This aim is achieved through two types of research objectives: the identification and design of theories and models on age ratings, and the empirical testing of such theories and models in a survey of educators from the Camperdown district of the South African KwaZulu-Natal province. A questionnaire was used for the collection of the survey data, whose validity and reliability were checked in SPSS prior to its descriptive and correlative quantitative analysis. The main hypothesis supporting this research is the association between the demographics of educators, their personality, and their perceptions on the learners’ youngest age for the introduction of ICTs in schools, as claimed by existing research; except that the present study looks at personality from three dimensions: self-actualized personalities, fully functioning personalities, and healthy personalities. This hypothesis was fully confirmed by the empirical study, except for the demographic factor, where only the educators’ grade or class was found to be associated with the personality of educators.
Keywords: age ratings, educators, e-learning, personality theories
Procedia PDF Downloads 241
5397 Optimization of Element Type for FE Model and Verification of Analyses with Physical Tests
Authors: Mustafa Tufekci, Caner Guven
Abstract:
In the automotive industry, sliding door systems, which also serve as body closures, are safety members. Extreme product tests are performed to prevent failures during the design process, but conducting these tests experimentally results in high costs. Finite element analysis is an effective tool for the design process. These analyses are used before production of a prototype to validate the design against customer requirements, saving a substantial amount of time and cost. A finite element model is created for geometries designed in 3D CAD programs. Different element types, such as bar, shell, and solid, can be used for creating the mesh model. A cheaper model can be created through the selection of element type, but the combination of element types used in the model, the number and geometry of the elements, and the degrees of freedom all affect the analysis result. The sliding door system is a good example for applying these methods. Structural analysis was performed for the sliding door mechanism using FE models, and physical tests with the same boundary conditions as the FE models were also performed. A comparison study for these element types was carried out with respect to the test and analysis results, and the optimum combination was achieved.
Keywords: finite element analysis, sliding door mechanism, element type, structural analysis
Procedia PDF Downloads 332
5396 Artificial Neural Networks and Hidden Markov Model in Landslides Prediction
Authors: C. S. Subhashini, H. L. Premaratne
Abstract:
Landslides are the most recurrent and prominent disaster in Sri Lanka. Sri Lanka has been subjected to a number of extreme landslide disasters that resulted in significant loss of life, material damage, and distress. It is necessary to explore solutions for preparedness and mitigation to reduce the recurrent losses associated with landslides. Artificial Neural Networks (ANNs) and Hidden Markov Models (HMMs) are now widely used in many computer applications spanning multiple domains. This research examines the effectiveness of Artificial Neural Networks and Hidden Markov Models in landslide prediction and the possibility of applying this modern technology to predict landslides in a prominent geographical area in Sri Lanka. A thorough survey was conducted with the participation of resource persons from several national universities in Sri Lanka to identify and rank the influencing factors for landslides. A landslide database was created using existing topographic, soil, drainage, and land cover maps together with historical data. The landslide-related factors, which include external factors (rainfall and number of previous occurrences) and internal factors (soil material, geology, land use, curvature, soil texture, slope, aspect, soil drainage, and soil effective thickness), were extracted from the landslide database. These factors are used to assess the possibility of landslide occurrence using an ANN and an HMM. Each model acquires the relationship between the landslide factors and the hazard index during the training session. The models, with the landslide-related factors as inputs, are trained to predict three classes, namely ‘landslide occurs’, ‘landslide does not occur’, and ‘landslide likely to occur’. Once trained, the models are able to predict the most likely class for the prevailing data.
Finally, the two models were compared with regard to prediction accuracy, false acceptance rate, and false rejection rate. This research indicates that the Artificial Neural Network can be used as a decision support system to predict landslides more efficiently and effectively than the Hidden Markov Model.
Keywords: landslides, influencing factors, neural network model, hidden Markov model
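The HMM side of such a classifier can score an observation sequence with the forward algorithm and compare likelihoods across class-specific models; the toy two-state model below is illustrative, not the trained Sri Lanka model.

```python
def hmm_forward(obs, states, start_p, trans_p, emit_p):
    """Forward algorithm: total likelihood of an observation sequence under
    an HMM. A classifier can compare such likelihoods across per-class models."""
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: emit_p[s][o] * sum(alpha[p] * trans_p[p][s] for p in states)
                 for s in states}
    return sum(alpha.values())

# Toy two-state model (illustrative probabilities only):
STATES = ('stable', 'unstable')
START = {'stable': 0.7, 'unstable': 0.3}
TRANS = {'stable': {'stable': 0.8, 'unstable': 0.2},
         'unstable': {'stable': 0.4, 'unstable': 0.6}}
EMIT = {'stable': {'dry': 0.9, 'wet': 0.1},
        'unstable': {'dry': 0.3, 'wet': 0.7}}

likelihood = hmm_forward(['dry', 'wet', 'wet'], STATES, START, TRANS, EMIT)
```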
Procedia PDF Downloads 387
5395 A Distinct Method Based on Mamba-Unet for Brain Tumor Image Segmentation
Authors: Djallel Bouamama, Yasser R. Haddadi
Abstract:
Accurate brain tumor segmentation is crucial for diagnosis and treatment planning, yet it remains a challenging task due to the variability in tumor shapes and intensities. This paper introduces a distinct approach to brain tumor image segmentation by leveraging an advanced architecture known as Mamba-Unet. Building on the well-established U-Net framework, Mamba-Unet incorporates distinct design enhancements to improve segmentation performance. Our proposed method integrates a multi-scale attention mechanism and a hybrid loss function to effectively capture fine-grained details and contextual information in brain MRI scans. Using a comprehensive dataset of annotated brain MRI scans, we demonstrate that Mamba-Unet significantly enhances segmentation accuracy compared to conventional U-Net models. Quantitative evaluations reveal that Mamba-Unet surpasses traditional U-Net architectures and other contemporary segmentation models in terms of Dice coefficient, sensitivity, and specificity. The improvements are attributed to the method's ability to better manage class imbalance and resolve complex tumor boundaries. This work advances the state of the art in brain tumor segmentation and holds promise for improving clinical workflows and patient outcomes through more precise and reliable tumor detection.
Keywords: brain tumor classification, image segmentation, CNN, U-NET
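The evaluation metrics named above (Dice coefficient, sensitivity, specificity) are computed from binary masks as follows:

```python
def dice_coefficient(pred, truth):
    """Dice = 2 * |P intersect T| / (|P| + |T|) over flattened 0/1 masks."""
    inter = sum(p * t for p, t in zip(pred, truth))
    denom = sum(pred) + sum(truth)
    return 2.0 * inter / denom if denom else 1.0

def sensitivity_specificity(pred, truth):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    tp = sum(1 for p, t in zip(pred, truth) if p == 1 and t == 1)
    fn = sum(1 for p, t in zip(pred, truth) if p == 0 and t == 1)
    tn = sum(1 for p, t in zip(pred, truth) if p == 0 and t == 0)
    fp = sum(1 for p, t in zip(pred, truth) if p == 1 and t == 0)
    return tp / (tp + fn), tn / (tn + fp)
```

Dice weights overlap relative to the two mask sizes, which is why it is preferred over plain accuracy when the tumor occupies only a small fraction of the scan.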
Procedia PDF Downloads 47
5394 mKDNAD: A Network Flow Anomaly Detection Method Based on Multi-Teacher Knowledge Distillation
Abstract:
Anomaly detection models for network flow based on machine learning have poor detection performance under extremely unbalanced training data conditions, as well as slow detection speed and large resource consumption when deployed on network edge devices. Embedding multi-teacher knowledge distillation (mKD) in anomaly detection can transfer knowledge from multiple teacher models to a single student model. Inspired by this, we propose a state-of-the-art model, mKDNAD, to improve detection performance. mKDNAD mines and integrates the knowledge of the one-dimensional sequences and two-dimensional images implicit in network flow to improve detection accuracy on small-sample classes. The multi-teacher knowledge distillation method guides the training of the student model, thus speeding up the model's detection and reducing the number of model parameters. Experiments on the CICIDS2017 dataset verify the improvements of our method in detection speed and in detection accuracy when dealing with small-sample classes.
Keywords: network flow anomaly detection (NAD), multi-teacher knowledge distillation, machine learning, deep learning
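The core of multi-teacher distillation, blending softened teacher distributions into one target for the student, can be sketched as follows; this is a generic mKD sketch, not the mKDNAD implementation itself.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-softened softmax."""
    exps = [math.exp(v / temperature) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

def multi_teacher_soft_targets(teacher_logits, temperature, weights=None):
    """Blend the softened output distributions of several teachers into one
    target distribution (uniform teacher weights by default)."""
    n = len(teacher_logits)
    weights = weights or [1.0 / n] * n
    dists = [softmax(lg, temperature) for lg in teacher_logits]
    return [sum(w * d[i] for w, d in zip(weights, dists))
            for i in range(len(dists[0]))]

def distillation_loss(student_logits, soft_targets, temperature):
    """Cross-entropy between the blended teacher targets and the student's
    softened predictions; minimizing it transfers the teachers' knowledge."""
    student = softmax(student_logits, temperature)
    return -sum(t * math.log(s) for t, s in zip(soft_targets, student))
```

Because only the small student runs at inference time, the edge device gets the teachers' combined knowledge at a fraction of the parameter count.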
Procedia PDF Downloads 128
5393 Behavior Factors Evaluation for Reinforced Concrete Structures
Authors: Muhammad Rizwan, Naveed Ahmad, Akhtar Naeem Khan
Abstract:
Seismic behavior factors are evaluated for the performance assessment of low-rise reinforced concrete (RC) frame structures, based on an experimental study involving unidirectional dynamic shake table testing of two 1/3-scale two-storey frames: a code-conforming special moment resisting frame (SMRF) model and a non-compliant model of similar characteristics but built in low-strength concrete. The models were subjected to a scaled accelerogram record of the 1994 Northridge earthquake, deforming the test models to the final collapse stage in order to obtain the structural response parameters. The fully compliant model exhibited a more stable beam-sway response, experiencing beam flexural yielding and ground-storey column base yielding upon being subjected to 100% of the record. The response modification factors (R factors) obtained for the code-compliant and deficient prototype structures were 7.5 and 4.5, respectively, which is about 10% and 40% less than the UBC-97 specified value for special moment resisting reinforced concrete frame structures.
Keywords: Northridge 1994 earthquake, reinforced concrete frame, response modification factor, shake table testing
Procedia PDF Downloads 177
5392 Determination of Inflow Performance Relationship for Naturally Fractured Reservoirs: Numerical Simulation Study
Authors: Melissa Ramirez, Mohammad Awal
Abstract:
The Inflow Performance Relationship (IPR) of a well is a relation between the oil production rate and the flowing bottom-hole pressure. This relationship is an important tool for petroleum engineers to understand and predict well performance. In the petroleum industry, IPR correlations are used to design and evaluate well completions, optimize well production, and design artificial lift. The most commonly used IPR correlation models, Vogel and Wiggins, are applicable to homogeneous and isotropic reservoir data. In this work, a new IPR model is developed to determine the inflow performance relationship of oil wells in a naturally fractured reservoir. A 3D black-oil reservoir simulator is used to develop the oil mobility function for the studied reservoir. Based on simulation runs, four flow rates are run to record the oil saturation and calculate the relative permeability for a naturally fractured reservoir. The new method uses the result of a well test analysis along with permeability and pressure-volume-temperature data in the fluid flow equations to obtain the oil mobility function. Comparisons between the new method and two popular correlations for non-fractured reservoirs indicate the necessity of developing and using an IPR correlation specifically for fractured reservoirs.
Keywords: inflow performance relationship, mobility function, naturally fractured reservoir, well test analysis
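Vogel's correlation, one of the non-fractured baselines the new model is compared against, has a simple closed form:

```python
def vogel_rate(q_max, p_reservoir, p_wf):
    """Vogel's classic IPR correlation for solution-gas-drive wells:
        q / q_max = 1 - 0.2 * (pwf / pr) - 0.8 * (pwf / pr)^2
    where pr is the average reservoir pressure and pwf the flowing
    bottom-hole pressure."""
    r = p_wf / p_reservoir
    return q_max * (1.0 - 0.2 * r - 0.8 * r * r)
```

Sweeping pwf from reservoir pressure down to zero traces the IPR curve: zero rate at pwf = pr, maximum rate at pwf = 0.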
Procedia PDF Downloads 289
5391 Smart Services for Easy and Retrofittable Machine Data Collection
Authors: Till Gramberg, Erwin Gross, Christoph Birenbaum
Abstract:
This paper presents the approach of the Easy2IoT research project. Easy2IoT aims to enable companies in the sheet metal prefabrication and processing industry to enter the Industrial Internet of Things (IIoT) with a low-threshold, cost-effective approach. It focuses on the development of physical hardware and software to easily capture machine activities from a sawing machine, benefiting various stakeholders in the SME value chain, including machine operators, tool manufacturers, and service providers. The methodological approach of Easy2IoT includes an in-depth requirements analysis and customer interviews with stakeholders along the value chain. Based on these insights, actions, requirements, and potential solutions for smart services are derived. The focus is on providing actionable recommendations, competencies, and easy integration through no-/low-code applications to facilitate implementation and connectivity within production networks. At the core of the project is a novel, non-invasive measurement and analysis system that can be easily deployed and made IIoT-ready. This system collects machine data without interfering with the machines themselves, by non-invasively measuring the tension on a sawing machine. The collected data is then connected and analyzed using artificial intelligence (AI) to provide smart services through a platform-based application. Three smart services are being developed within Easy2IoT to provide immediate benefits to users. The first is wear part and product material condition monitoring with predictive maintenance for sawing processes: the non-invasive measurement system enables the monitoring of tool wear, such as saw blades, and of the quality of consumables and materials, which service providers and machine operators can use to optimize maintenance and reduce downtime and material waste. The second is optimization of Overall Equipment Effectiveness (OEE) by monitoring machine activity.
The non-invasive system tracks machining times, setup times, and downtime to identify opportunities for OEE improvement and reduce unplanned machine downtime. The final smart service estimates CO2 emissions for connected machines: CO2 emissions are calculated for the entire life of the machine and for individual production steps, based on captured power consumption data. This information supports energy management and product development decisions. The key to Easy2IoT is its modular and easy-to-use design. The non-invasive measurement system is universally applicable and does not require specialized knowledge to install. The platform application allows easy integration of various smart services and provides a self-service portal for activation and management. Innovative business models will also be developed to promote the sustainable use of the collected machine activity data. The project addresses the digitalization gap between large enterprises and SMEs. Easy2IoT provides SMEs with a concrete toolkit for IIoT adoption, facilitating the digital transformation of smaller companies, e.g. through the retrofitting of existing machines.
Keywords: smart services, IIoT, IIoT platform, Industrie 4.0, big data
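The OEE and CO2 figures described above reduce to simple arithmetic; a minimal sketch, in which the grid emission factor is an assumed placeholder rather than a value from the project:

```python
def availability(planned_time_h, downtime_h):
    """Fraction of planned production time the machine was actually running."""
    return (planned_time_h - downtime_h) / planned_time_h

def oee(availability_ratio, performance_ratio, quality_ratio):
    """OEE = availability x performance x quality, each expressed in [0, 1]."""
    return availability_ratio * performance_ratio * quality_ratio

def co2_emissions_kg(energy_kwh, grid_factor_kg_per_kwh=0.4):
    """CO2 from captured power consumption; the grid emission factor here is
    an assumed placeholder, not a value from the project."""
    return energy_kwh * grid_factor_kg_per_kwh
```

The captured machining, setup, and downtime intervals feed the availability term directly, which is how a retrofit sensor improves the OEE estimate without touching the machine controller.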
Procedia PDF Downloads 795390 The Analysis of Space Syntax Used in Exploring the Development of Hangzhou City's Centrality
Authors: Liu Junzhu
Abstract:
In contemporary China, cities are expanding at an amazing speed, and because of the interference of unexpected events, their spatial structure can change within a short time. This often undermines the vitality of new urban districts, and unfortunately this phenomenon is very common. On the one hand, it fails to achieve the goals of city planning; on the other hand, it is unfavourable to the sustainable development of the city. Bill Hillier's theory of Space Syntax reveals the organizational pattern of each space; it explains the characteristics of urban spatial patterns and their transformation rules from the point of view of self-organization in systems, and it provides confirmatory and predictive methods for buildings and cities. This paper used an axial model to summarize Hangzhou City's spatial structure and, through computer analysis with Space Syntax, enhanced the comprehensive understanding of its macroscopic space and environment, spatial structure, development trends, etc. From this, it helps us to understand more clearly the operating laws of the urban system, Hangzhou City's spatial pattern and the indirect social effects it has had, so that the planning process and its policies can comply with the tendency of urban development and shape our cities' future sustainably.Keywords: sustainable urban design, space syntax, spatial network, segment angular analysis, social inclusion
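The axial-model analysis mentioned above is ultimately built on topological depth in the axial graph. A minimal sketch on a toy axial map; the graph and the relative-asymmetry normalisation follow the usual Space Syntax conventions rather than this paper's specific computation:

```python
from collections import deque

def mean_depth(adj, start):
    """Mean topological depth from one axial line to all others (BFS)."""
    depth = {start: 0}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for nbr in adj[node]:
            if nbr not in depth:
                depth[nbr] = depth[node] + 1
                queue.append(nbr)
    k = len(adj)
    return sum(depth.values()) / (k - 1)

# Toy axial map: line 0 crosses lines 1 and 2; line 3 only touches line 2.
axial = {0: [1, 2], 1: [0], 2: [0, 3], 3: [2]}
md = mean_depth(axial, 0)
ra = 2 * (md - 1) / (len(axial) - 2)  # relative asymmetry, per Hillier & Hanson
print(md, round(ra, 3))
```

Lines with low relative asymmetry are the most "integrated", which is how centrality is read off an axial map.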
Procedia PDF Downloads 4675389 Approach for the Mathematical Calculation of the Damping Factor of Railway Bridges with Ballasted Track
Authors: Andreas Stollwitzer, Lara Bettinelli, Josef Fink
Abstract:
The expansion of the high-speed rail network over the past decades has resulted in new challenges for engineers, including traffic-induced resonance vibrations of railway bridges. Excessive resonance-induced speed-dependent accelerations of railway bridges during high-speed traffic can lead to negative consequences such as fatigue symptoms, distortion of the track, destabilisation of the ballast bed, and potentially even derailment. A realistic prognosis of bridge vibrations during high-speed traffic must not only rely on the right choice of an adequate calculation model for both bridge and train but first and foremost on the use of dynamic model parameters which reflect reality appropriately. However, comparisons between measured and calculated bridge vibrations are often characterised by considerable discrepancies, with dynamic calculations typically overestimating the actual responses and therefore leading to uneconomical results. This gap between measurement and calculation constitutes a complex research issue and can be traced to several causes. One major cause is found in the dynamic properties of the ballasted track, more specifically in the persisting, substantial uncertainties regarding the consideration of the ballasted track (mechanical model and input parameters) in dynamic calculations. Furthermore, the discrepancy is particularly pronounced concerning the damping values of the bridge, as conservative values have to be used in the calculations due to normative specifications and lack of knowledge. By using a large-scale test facility, the analysis of the dynamic behaviour of ballasted track has been a major research topic at the Institute of Structural Engineering/Steel Construction at TU Wien in recent years. This highly specialised test facility is designed for isolated research of the ballasted track's dynamic stiffness and damping properties – independent of the bearing structure. 
Several mechanical models for the ballasted track consisting of one or more continuous spring-damper elements were developed based on the knowledge gained. These mechanical models can subsequently be integrated into bridge models for dynamic calculations. Furthermore, based on measurements at the test facility, model-dependent stiffness and damping parameters were determined for these mechanical models. As a result, realistic mechanical models of the railway bridge with different levels of detail and sufficiently precise characteristic values are available for bridge engineers. Besides that, this contribution also presents another practical application of such a bridge model: based on the bridge model, determination equations for the damping factor (as Lehr's damping factor) can be derived. This approach constitutes a first-time method that makes the damping factor of a railway bridge calculable. A comparison of this mathematical approach with measured dynamic parameters of existing railway bridges illustrates, on the one hand, the apparent deviation between normatively prescribed and in-situ measured damping factors. On the other hand, it also shows that this new approach, which makes it possible to calculate the damping factor, provides results that are close to reality and thus offers potential for minimising the discrepancy between measurement and calculation.Keywords: ballasted track, bridge dynamics, damping, model design, railway bridges
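For context, the damping factor (Lehr's damping ratio) is often estimated experimentally from free-decay measurements via the logarithmic decrement; the sketch below illustrates that generic estimate, not the paper's bridge-model derivation, and the decay values are invented:

```python
import math

def lehr_damping_from_decay(a0, an, n):
    """Estimate Lehr's damping ratio zeta from two free-decay peak
    amplitudes a0 and an that lie n periods apart (logarithmic decrement)."""
    delta = math.log(a0 / an) / n
    return delta / math.sqrt(4 * math.pi**2 + delta**2)

# Illustrative decay: the amplitude halves over 10 oscillation periods.
zeta = lehr_damping_from_decay(1.0, 0.5, 10)
print(round(zeta * 100, 3), "% of critical damping")
```

Values around 1% of critical damping, as in this invented example, are of the order commonly measured in situ on railway bridges, which is well below typical normative assumptions.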
Procedia PDF Downloads 1675388 Effects of Machining Parameters on the Surface Roughness and Vibration of the Milling Tool
Authors: Yung C. Lin, Kung D. Wu, Wei C. Shih, Jui P. Hung
Abstract:
High speed and high precision machining have become the most important technologies in the manufacturing industry. The surface roughness of high precision components is regarded as an important characteristic of product quality. However, machining chatter can damage the machined surface and restrict process efficiency. Therefore, the selection of appropriate cutting conditions is important to prevent the occurrence of chatter. In addition, vibration of the spindle tool also affects the surface quality, which implies that surface precision can be controlled by monitoring the vibration of the spindle tool. Based on this concept, this study aimed to investigate the influence of the machining conditions on the surface roughness and the vibration of the spindle tool. To this end, a series of machining tests were conducted on aluminum alloy. In the tests, the vibration of the spindle tool was measured using acceleration sensors. The surface roughness of the machined parts was examined using a white light interferometer. The response surface methodology (RSM) was employed to establish mathematical models for predicting surface finish and tool vibration, respectively. The correlation between the surface roughness and spindle tool vibration was also analyzed by ANOVA. According to the machining tests, machined surfaces with or without chatter were marked on the stability lobes diagram as verification of the machining conditions. Using multivariable regression analysis, the mathematical models for predicting the surface roughness and tool vibration were developed based on the machining parameters: cutting depth (a), feed rate (f) and spindle speed (s). The predicted roughness agrees well with the measured roughness, with an average percentage error of 10%. The average percentage error of the tool vibrations between the measurements and the predictions of the mathematical model is about 7.39%. 
In addition, the tool vibration under various machining conditions was found to correlate positively with the surface roughness (r=0.78). In conclusion, mathematical models were successfully developed for predicting the surface roughness and the vibration level of the spindle tool under different cutting conditions, which can help to select appropriate cutting parameters and to monitor machining conditions in order to achieve high surface quality in milling operations.Keywords: machining parameters, machining stability, regression analysis, surface roughness
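The multivariable regression step described above can be sketched as a least-squares fit of a log-linear model in the parameters a, f and s; all data values below are invented for illustration and are not the paper's measurements:

```python
import numpy as np

# Hypothetical runs: cutting depth a (mm), feed f (mm/tooth),
# spindle speed s (rpm), and measured surface roughness Ra (um).
a = np.array([0.5, 0.5, 1.0, 1.0, 1.5, 1.5, 2.0, 2.0])
f = np.array([0.05, 0.10, 0.05, 0.10, 0.05, 0.10, 0.05, 0.10])
s = np.array([4000, 6000, 6000, 4000, 4000, 6000, 6000, 4000])
ra = np.array([0.32, 0.55, 0.41, 0.68, 0.45, 0.74, 0.52, 0.79])

# First-order model in log space: ln Ra = c0 + c1 ln a + c2 ln f + c3 ln s
X = np.column_stack([np.ones_like(a), np.log(a), np.log(f), np.log(s)])
coef, *_ = np.linalg.lstsq(X, np.log(ra), rcond=None)
pred = np.exp(X @ coef)
mape = np.mean(np.abs(pred - ra) / ra) * 100  # average percentage error
print(coef.round(3), round(mape, 2))
```

The same average-percentage-error measure is what the abstract reports (10% for roughness, 7.39% for vibration) for its own fitted models.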
Procedia PDF Downloads 2355387 Simulation of Scaled Model of Tall Multistory Structure: Raft Foundation for Experimental and Numerical Dynamic Studies
Authors: Omar Qaftan
Abstract:
Earthquakes can cause tremendous loss of human life and can result in severe damage to a range of civil engineering structures, especially tall buildings. Predicting the response of a multistory structure subjected to earthquake loading is a complex task, and it must be studied through physical and numerical modelling. In many circumstances, scale models on a shaking table are a more economical option than comparable full-scale tests. A shaking table apparatus is a powerful tool that offers the possibility of understanding the actual behaviour of structural systems under earthquake loading. A set of scaling relations is required to predict the behaviour of the full-scale structure. Selecting the scale factors is the most important step in the simulation of the prototype by the scaled model. In this paper, the principles of the scaled modelling procedure are explained in detail, and the simulation of a scaled multi-storey concrete structure for dynamic studies is investigated. A procedure for a complete dynamic simulation analysis is investigated experimentally and numerically with a scale factor of 1/50. The frequency-domain behaviour and lateral displacements of both the numerical and experimental scaled models are determined. The procedure accounts for the actual dynamic behaviour of both the full-size prototype structure and the scaled model. The procedure is adapted to determine the effects of a tall multi-storey structure on a raft foundation. Four generated accelerograms, in compliance with EC8, were used as inputs for the time history motions. The experimental results, expressed in terms of displacements and accelerations, are compared with those obtained from a conventional fixed-base numerical model. All four time histories were applied to both the experimental and numerical models, and the experimental results show acceptable accuracy compared with the numerical model output. 
Therefore, this modelling methodology is valid and suitable for different shaking table experiments.Keywords: structure, raft, soil, interaction
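The scaling relations mentioned above depend on the chosen similitude law. As one common choice for shaking-table work (gravity, and hence acceleration, cannot be scaled), the sketch below derives the time and frequency factors for a 1/50 length scale; the paper's exact similitude law is not restated here, so this is illustrative only:

```python
import math

def similitude_factors(length_scale):
    """Scale factors for a gravity-consistent (acceleration factor = 1)
    similitude, one common choice for shaking-table models; others exist."""
    lam = length_scale                   # model / prototype length ratio
    return {
        "length": lam,
        "acceleration": 1.0,             # gravity cannot be scaled
        "time": math.sqrt(lam),          # from a = L / T^2 with a fixed
        "frequency": 1.0 / math.sqrt(lam),
        "velocity": math.sqrt(lam),      # v = a * t
    }

factors = similitude_factors(1 / 50)
# A 2 s prototype earthquake pulse lasts 2 * sqrt(1/50) s on the table.
print(round(2 * factors["time"], 3))
```

Under this law the 1/50 model vibrates at frequencies about seven times higher than the prototype, which is why the input accelerograms must be time-compressed before being applied to the table.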
Procedia PDF Downloads 1375386 A Study on the Assessment of Prosthetic Infection after Total Knee Replacement Surgery
Authors: Chun-Lang Chang, Chun-Kai Liu
Abstract:
In this study, patients who had undergone total knee replacement surgery, drawn from the 2010 National Health Insurance database, were adopted as the study participants. The important factors were screened and selected through literature collection and interviews with physicians. The weights of the factors were obtained through the Cross Entropy Method (CE), Genetic Algorithm Logistic Regression (GALR), and Particle Swarm Optimization (PSO). In addition, the weights from the respective algorithms, coupled with Excel VBA, were used to construct the Case-Based Reasoning (CBR) system. Statistical tests show that GALR and PSO produced no significant differences, and the accuracy of both models was above 97%. Moreover, the area under the ROC curve for these two models also exceeded 0.87. This study shall serve as a reference for medical staff in the clinical assessment of infections, in order to effectively enhance medical service quality and efficiency, avoid unnecessary medical waste, and contribute substantially to resource allocation in medical institutions.Keywords: Case Based Reasoning, Cross Entropy Method, Genetic Algorithm Logistic Regression, Particle Swarm Optimization, Total Knee Replacement Surgery
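The retrieval step of a CBR system of this kind can be sketched as weighted nearest-neighbour matching over factor weights; the weights, attributes and cases below are invented placeholders, not those obtained from CE, GALR or PSO:

```python
def weighted_similarity(case_a, case_b, weights):
    """Weighted similarity between two cases with attributes scaled to [0, 1]."""
    total = sum(weights.values())
    score = sum(w * (1.0 - abs(case_a[k] - case_b[k]))
                for k, w in weights.items())
    return score / total

def retrieve(query, case_base, weights):
    """Return the stored case most similar to the query."""
    return max(case_base, key=lambda c: weighted_similarity(query, c, weights))

# Invented, normalised attributes: age, surgery duration, comorbidity index.
weights = {"age": 0.5, "duration": 0.3, "comorbidity": 0.2}
cases = [
    {"age": 0.2, "duration": 0.4, "comorbidity": 0.1, "infected": False},
    {"age": 0.8, "duration": 0.7, "comorbidity": 0.9, "infected": True},
]
query = {"age": 0.75, "duration": 0.6, "comorbidity": 0.8}
print(retrieve(query, cases, weights)["infected"])
```

The algorithm-derived weights play exactly this role in retrieval: they decide how strongly each clinical factor counts when matching a new patient to past cases.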
Procedia PDF Downloads 3265385 Forecasting Age-Specific Mortality Rates and Life Expectancy at Births for Malaysian Sub-Populations
Authors: Syazreen N. Shair, Saiful A. Ishak, Aida Y. Yusof, Azizah Murad
Abstract:
In this paper, we forecast age-specific Malaysian mortality rates and life expectancy at birth by gender and ethnic group, including Malay, Chinese and Indian. Two mortality forecasting models are adopted: the original Lee-Carter model and its recent modified version, the product-ratio coherent model. While the former forecasts the mortality rates for each sub-population independently, the latter accounts for the relationship between sub-populations. The evaluation of both models is performed using out-of-sample forecast errors: mean absolute percentage errors (MAPE) for mortality rates and mean forecast errors (MFE) for life expectancy at birth. The best model is then used to perform long-term forecasts up to the year 2030, the year when Malaysia is expected to become an aged nation. Results suggest that, in terms of overall accuracy, the product-ratio model performs better than the original Lee-Carter model. The association with the lower-mortality group (Chinese) in the sub-population model can improve the forecasts for the higher-mortality groups (Malay and Indian).Keywords: coherent forecasts, life expectancy at births, Lee-Carter model, product-ratio model, mortality rates
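The original Lee-Carter model writes log m(x,t) = a_x + b_x k_t and is classically fitted by a singular value decomposition of the centred log-rates. A minimal sketch on synthetic data (the rates are invented, not the Malaysian series):

```python
import numpy as np

rng = np.random.default_rng(0)
n_ages, n_years = 5, 20
a_true = -6.0 + 0.08 * np.arange(n_ages)
b_true = np.full(n_ages, 1.0 / n_ages)          # b_x sums to 1 by convention
k_true = -float(n_ages) * (np.linspace(0.0, 2.0, n_years) - 1.0)  # sums to 0
log_m = (a_true[:, None] + np.outer(b_true, k_true)
         + rng.normal(0.0, 0.01, (n_ages, n_years)))  # synthetic log-rates

# Lee-Carter fit: a_x = row means; (b_x, k_t) from the rank-1 SVD term.
a_hat = log_m.mean(axis=1)
U, S, Vt = np.linalg.svd(log_m - a_hat[:, None], full_matrices=False)
b_hat = U[:, 0] / U[:, 0].sum()                 # impose sum(b) = 1
k_hat = S[0] * Vt[0] * U[:, 0].sum()            # rescale k to match
print(np.round(a_hat, 2))
```

The product-ratio coherent variant applies the same machinery to the geometric mean of the sub-population rates and to each sub-population's ratio to that mean, which is what ties the ethnic groups' forecasts together.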
Procedia PDF Downloads 2235384 Application of Stochastic Models to Annual Extreme Streamflow Data
Authors: Karim Hamidi Machekposhti, Hossein Sedghi
Abstract:
This study was designed to find the best stochastic model (using time series analysis) for the annual extreme streamflow (peak and maximum streamflow) of the Karkheh River in Iran. The Auto-regressive Integrated Moving Average (ARIMA) model was used to simulate these series and forecast future values. For the analysis, annual extreme streamflow data of the Jelogir Majin station (above the Karkheh dam reservoir) for the years 1958–2005 were used. A visual inspection of the time plot shows a slight increasing trend; therefore, the series is not stationary. The non-stationarity observed in the Auto-Correlation Function (ACF) and Partial Auto-Correlation Function (PACF) plots of annual extreme streamflow was removed using first-order differencing (d=1) prior to the development of the ARIMA model. The ARIMA(4,1,1) model was found to be the most suitable for simulating annual extreme streamflow for the Karkheh River. The model was found to be appropriate for forecasting ten years of annual extreme streamflow and for assisting decision makers in establishing priorities for water demand. The Statistical Analysis System (SAS) and Statistical Package for the Social Sciences (SPSS) codes were used to determine the best model for this series.Keywords: stochastic models, ARIMA, extreme streamflow, Karkheh river
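The stationarity check and first-order differencing described above can be sketched on a synthetic trended series; the flow values below are invented, not the Jelogir Majin record:

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample autocorrelation function of a series, lags 1..max_lag."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    denom = np.sum(x * x)
    return np.array([np.sum(x[:-k] * x[k:]) / denom
                     for k in range(1, max_lag + 1)])

rng = np.random.default_rng(1)
# Synthetic "annual peak flow" with an upward trend, like the described series.
years = np.arange(1958, 2006)
flow = 800.0 + 8.0 * (years - years[0]) + rng.normal(0.0, 50.0, years.size)

diffed = np.diff(flow)               # first-order differencing, d = 1
print(sample_acf(flow, 1).round(2), sample_acf(diffed, 1).round(2))
```

The trended series shows a strongly positive lag-1 autocorrelation that the differenced series loses, which is the signal the ACF/PACF inspection used to choose d=1 before fitting the ARIMA(4,1,1) model.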
Procedia PDF Downloads 1515383 Simulation of the FDA Centrifugal Blood Pump Using High Performance Computing
Authors: Mehdi Behbahani, Sebastian Rible, Charles Moulinec, Yvan Fournier, Mike Nicolai, Paolo Crosetto
Abstract:
Computational Fluid Dynamics blood-flow simulations are increasingly used to develop and validate blood-contacting medical devices. This study shows that numerical simulations can provide additional and accurate estimates of relevant hemodynamic indicators (e.g., recirculation zones or wall shear stresses), which may be difficult and expensive to obtain from in-vivo or in-vitro experiments. The most recent FDA (Food and Drug Administration) benchmark consisted of a simplified centrifugal blood pump model that contains fluid flow features as they are commonly found in these devices, with a clear focus on highly turbulent phenomena. The FDA centrifugal blood pump study is composed of six test cases with different volumetric flow rates ranging from 2.5 to 7.0 liters per minute, pump speeds, and Reynolds numbers ranging from 210,000 to 293,000. Within the frame of this study, different turbulence models were tested, including RANS models (e.g., k-omega, k-epsilon and a Reynolds Stress Model (RSM)) and LES. The partitioners Hilbert, METIS, ParMETIS and SCOTCH were used to create an unstructured mesh of 76 million elements and were compared in terms of their efficiency. Computations were performed on the JUQUEEN BG/Q architecture applying the highly parallel flow solver Code SATURNE, typically using 32768 or more processors in parallel. Visualisations were performed by means of PARAVIEW. All six flow situations could be successfully analysed with the different turbulence models and validated against analytical considerations and by comparison with other databases. The results showed that an RSM represents an appropriate choice with respect to modeling high-Reynolds-number flow cases. In particular, the Rij-SSG (Speziale, Sarkar, Gatski) variant turned out to be a good approach. Visualisation of complex flow features could be obtained, and the flow situation inside the pump could be characterized.Keywords: blood flow, centrifugal blood pump, high performance computing, scalability, turbulence
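For orientation, rotational Reynolds numbers of this magnitude are typically computed from fluid properties and rotor speed. The sketch below uses one common convention, Re = ρωD²/μ, with assumed blood-analog properties and a placeholder rotor diameter; the benchmark's exact definition and values may differ:

```python
import math

def rotational_reynolds(rho, mu, rpm, diameter):
    """Re = rho * omega * D^2 / mu -- one common convention for rotating
    machinery; the benchmark's exact definition may differ."""
    omega = rpm * 2.0 * math.pi / 60.0   # rotor speed in rad/s
    return rho * omega * diameter**2 / mu

# Assumed blood-analog density/viscosity; the 52 mm diameter is a placeholder.
re = rotational_reynolds(1056.0, 3.5e-3, 3500.0, 0.052)
print(f"{re:.3g}")
```

Numbers of order 10^5, as here, are far beyond the laminar regime, which is why turbulence modelling (RANS vs. RSM vs. LES) dominates the benchmark discussion.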
Procedia PDF Downloads 3855382 Parameters Adjustment of the Modified UBCSand Constitutive Model for the Potentially Liquefiable Sands of Santiago de Cali-Colombia
Authors: Daniel Rosero, Johan S. Arana, Sebastian Arango, Alejandro Cruz, Isabel Gomez-Gutierrez, Peter Thomson
Abstract:
Santiago de Cali is located in southwestern Colombia in a high seismic hazard zone. About 50% of the city lies on the banks of the Cauca River, the second most important river in the country, whose alluvial deposits contain potentially liquefiable sands. Among the methods used to study a site's liquefaction potential is the finite element method, which uses constitutive models to simulate the soil response under different load types. Among the different constitutive models, the Modified UBCSand stands out for studying the seismic behavior of sands, and especially the liquefaction phenomenon. In this paper, the dynamic behavior of a potentially liquefiable sand of Santiago de Cali is studied by cyclic triaxial and CPTu tests. Subsequently, the behavior of the sand is simulated using the Modified UBCSand constitutive model, whose parameters are calibrated using the results of the cyclic triaxial and CPTu tests. This is done with the aim of analysing the applicability of the constitutive model to the geotechnical problems associated with liquefaction in the city.Keywords: constitutive model, cyclic triaxial test, dynamic behavior, liquefiable sand, modified ubcsand
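Before advanced constitutive modelling, liquefaction potential is often screened with the simplified Seed-Idriss cyclic stress ratio. The sketch below shows that standard screening formula with purely illustrative inputs, not site data from Santiago de Cali:

```python
def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, depth_m):
    """Simplified Seed-Idriss CSR = 0.65 (a_max/g)(sigma_v/sigma'_v) r_d,
    with the Liao-Whitman stress-reduction factor r_d for shallow depths."""
    if depth_m <= 9.15:
        r_d = 1.0 - 0.00765 * depth_m
    else:
        r_d = 1.174 - 0.0267 * depth_m
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * r_d

# Illustrative layer: 5 m deep, total stress 90 kPa, effective stress 60 kPa,
# peak ground acceleration 0.25 g (values are not site data).
print(round(cyclic_stress_ratio(0.25, 90.0, 60.0, 5.0), 3))
```

Where this screening flags a layer, element-level simulation with a model such as the Modified UBCSand, calibrated against cyclic triaxial and CPTu data, refines the assessment.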
Procedia PDF Downloads 2745381 Some Accuracy Related Aspects in Two-Fluid Hydrodynamic Sub-Grid Modeling of Gas-Solid Riser Flows
Authors: Joseph Mouallem, Seyed Reza Amini Niaki, Norman Chavez-Cussy, Christian Costa Milioli, Fernando Eduardo Milioli
Abstract:
Sub-grid closures for filtered two-fluid models (fTFM), useful in large scale simulations (LSS) of riser flows, can be derived from highly resolved simulations (HRS) with microscopic two-fluid modeling (mTFM). Accurate sub-grid closures require accurate mTFM formulations as well as accurate correlation of the relevant filtered parameters to suitable independent variables. This article deals with both of those issues. The accuracy of mTFM is addressed by assessing the impact of gas sub-grid turbulence on HRS filtered predictions. A gas-turbulence-like effect is artificially inserted by means of a stochastic forcing procedure implemented in the physical space over the momentum conservation equation of the gas phase. The correlation issue is addressed by introducing a three-filtered-variable correlation analysis (three-marker analysis) performed under a variety of different macro-scale conditions typical of risers. While the more elaborate correlation procedure clearly improved accuracy, accounting for gas sub-grid turbulence had no significant impact on the predictions.Keywords: fluidization, gas-particle flow, two-fluid model, sub-grid models, filtered closures
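The three-marker idea, correlating a filtered quantity to three independent filtered variables, can be sketched as a binned conditional average. Everything below (the markers, the target, the bin edges) is an invented stand-in for illustration, not the article's actual closure:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20_000
# Invented stand-ins for three filtered markers: solids fraction, slip
# velocity and filter size, plus a filtered quantity to be closed.
phi = rng.uniform(0.0, 0.4, n)
slip = rng.uniform(0.0, 1.0, n)
fsize = rng.uniform(0.5, 2.0, n)
target = (1.0 - phi) ** 2 * slip / fsize + rng.normal(0.0, 0.02, n)

# Three-marker closure: tabulate the mean of the target in bins of
# (phi, slip, fsize), i.e. correlate it to three independent variables.
edges = [np.linspace(0.0, 0.4, 5), np.linspace(0.0, 1.0, 5),
         np.linspace(0.5, 2.0, 5)]
idx = tuple(np.clip(np.digitize(x, e) - 1, 0, 3)
            for x, e in zip((phi, slip, fsize), edges))
sums = np.zeros((4, 4, 4))
counts = np.zeros((4, 4, 4))
np.add.at(sums, idx, target)
np.add.at(counts, idx, 1.0)
closure = sums / np.maximum(counts, 1.0)  # binned mean = closure table
print(closure.shape)
```

Conditioning on three markers instead of one captures more of the target's variance, which mirrors the accuracy gain the article attributes to its more elaborate correlation procedure.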
Procedia PDF Downloads 1305380 I Can’t Escape the Scars, Even If I Do Get Better”: A Discourse Analysis of Adolescent Talk About Their Self-Harm During Cognitive-Behavioural Therapy Sessions for Major Depressive Disorder
Authors: Anna Kristen
Abstract:
There has been a pronounced increase in societal discourses around adolescent self-harm, yet there is a paucity of literature examining adolescent talk about self-harm that accounts for the sociocultural context. The objective of this study was to explore how adolescents with depression talk about their self-harm, in consideration of both socio-cultural discourses and the therapy context, during Cognitive-Behavioural Therapy (CBT) sessions. Utilizing a sample from the Improving Mood with Psychoanalytic and Cognitive Therapies study, discourse analysis was carried out on audio-recorded CBT sessions. The study established three groups of results: (a) adolescent positioning as stuck in self-harm engagement; (b) adolescent positioning as ambivalent in talk about ceasing self-harm; and (c) adolescent use of stigma discourses in self-harm talk and constructions of self-harm scars. These findings indicate that clinician awareness of adolescent use of language and discourse may inform interventions beyond manualized CBT strategies. These findings are highly relevant in light of research demonstrating that CBT treatment for adolescent depression does not effectively address co-occurring self-harm, and given that self-harm is the most significant risk factor predictive of subsequent suicidal behaviours.Keywords: adolescence, cognitive-behavioral therapy, discourse, self-harm, stigma
Procedia PDF Downloads 2525379 Visualizing the Commercial Activity of a City by Analyzing the Data Information in Layers
Authors: Taras Agryzkov, Jose L. Oliver, Leandro Tortosa, Jose Vicent
Abstract:
This paper aims to demonstrate how network models can be used to understand and deal with some aspects of urban complexity. As is well known, the Theory of Architecture and Urbanism has for decades been using intellectual tools based on the 'sciences of complexity' as a strategy for proposing theoretical approaches to cities and architecture. In this sense, there is a vast literature in which, for instance, network theory is used as an instrument to understand very diverse questions about cities: from their commercial activity to their heritage condition. The contribution of this research consists in adding one step of complexity to this process: instead of working with one single primal graph, as is usually done, we will show how new network models arise from the consideration of two different primal graphs interacting in two layers. When we model an urban network through a mathematical structure like a graph, the city is usually represented by a set of nodes and edges that reproduce its topology, with the data generated or extracted from the city embedded in it. All this information is normally displayed in a single layer. Here, we propose to separate the information into two layers so that we can evaluate the interaction between them. Besides, the two layers may be composed of structures that do not have to coincide: from this bi-layer system, groups of interactions emerge, suggesting reflections and, in consequence, possible actions.Keywords: graphs, mathematics, networks, urban studies
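The bi-layer idea can be sketched with two adjacency matrices over the same node set joined into a supra-adjacency matrix; the toy graphs below are invented for illustration and are not the paper's data:

```python
import numpy as np

# Toy city with 4 locations. Layer A: street topology; layer B: commercial
# relations between the same locations (both invented for illustration).
streets = np.array([[0, 1, 1, 0],
                    [1, 0, 1, 0],
                    [1, 1, 0, 1],
                    [0, 0, 1, 0]], dtype=float)
commerce = np.array([[0, 0, 1, 1],
                     [0, 0, 0, 1],
                     [1, 0, 0, 0],
                     [1, 1, 0, 0]], dtype=float)

# Supra-adjacency of the two-layer system: identity coupling links each
# node to its own copy in the other layer.
n = streets.shape[0]
supra = np.block([[streets, np.eye(n)],
                  [np.eye(n), commerce]])

# Degree aggregated over both layers reveals interactions that neither
# single-layer graph shows on its own.
multi_degree = supra.sum(axis=1)[:n] + supra.sum(axis=1)[n:]
print(multi_degree)
```

Note that the two primal graphs need not coincide, exactly as the abstract states: node 3 is peripheral in the street layer but well connected commercially, and only the coupled system exposes that.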
Procedia PDF Downloads 1875378 Uncovering Underwater Communication for Multi-Robot Applications via CORSICA
Authors: Niels Grataloup, Micael S. Couceiro, Manousos Valyrakis, Javier Escudero, Patricia A. Vargas
Abstract:
This paper benchmarks the underwater communication technologies that can be integrated into a swarm of underwater robots by proposing an underwater robot simulator named CORSICA (Cross platfORm wireleSs communICation simulator). Underwater exploration relies increasingly on the use of mobile robots, called Autonomous Underwater Vehicles (AUVs). These robots are able to reach goals in harsh underwater environments without resorting to human divers. The introduction of swarm robotics in these scenarios would facilitate the accomplishment of complex tasks at lower cost. However, swarm robotics requires communication systems in order to be operational and to exhibit its characteristic non-deterministic behaviour. Inter-robot communication is one of the key challenges in swarm robotics, especially in underwater scenarios, where communication must cope with severe restrictions and perturbations. This paper starts by presenting a list of underwater propagation models for acoustic and electromagnetic waves; it also reviews existing transceivers embedded in current robots and simulators. It then proposes CORSICA, which allows validating choices in terms of protocols and communication strategies, whether for robot-robot or human-robot interactions. The paper finishes with a presentation of possible integrations according to the literature review, and the potential to bring CORSICA to an industrial level.Keywords: underwater simulator, robot-robot underwater communication, swarm robotics, transceiver and communication models
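Among the acoustic propagation models such a review covers, a widely used empirical one combines spreading loss with Thorp's absorption formula. The sketch below shows that standard model; the simulator's actual models and parameters may differ, and the link values are illustrative:

```python
import math

def thorp_absorption_db_per_km(f_khz):
    """Thorp's empirical absorption coefficient (dB/km), frequency in kHz."""
    f2 = f_khz ** 2
    return (0.11 * f2 / (1 + f2) + 44 * f2 / (4100 + f2)
            + 2.75e-4 * f2 + 0.003)

def transmission_loss_db(distance_km, f_khz, k=1.5):
    """Spreading loss (k=1.5: 'practical' spreading) plus absorption, in dB."""
    spreading = 10.0 * k * math.log10(distance_km * 1000.0)
    return spreading + thorp_absorption_db_per_km(f_khz) * distance_km

# Illustrative 25 kHz acoustic modem link over 2 km.
print(round(transmission_loss_db(2.0, 25.0), 1))
```

The steep growth of absorption with frequency is the core constraint such a benchmark must capture: it forces underwater swarms to trade bandwidth against range in a way terrestrial radio links do not.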
Procedia PDF Downloads 304