Search results for: resilience optimization model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19750

14830 A Clustering-Sequencing Approach to the Facility Layout Problem

Authors: Saeideh Salimpour, Sophie-Charlotte Viaux, Ahmed Azab, Mohammed Fazle Baki

Abstract:

The Facility Layout Problem (FLP) is key to the efficient and cost-effective operation of a system. This paper presents a hybrid heuristic- and mathematical-programming-based approach that divides the problem conceptually into those of clustering and sequencing. First, clusters of vertically aligned facilities are formed, which are later sequenced horizontally. The developed methodology yields promising results in comparison to its counterparts in the literature: it minimizes the distances between facilities that interact most with one another and places the facilities with the most interactions at the centroid of the shop.
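
To make the clustering-then-sequencing idea concrete, the sketch below groups facilities by interaction strength and then arranges the clusters so that the most-interacting ones sit near the centre of the shop. It is only an illustrative Python sketch: the flow matrix, the hierarchical-clustering rule, and the centre-out arrangement are assumptions, not the authors' exact heuristic or mathematical program.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical flow (interaction) matrix between six facilities.
flow = np.array([
    [0, 8, 1, 0, 2, 0],
    [8, 0, 6, 1, 0, 0],
    [1, 6, 0, 5, 0, 1],
    [0, 1, 5, 0, 7, 2],
    [2, 0, 0, 7, 0, 9],
    [0, 0, 1, 2, 9, 0],
])

# Step 1 (clustering): facilities with strong interaction get a small dissimilarity,
# so hierarchical clustering groups them into the same vertical cluster.
dissim = flow.max() - flow
np.fill_diagonal(dissim, 0)
Z = linkage(dissim[np.triu_indices(len(flow), 1)], method="average")
clusters = fcluster(Z, t=3, criterion="maxclust")        # three vertical clusters

# Step 2 (sequencing): order clusters horizontally so that the clusters with the
# largest total interaction end up around the centroid of the shop.
totals = {c: flow[clusters == c].sum() for c in set(clusters)}
ascending = sorted(totals, key=totals.get)
sequence = ascending[0::2] + ascending[1::2][::-1]        # low at the ends, high in the middle
print("cluster of each facility:", clusters, "| horizontal cluster sequence:", sequence)
```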

Keywords: clustering-sequencing approach, mathematical modeling, optimization, unequal facility layout problem

Procedia PDF Downloads 334
14829 An Electrocardiography Deep Learning Model to Detect Atrial Fibrillation on Clinical Application

Authors: Jui-Chien Hsieh

Abstract:

Background: 12-lead electrocardiography (ECG) is one of the most frequently used tools in clinical practice to detect atrial fibrillation (AF), which can lead to life-threatening stroke. In this study, AF detection by the clinically used 12-lead ECG device had a positive predictive value (PPV) of only 0.73-0.77. Objective: There is great demand for a new algorithm to improve the precision of AF detection using 12-lead ECG. Building on progress in artificial intelligence (AI), we developed an ECG deep learning model that can recognize AF patterns and reduce false-positive errors. Methods: (1) 570 12-lead ECG reports whose computer interpretation by the ECG device was AF were collected as the training dataset. The ECG reports were interpreted by 2 senior cardiologists, who confirmed that the precision of AF detection by the ECG device was 0.73. (2) 88 12-lead ECG reports whose computer interpretation by the ECG device was AF were used as the test dataset. Cardiologists confirmed that 68 of the 88 reports were AF and the others were not; the precision of AF detection by the ECG device was about 0.77. (3) A parallel 4-layer 1-dimensional convolutional neural network (CNN) was developed to identify AF based on limb-lead and chest-lead ECGs. Results: In the 88 test samples, this model performed better on AF detection than the traditional computer interpretation of the ECG device, with a PPV of 0.94, sensitivity of 0.98, and specificity of 0.80. Conclusions: Compared with the clinical ECG device, this AI ECG model raises the precision of AF detection from 0.77 to 0.94 and can have an impact on clinical applications.
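
A minimal sketch of such a parallel two-branch, 4-layer 1-D CNN is shown below (PyTorch). Only the branch structure, with the limb leads and chest leads as separate inputs, follows the abstract; the filter counts, kernel sizes, pooling, and input length are assumptions chosen for illustration.

```python
import torch
import torch.nn as nn

class ParallelECGCNN(nn.Module):
    """Sketch of a parallel 4-layer 1-D CNN: one branch for the 6 limb leads,
    one for the 6 chest leads. Filter counts and kernel sizes are assumptions."""
    def __init__(self, n_classes=2):
        super().__init__()
        def branch(in_ch):
            layers, ch = [], in_ch
            for out_ch in (16, 32, 64, 128):          # 4 convolutional layers
                layers += [nn.Conv1d(ch, out_ch, kernel_size=7, padding=3),
                           nn.ReLU(), nn.MaxPool1d(2)]
                ch = out_ch
            return nn.Sequential(*layers)
        self.limb = branch(6)
        self.chest = branch(6)
        self.head = nn.Sequential(nn.AdaptiveAvgPool1d(1), nn.Flatten(),
                                  nn.Linear(256, n_classes))

    def forward(self, limb_ecg, chest_ecg):           # each input: (batch, 6, samples)
        feats = torch.cat([self.limb(limb_ecg), self.chest(chest_ecg)], dim=1)
        return self.head(feats)

model = ParallelECGCNN()
logits = model(torch.randn(4, 6, 5000), torch.randn(4, 6, 5000))  # e.g. 10 s at 500 Hz
```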

Keywords: 12-lead ECG, atrial fibrillation, deep learning, convolutional neural network

Procedia PDF Downloads 117
14828 Field-observed Thermal Fractures during Reinjection and Its Numerical Simulation

Authors: Wen Luo, Phil J. Vardon, Anne-Catherine Dieudonne

Abstract:

One key process that partly controls the success of geothermal projects is fluid reinjection, which helps in dealing with waste water, maintaining reservoir pressure, and supplying heat-exchange media. Thus, sustaining the injectivity is of great importance for the efficiency and sustainability of geothermal production. However, the injectivity is sensitive to the reinjection process. Field experience has shown that the injectivity can be damaged or improved. In this paper, the focus is on how the injectivity is improved. Since the injection pressure is far below the formation fracture pressure, hydraulic fracturing cannot be the mechanism contributing to the increase in injectivity. Instead, thermal stimulation has been identified as the main contributor to improving the injectivity. For low-enthalpy geothermal reservoirs, which are not fracture-controlled, thermal fracturing, rather than thermal shearing, is expected to be the mechanism that increases injectivity. In this paper, field data from sedimentary low-enthalpy geothermal reservoirs in the Netherlands were analysed to show the occurrence of thermal fracturing due to the cooling shock during reinjection. Injection data were collected and compared to show the effects of the thermal fractures on injectivity. Then, a thermo-hydro-mechanical (THM) model of the near-field formation was developed and solved by the finite element method to simulate the observed thermal fractures. It was then compared with the HM model, decomposed from the THM model, to isolate the thermal contribution to fracturing. Finally, the effects of operational parameters, i.e. injection temperature and pressure, on the changes in injectivity were studied on the basis of the THM model. The field data analysis and simulation results illustrate that thermal fracturing occurred during reinjection and contributed to the increase in injectivity. The injection temperature was identified as a key parameter contributing to thermal fracturing.

Keywords: injectivity, reinjection, thermal fracturing, thermo-hydro-mechanical model

Procedia PDF Downloads 220
14827 An Assessment into the Drift in Direction of International Migration of Labor: Changing Aspirations for Religiosity and Cultural Assimilation

Authors: Syed Toqueer Akhter, Rabia Zulfiqar

Abstract:

This paper attempts to trace the determining factors, as far as individual preferences and expectations are concerned, that cause the direction of international migration to drift owing to factors such as religiosity and cultural assimilation. The narrative on migration has graduated from the age-old ‘push/pull’ debate to that of complex factors that may vary across individuals. We explore the long-standing factor of religiosity, widely acknowledged in the literature as a key variable in the assessment of migration, analysing how religiosity shifts the intent of migration. A more conventional factor, cultural assimilation, is used in a contemporary way to estimate its role in the drift in direction. In particular, our research aims to isolate the effect that our key variables, cultural assimilation and religiosity, have on the direction of migration, to explore how they interplay as a composite unit, and to justify the change in behavior displayed by these key variables. In order to establish a true sense of what drives individual choices, we employ survey research and use a questionnaire to conduct primary research. The questionnaire was divided into six sections covering subjects including household characteristics and the perceptions and inclinations of the respondents relevant to our study. Religiosity was quantified using a proxy of migration networks that utilized secondary data to estimate religious hubs in recipient countries. To estimate the relationship between the intent of migration and its variants, three competing econometric models, namely the ordered probit model, the ordered logit model, and the Tobit model, were employed. For every model that included our key variables, a highly significant relationship with the intent of migration was estimated.
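
As an illustration of how the ordered probit specification might be estimated, here is a minimal Python sketch using statsmodels' OrderedModel. The variable names, the coding of the intent scale, and the tiny illustrative dataset are assumptions, not the study's survey data.

```python
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

# Hypothetical survey data: intent of migration coded as an ordered category
# (0 = no intent, 1 = undecided, 2 = strong intent), plus the two key regressors.
df = pd.DataFrame({
    "intent":       [0, 1, 2, 2, 1, 0, 2, 1, 0, 2],
    "religiosity":  [0.2, 0.5, 0.9, 0.8, 0.4, 0.1, 0.7, 0.6, 0.3, 0.9],
    "assimilation": [0.7, 0.5, 0.2, 0.3, 0.6, 0.8, 0.4, 0.5, 0.7, 0.1],
})

intent = df["intent"].astype(pd.CategoricalDtype(categories=[0, 1, 2], ordered=True))
model = OrderedModel(intent, df[["religiosity", "assimilation"]], distr="probit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())
```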

Keywords: international migration, drift in direction, cultural assimilation, religiosity, ordered probit model

Procedia PDF Downloads 310
14826 Dynamic Determination of Spare Engine Requirements for Air Fighters Integrating Feedback of Operational Information

Authors: Tae Bo Jeon

Abstract:

The Korean Air Force is undertaking a large project to replace hundreds of aging fighters such as the F-4, F-5, and KF-16. The task is to develop and produce domestic fighters, each equipped with two complete engines. A large number of engines, however, will be purchased as products from a foreign engine maker. In addition to the fighters themselves, securing the proper number of spare engines plays a significant role in maintaining combat readiness and in effectively managing the national defense budget, given their high cost. In this paper, we present a model that dynamically updates spare engine requirements. Currently, the military administration purchases all the fighters, engines, and spare engines at the acquisition stage and has no additional procurement processes during the 30-40 year life cycle. Assuming that a procurement procedure during the operational stage is established, our model starts from an initial estimate of spare engine requirements based on limited information. The model then performs military missions and repair/maintenance work when necessary. During operation, detailed field information (aircraft repair and test, engine repair, planned maintenance, administration time, the transportation pipeline between base, field, and depot, etc.) should be considered for actual engine requirements. At the end of each year, the performance measure is recorded; the system proceeds to the next year when the measure exceeds the set threshold, and otherwise additional engine(s) are bought and added to the current system. We repeat the process over the life cycle period and compare the results. The proposed model generates far better results by adding spare engines appropriately, thus avoiding possible undesirable situations. Our model may well be applied to future air force operations.
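
The yearly feedback loop described above (record a performance measure, then purchase an additional spare only when the measure falls below the threshold) can be sketched as a toy simulation. Everything in the sketch, including the removal rate, repair time, availability threshold, and fleet size, is an illustrative assumption rather than the paper's actual parameters or performance measure.

```python
import random

def simulate_fleet(years=30, fighters=100, spares=8,
                   annual_removal_rate=0.3, repair_time_years=0.5,
                   availability_threshold=0.95, seed=1):
    """Toy yearly loop: the expected number of engines in the repair pipeline follows
    Little's law (removals per year x repair time); shortages ground aircraft.
    If availability falls below the threshold, one spare is purchased."""
    random.seed(seed)
    purchase_years = []
    for year in range(1, years + 1):
        removals = sum(random.random() < annual_removal_rate for _ in range(fighters))
        in_pipeline = removals * repair_time_years          # average engines away for repair
        grounded = max(0.0, in_pipeline - spares)           # demand not covered by spares
        availability = 1.0 - grounded / fighters
        if availability < availability_threshold:
            spares += 1                                     # feedback: buy one more spare engine
            purchase_years.append(year)
    return spares, purchase_years

print(simulate_fleet())
```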

Keywords: DMSMS, operational availability, METRIC, PRS

Procedia PDF Downloads 176
14825 Finite Element Analysis of the Blanking and Stamping Processes of Nuclear Fuel Spacer Grids

Authors: Rafael Oliveira Santos, Luciano Pessanha Moreira, Marcelo Costa Cardoso

Abstract:

The spacer grid assembly supporting the nuclear fuel rods is an important concern in the design of structural components of a Pressurized Water Reactor (PWR). The spacer grid is composed of springs and dimples, which are formed from a strip sheet by means of blanking and stamping processes. In this paper, the blanking process and tooling parameters are evaluated by means of a 2D plane-strain finite element model in order to assess the punch load and the quality of the sheared edges of Inconel 718 strips used for nuclear spacer grids. A 3D finite element model is also proposed to predict the tooling loads resulting from the stamping process of a preformed Inconel 718 strip and to analyse the residual stress effects upon the spring and dimple design geometries of a nuclear spacer grid.

Keywords: blanking process, damage model, finite element modelling, inconel 718, spacer grids, stamping process

Procedia PDF Downloads 350
14824 Short-Term Energy Efficiency Decay and Risk Analysis of Ground Source Heat Pump System

Authors: Tu Shuyang, Zhang Xu, Zhou Xiang

Abstract:

The objective of this paper is to investigate the effect of the short-term heat exchange decay of the ground heat exchanger (GHE) on the energy efficiency and capacity of the ground source heat pump (GSHP). A resistance-capacitance (RC) model was developed and adopted to simulate the transient characteristics of the ground thermal condition and heat exchange. The capacity change of the GSHP was linked to the inlet and outlet water temperature by polynomial fitting according to measured parameters provided by heat pump manufacturers. Thus, the model, which combined the heat exchange decay with the capacity change, reflected the energy efficiency decay of the whole system. A GSHP system case was analyzed with the model, and the result showed that there is a risk that the GSHP might not meet the load demand because of efficiency decay during short-term operation. The conclusion provides guidance for GSHP system design to overcome this risk.
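
A minimal sketch of a lumped RC model of this kind is given below: a single ground capacitance connected through a thermal resistance to the far-field ground temperature, stepped forward with explicit Euler. The parameter values and the constant heat-rejection rate are assumptions chosen only to show the short-term borehole-wall temperature drift that degrades capacity; they are not the paper's calibrated values.

```python
# Minimal lumped resistance-capacitance (RC) sketch of the ground around a GHE.
# R: ground thermal resistance per metre [m.K/W], C: capacitance per metre [J/(m.K)],
# q: heat rejected to the ground per metre of borehole [W/m].  Values are assumptions.
R, C = 0.12, 2.0e6
T_far = 16.0                       # undisturbed ground temperature [degC]
dt, hours = 60.0, 72               # 1-minute steps over a 3-day (short-term) run
q = 40.0                           # constant cooling-mode heat rejection

T_wall = T_far
history = []
for step in range(int(hours * 3600 / dt)):
    # explicit Euler step of:  C dT/dt = (T_far - T_wall) / R + q
    T_wall += dt / C * ((T_far - T_wall) / R + q)
    history.append(T_wall)

print(f"borehole wall temperature after {hours} h: {history[-1]:.2f} degC")
```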

Keywords: capacity, energy efficiency, GSHP, heat exchange

Procedia PDF Downloads 353
14823 Financial Regulation and the Twin Peaks Model in a Developing and Developed Country Contexts: An Institutional Theory Perspective

Authors: Pumela Msweli, Dexter L. Ryneveldt

Abstract:

This paper seeks to shed light on the institutional logics and institutionalisation processes that influence the successful implementation of financial sector regulations. We use the neo-institutional theory lens to interrogate how the newly promulgated Financial Sector Regulation Act (FSRA) provides for the institutionalisation of the Twin Peaks Model. With the enactment of the FSRA, previous financial regulatory institutions were dismantled and new financial regulators established. In particular, the Financial Sector Conduct Authority (FSCA) replaced the Financial Services Board (FSB), and the Prudential Authority (PA) was established. The FSRA is layered with complexities that make it mandatory for the new regulators to co-exist, cooperate, and collaborate with other institutions to fulfill the FSRA’s overall financial stability objective. We use content analysis of the financial regulations that established the Twin Peaks Model (TPM) in South Africa and in the Netherlands to map out the three-stage institutionalisation process: (1) habitualisation, (2) objectification, and (3) sedimentation. This allowed for a comparison of how South Africa, as a developing country, and the Netherlands, as a developed country, have institutionalized the Twin Peaks Model. We provide valuable insights into how differences in the institutional and societal logics of developing and developed contexts shape the institutionalisation of financial regulations.

Keywords: financial industry, financial regulation, financial stability, institutionalisation, habitualization, objectification, sedimentation, twin peaks model

Procedia PDF Downloads 163
14822 Biomechanical Study of a Type II Superior Labral Anterior to Posterior Lesion in the Glenohumeral Joint Using Finite Element Analysis

Authors: Javier A. Maldonado E., Duvert A. Puentes T., Diego F. Villegas B.

Abstract:

The SLAP lesion (Superior Labral Anterior to Posterior) involves the labrum, causing pain and mobility problems in the glenohumeral joint. This injury is common in athletes practicing sports that require throwing or those who receive traumatic impacts to the shoulder area. This paper determines the biomechanical behavior of the soft tissues of the glenohumeral joint when a type II SLAP lesion is present. This pathology is characterized by a tear in the superior labrum, which is simulated in a 3D model of the shoulder joint. A 3D model of the glenohumeral joint was obtained using the free software Slice. Then, a finite element analysis was done using general-purpose software to simulate a compression test with external rotation. First, the model was validated against a previous study assuming a healthy shoulder joint. Once the initial model was validated, a lesion of the labrum was built using CAD software, and the same test was run again. The results obtained were the stress and strain distributions of the synovial capsule and the injured labrum. ANOVA was done for the healthy and injured glenohumeral joints, finding significant differences between them. This study will help orthopedic surgeons understand the biomechanics involved in this type of lesion and in the other surrounding structures affected by loading the injured joint.

Keywords: biomechanics, computational model, finite elements, glenohumeral joint, superior labral anterior to posterior lesion

Procedia PDF Downloads 211
14821 Evaluation of Green Infrastructure with Different Woody Plants Practice and Benefit Using the Stormwater Management-HYDRUS Model

Authors: Bei Zhang, Zhaoxin Zhang, Lidong Zhao

Abstract:

Green infrastructures (GIs) for rainwater management can directly serve the multiple purposes of urban greening and non-point source pollution control. To reveal the overall layout rules of GIs dominated by typical woody plants and their impact on the urban environment, we constructed a coupled HYDRUS-1D and Stormwater Management Model (SWMM) to simulate the response of urban hydrology to typical planting methods of woody plants with different root types. The results showed that the coupled model had high adaptability for simulating the urban surface runoff control effect under different woody plant planting methods (NSE ≥ 0.64 and R² ≥ 0.71). Regarding the regulation of surface runoff, the average runoff reduction rate of the GIs increased from 60% to 71% as the planting area increased from 5% to 25% under the design rainfall event with a 2-year recurrence interval. The reduction for Sophora japonica, with tap roots, was slightly higher than that of the no-plant control and of Malus baccata (M. baccata), with fibrous roots. A comprehensive benefit evaluation system for rainwater utilization technology was constructed using an analytic hierarchy process. The coupled model was used to evaluate the comprehensive environmental, economic, and social benefits of woody plants with different planting areas in the study area. The comprehensive benefit value of planting 15% M. baccata was the highest, making it the first choice for woody planting in the study area. This study can provide a scientific basis for decision-making on the green facility layout of woody plants.
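
For reference, the Nash-Sutcliffe efficiency (NSE) used above to judge the coupled model can be computed as in the short sketch below; the runoff values shown are placeholders, not the study's data.

```python
import numpy as np

def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    observed, simulated = np.asarray(observed, float), np.asarray(simulated, float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

# Hypothetical observed vs. simulated runoff depths (mm) for one rainfall event.
obs = [0.0, 1.2, 3.4, 5.1, 4.0, 2.2, 0.8]
sim = [0.1, 1.0, 3.0, 5.5, 3.8, 2.5, 0.6]
print(f"NSE = {nse(obs, sim):.2f}, R2 = {np.corrcoef(obs, sim)[0, 1] ** 2:.2f}")
```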

Keywords: green infrastructure, comprehensive benefits, runoff regulation, woody plant layout, coupling model

Procedia PDF Downloads 72
14820 Estimation of Functional Response Model by Supervised Functional Principal Component Analysis

Authors: Hyon I. Paek, Sang Rim Kim, Hyon A. Ryu

Abstract:

In functional linear regression, one typical problem is dimension reduction. Compared with multivariate linear regression, functional linear regression is regarded as an infinite-dimensional case, and the main task is to reduce the dimensions of the functional response and the functional predictors. One common approach is to apply functional principal component analysis (FPCA) to the functional predictors and then use a few leading functional principal components (FPCs) to fit the functional model. The leading FPCs estimated by typical FPCA explain a major portion of the variation of the functional predictor, but they may not be the components most correlated with the functional response, so they may not be significant for predicting the response. In this paper, we propose a supervised functional principal component analysis method for a functional response model in which the FPCs are obtained by taking the correlation with the functional response into account. Our method is expected to have better prediction accuracy than the typical FPCA method.
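
The selection idea can be illustrated with a toy sketch: compute candidate FPC scores of the predictor curves, then keep the components that correlate most with the response scores instead of those that merely explain the most predictor variance. The sketch below uses ordinary PCA on densely sampled curves as a stand-in for FPCA; the synthetic data and the number of retained components are assumptions, not the authors' estimator.

```python
import numpy as np
from sklearn.decomposition import PCA

# X: functional predictors sampled on a grid, Y: functional responses on a grid.
rng = np.random.default_rng(0)
n, grid = 146, 50
X = rng.normal(size=(n, grid)).cumsum(axis=1)          # rough stand-in for smooth curves
Y = 0.8 * X[:, ::-1] + rng.normal(scale=2.0, size=(n, grid))

x_pca, y_pca = PCA(n_components=10).fit(X), PCA(n_components=1).fit(Y)
x_scores = x_pca.transform(X)                           # candidate FPC scores of the predictor
y_score = y_pca.transform(Y)[:, 0]                      # leading score of the response

# "Supervised" step: rank predictor components by correlation with the response score.
corr = [abs(np.corrcoef(x_scores[:, k], y_score)[0, 1]) for k in range(x_scores.shape[1])]
selected = np.argsort(corr)[::-1][:3]                   # keep the 3 most response-correlated FPCs
print("selected FPC indices:", selected, "| correlations:", np.round(np.sort(corr)[::-1][:3], 2))
```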

Keywords: supervised, functional principal component analysis, functional response, functional linear regression

Procedia PDF Downloads 82
14819 Analyzing Damage of the Cutting Tools out of Carbide Metallic during the Turning of a Soaked and Not Hardened Steel XC38

Authors: Mohamed Seghouani, Ahmed Tafraoui, Soltane Lebaili

Abstract:

The purpose of this study is to widen knowledge on the use of metal-carbide cutting tools and to define the influence of the cutting parameters on the behavior of these tools during the machining of treated and untreated XC38 steel. This work aims to determine experimentally the evolution of the wear of a metal-carbide cutting tool with a P30-grade insert during a straight-turning (slide-lathing) operation on quenched and non-hardened XC38 steel test specimens. The research is based on Taylor's model to determine the life span of the cutting tool according to the various cutting parameters, such as the cutting speed Vc, the feed a, and the depth of cut p, in order to express the operational limits of the tool for this operation in a preventive way. The model makes it possible to determine when the tool should be changed and to regard this as a constraint for meeting the workpiece roughness requirements during series production in conventional machining.
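
For reference, Taylor's tool life model referred to above is commonly written as

$$ V_c \, T^{\,n} = C, \qquad \text{or, in its generalized form,} \qquad V_c \, T^{\,n} \, a^{\,x} \, p^{\,y} = C, $$

where $V_c$ is the cutting speed, $T$ the tool life, $a$ the feed, $p$ the depth of cut, and $n$, $x$, $y$, $C$ empirical constants identified for a given tool-workpiece pair. The exponents and constant for the P30 insert on XC38 steel are determined from the experiments and are not reproduced here.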

Keywords: machining, wear, lifespan, Taylor's model, cutting tool, metal carbide

Procedia PDF Downloads 395
14818 SISSLE in Consensus-Based Ripple: Some Improvements in Speed, Security, Last Mile Connectivity and Ease of Use

Authors: Mayank Mundhra, Chester Rebeiro

Abstract:

Cryptocurrencies are rapidly finding wide application in areas such as real-time gross settlement and payment systems. Ripple is a cryptocurrency that has gained prominence with banks and payment providers. It solves the Byzantine Generals' Problem with its Ripple Protocol Consensus Algorithm (RPCA), in which each server maintains a list of servers, called the Unique Node List (UNL), that represents the network for that server and will not collectively defraud it. The server believes that the network has come to a consensus when the members of its UNL come to a consensus on a transaction. In this paper, we improve Ripple to achieve better speed, security, last mile connectivity, and ease of use. We implement guidelines and automated systems for building and maintaining UNLs for resilience, robustness, improved security, and efficient information propagation. We enhance the system to ensure that each server receives information from across the whole network rather than just from its UNL members. We also introduce the paradigm of UNL overlap as a function of information propagation and the trust a server assigns to its own UNL. Our design not only reduces vulnerabilities such as eclipse attacks, but also makes it easier to identify malicious behaviour and entities attempting to fraudulently double spend or stall the system. We provide experimental evidence of the benefits of our approach over the current Ripple scheme. We observe a speedup of ≥ 4.97x and a 98.22x improvement in success rate for information propagation, and a speedup of ≥ 3.16x and a 51.70x improvement in success rate for consensus.

Keywords: Ripple, Kelips, unique node list, consensus, information propagation

Procedia PDF Downloads 153
14817 A Statistical Energy Analysis Model of an Automobile for the Prediction of the Internal Sound Pressure Level

Authors: El Korchi Ayoub, Cherif Raef

Abstract:

Interior noise in vehicles is an essential factor affecting occupant comfort. Over recent decades, much work has been done to develop simulation tools for vehicle NVH. In the medium-to-high frequency range, the statistical energy analysis (SEA) method shows significant effectiveness in predicting the noise and vibration responses of mechanical systems. In this paper, the evaluation of the sound pressure level (SPL) inside an automobile cabin has been performed numerically using the SEA method. A test was performed in a car cabin using a monopole source as the sound source. The decay rate method was employed to obtain the damping loss factor (DLF) of each subsystem of the developed SEA model. These parameters were then used to predict the sound pressure level in the interior cabin. The results show satisfactory agreement with the directly measured SPL. The developed SEA vehicle model can be used in early design phases and allows the engineer to identify the sources contributing to the total noise and the transmission paths.
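
For reference, the decay rate method estimates the damping loss factor of a subsystem from its measured energy decay; a commonly used form of the relation is

$$ \eta = \frac{2.2}{f \, T_{60}} = \frac{DR}{27.3 \, f}, $$

where $f$ is the band centre frequency, $T_{60}$ the time for the response to decay by 60 dB, and $DR$ the decay rate in dB/s. The measured DLF values for the specific cabin subsystems are not reproduced here.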

Keywords: SEA, SPL, DLF, NVH

Procedia PDF Downloads 95
14816 Wildland Fire in Terai Arc Landscape of Lesser Himalayas Threatening the Tiger Habitat

Authors: Amit Kumar Verma

Abstract:

The present study deals with a fire prediction model for the Terai Arc Landscape (TAL), one of the most dramatic ecosystems in Asia, where large, wide-ranging species such as tigers, rhinos, and elephants thrive while bringing economic benefits to the local people. Forest fires cause huge economic and ecological losses, release considerable quantities of carbon into the air, and are an important factor inflating the global burden of carbon emissions. Forest fire is also an important factor in the behavioral and ecological habits of tigers in the wild. Post-fire changes in micro- and macro-habitat directly affect tiger habitat. Fire vulnerability is reflected in changes in the microhabitat (humus, soil profile, litter, vegetation, grassland ecosystem). Organisms such as spiders, annelids, other arthropods, and favorable microorganisms are directly affected by forest fire, and indirectly all of these organisms contribute to the development of tiger (Panthera tigris) habitat. On the other hand, fire depletes prey species and pushes tigers from the wild into human-dominated areas, which may lead to conflict that is dangerous for both tigers and human beings. Early forest fire prediction through mapping of risk zones can help minimize fire frequency and manage forest fires, thereby minimizing losses. Satellite data play a vital role in identifying and mapping forest fires and recording the frequency with which different vegetation types are affected. Thematic hazard maps were generated using the inverse distance weighting (IDW) technique. A prediction model for fire occurrence was developed for TAL. Fire occurrence records were collected from the state forest department from 2000 to 2014. Discriminant function models were used to develop a prediction model for forest fires in TAL, and random points for non-occurrence of fire were generated. Based on the attributes of the points of occurrence and non-occurrence, the model predicts fire occurrence. The map of predicted probabilities classified the study area into five classes, very high (12.94%), high (23.63%), moderate (25.87%), low (27.46%), and no fire (10.1%), based upon the intensity of the hazard. The model is able to classify 78.73 percent of the points correctly and hence can be used for this purpose with confidence; overall, the model also works correctly for almost 69% of the points. This study exemplifies the usefulness of a forest fire prediction model and offers a more effective way to manage forest fires. Overall, the study presents a model for the conservation of the tiger's natural habitat and the forest, which is beneficial for both wildlife and human beings in the future.
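
A minimal sketch of a discriminant-function classifier of this kind, fitted on fire-occurrence and random non-occurrence points, is shown below using scikit-learn. The predictor attributes and the synthetic labels are assumptions for illustration; the study's actual GIS-derived variables and records are not reproduced.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

# Hypothetical point attributes (e.g. distance to road, distance to settlement,
# NDVI, slope); the real predictor set is defined by the study's GIS layers.
rng = np.random.default_rng(7)
n = 400
X = np.column_stack([rng.uniform(0, 5, n), rng.uniform(0, 10, n),
                     rng.uniform(0.1, 0.9, n), rng.uniform(0, 35, n)])
y = (X[:, 0] < 2.0) & (X[:, 2] > 0.4)            # synthetic fire / non-fire labels
y = (y ^ (rng.random(n) < 0.15)).astype(int)     # add some label noise

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
lda = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
prob_fire = lda.predict_proba(X_te)[:, 1]        # probabilities to bin into hazard classes
print(f"overall accuracy: {lda.score(X_te, y_te):.2%}")
```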

Keywords: fire prediction model, forest fire hazard, GIS, landsat, MODIS, TAL

Procedia PDF Downloads 355
14815 High Fidelity Interactive Video Segmentation Using Tensor Decomposition, Boundary Loss, Convolutional Tessellations, and Context-Aware Skip Connections

Authors: Anthony D. Rhodes, Manan Goel

Abstract:

We provide a high fidelity deep learning algorithm (HyperSeg) for interactive video segmentation tasks using a dense convolutional network with context-aware skip connections and compressed, 'hypercolumn' image features combined with a convolutional tessellation procedure. In order to maintain high output fidelity, our model crucially processes and renders all image features in high resolution, without utilizing downsampling or pooling procedures. We maintain this consistent, high grade fidelity efficiently in our model chiefly through two means: (1) we use a statistically-principled, tensor decomposition procedure to modulate the number of hypercolumn features and (2) we render these features in their native resolution using a convolutional tessellation technique. For improved pixel-level segmentation results, we introduce a boundary loss function; for improved temporal coherence in video data, we include temporal image information in our model. Through experiments, we demonstrate the improved accuracy of our model against baseline models for interactive segmentation tasks using high resolution video data. We also introduce a benchmark video segmentation dataset, the VFX Segmentation Dataset, which contains over 27,046 high resolution video frames, including green screen and various composited scenes with corresponding, hand-crafted, pixel-level segmentations. Our work improves state-of-the-art segmentation fidelity with high resolution data and can be used across a broad range of application domains, including VFX pipelines and medical imaging disciplines.

Keywords: computer vision, object segmentation, interactive segmentation, model compression

Procedia PDF Downloads 123
14814 Risk Factors for Defective Autoparts Products Using Bayesian Method in Poisson Generalized Linear Mixed Model

Authors: Pitsanu Tongkhow, Pichet Jiraprasertwong

Abstract:

This research investigates risk factors for defective products in autoparts factories. Under a Bayesian framework, a generalized linear mixed model (GLMM) in which the dependent variable, the number of defective products, has a Poisson distribution is adopted. Its performance is compared with that of the Poisson GLM under a Bayesian framework. The factors considered are the production process, machines, and workers. The products coded RT50 are observed. The study found that the Poisson GLMM is more appropriate than the Poisson GLM. For the production process factor, the highest risk of producing defective products is Process 1; for the machine factor, the highest risk is Machine 5; and for the worker factor, the highest risk is Worker 6.
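
A minimal sketch of such a Bayesian Poisson GLMM, with random effects for process, machine, and worker, might look as follows in PyMC. The synthetic counts, factor sizes, and priors are assumptions for illustration only; they are not the RT50 data or the study's exact model specification.

```python
import numpy as np
import pymc as pm

# Hypothetical observation layout: defect counts per record with process, machine,
# and worker indices.
rng = np.random.default_rng(3)
n = 200
process = rng.integers(0, 3, n)    # 3 processes
machine = rng.integers(0, 6, n)    # 6 machines
worker  = rng.integers(0, 8, n)    # 8 workers
defects = rng.poisson(2.0, n)

with pm.Model() as glmm:
    intercept = pm.Normal("intercept", 0.0, 2.0)
    # one hierarchical (random-effect) standard deviation per factor
    sd_p, sd_m, sd_w = (pm.HalfNormal(f"sd_{k}", 1.0) for k in ("process", "machine", "worker"))
    b_p = pm.Normal("b_process", 0.0, sd_p, shape=3)
    b_m = pm.Normal("b_machine", 0.0, sd_m, shape=6)
    b_w = pm.Normal("b_worker", 0.0, sd_w, shape=8)
    log_rate = intercept + b_p[process] + b_m[machine] + b_w[worker]
    pm.Poisson("defects", mu=pm.math.exp(log_rate), observed=defects)
    idata = pm.sample(1000, tune=1000, chains=2, target_accept=0.9)
```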

Keywords: defective autoparts products, Bayesian framework, generalized linear mixed model (GLMM), risk factors

Procedia PDF Downloads 572
14813 Innovative Approaches to Formal Education: Effect of Online Cooperative Learning Embedded Blended Learning on Student's Academic Achievement and Attitude

Authors: Mohsin Javed

Abstract:

The school education department is often criticized for utilizing fewer academic days than planned due to reasons such as extreme weather conditions, sudden holidays, summer vacations, pandemics, and terrorism. The purpose of this experimental study was to determine the efficacy of online cooperative learning (OCL) integrated into the rotation model of blended learning. The effects on students' academic achievement and students' attitudes toward OCL-embedded learning were assessed. Using a posttest-only control group design, sixty-two first-year students were randomly allocated to either the experimental (30) or the control (32) group. The control group received face-to-face classes for six sessions per week, while the experimental group had three OCL and three formal sessions per week under the rotation model. Students' perceptions of OCL were evaluated using a survey questionnaire. Data were analyzed with an independent-sample t-test and a one-sample t-test. According to the findings, the intervention greatly improved the state of the dependent variables. The results demonstrate that OCL can be successfully implemented in formal education using a blended learning rotation approach. Higher secondary institutions are advised to use this model in situations such as COVID-19, smog, unexpected holidays, instructor absence due to increased responsibilities, and summer vacations.
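
The two analyses named above can be reproduced with a few lines of Python; the scores below are placeholders, not the study's data, and the exact test settings (e.g. equal-variance assumption, scale midpoint) are assumptions.

```python
from scipy import stats

# Hypothetical post-test achievement scores (out of 100) for the two groups.
experimental = [78, 85, 81, 90, 74, 88, 83, 79, 92, 86, 80, 84, 77, 89, 82]
control      = [70, 75, 68, 81, 66, 77, 72, 74, 69, 73, 76, 71, 67, 78, 70]

# Independent-sample t-test (Welch's variant shown; the study's settings may differ).
t, p = stats.ttest_ind(experimental, control, equal_var=False)
print(f"t = {t:.2f}, p = {p:.4f}")

# One-sample t-test of attitude ratings against an assumed scale midpoint of 3 (1-5 scale).
attitude = [4.1, 3.8, 4.5, 3.9, 4.2, 4.4, 3.7, 4.0]
print(stats.ttest_1samp(attitude, popmean=3.0))
```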

Keywords: blended learning, online cooperative learning, rotation model of blended learning, supplementing

Procedia PDF Downloads 62
14812 Introduction to Techno-Sectoral Innovation System Modeling and Functions Formulating

Authors: S. M. Azad, H. Ghodsi Pour, F. Roshannafasa

Abstract:

In recent years, ‘technology management and policymaking’ has become one of the most important problems in management science. In this field, different generations of innovation and technology management have been presented, the earliest of which is the Innovation System (IS) approach. In a general classification, innovation systems are divided into four approaches: technical, sectoral, regional, and national. There is much research on each of these approaches in different academic fields. Every approach has some benefits, and if two or more approaches are hybridized, their benefits are combined. In addition, according to the sectoral structure of the governance model in Iran, in many sectors such as information technology, the combination of the three other approaches with the sectoral approach is essential. Hence, in this paper, combining two IS approaches (technical and sectoral) and using system dynamics, a generic model is presented for a sample of the software industry. As a complementary point, this article introduces a new hybrid approach called the Techno-Sectoral Innovation System. This TSIS model is accomplished by adapting the concept of ‘functions’ from the technological IS literature and using them in the sectoral system as measurable indicators.

Keywords: innovation system, technology, techno-sectoral system, functional indicators, system dynamics

Procedia PDF Downloads 445
14811 Study of Heat Conduction in Multicore Chips

Authors: K. N. Seetharamu, Naveen Teggi, Kiranakumar Dhavalagi, Narayana Kamath

Abstract:

A method of temperature calculation is developed to study the conditions leading to hot spot occurrence on multicore chips. A physical model which has the salient features of multicore chips is incorporated for the analysis. The model consists of active and background cells laid out in a checkered pattern, and this pattern repeats itself within each fine-grain active cell. The die has three layers, i) body, ii) buried oxide layer, and iii) wiring layer, stacked one above the other, with the heat source placed at the interface between the wiring and buried oxide layers. With this model, we propose an analytical method to calculate the target hotspot temperature, the heat flow to the top and bottom layers of the die, and the thermal resistance components at each granularity level, assuming appropriate values of the die dimensions and parameters. Finally, we attempt to find an easier method for the calculation of the target hotspot temperature using a graph.
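
A small sketch of the analytical thermal-resistance calculation is given below: each layer contributes R = t / (k·A), the heat generated at the oxide/wiring interface splits between the upward and downward paths like parallel resistors (assuming both ends sit at the reference temperature), and the hotspot rise is q·R. The layer thicknesses, conductivities, cell size, and power are assumptions, not the paper's die parameters.

```python
# Series thermal-resistance sketch for the three-layer die (all values are assumptions).
layers_down = [("buried oxide", 1.0e-6, 1.4), ("body / silicon", 500e-6, 148.0)]  # (name, t [m], k [W/m.K])
layers_up   = [("wiring layer", 10e-6, 2.0)]

cell_area = (50e-6) ** 2          # fine-grain active cell, 50 um x 50 um
q_cell = 0.02                     # 20 mW dissipated at the oxide/wiring interface
T_ref = 45.0                      # assumed reference temperature at both die faces [degC]

def path_resistance(layers, area):
    # R = thickness / (conductivity * area), summed over the layers in the path
    return sum(t / (k * area) for _, t, k in layers)

R_down = path_resistance(layers_down, cell_area)
R_up = path_resistance(layers_up, cell_area)
R_total = 1.0 / (1.0 / R_down + 1.0 / R_up)     # the two paths act as parallel resistors
print(f"R_down = {R_down:.0f} K/W, R_up = {R_up:.0f} K/W, "
      f"hotspot temperature = {T_ref + q_cell * R_total:.1f} degC")
```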

Keywords: checkered pattern, granularity level, heat conduction, multicore chips, target hotspot temperature

Procedia PDF Downloads 475
14810 Numerical and Experimental Study on Bed-Wall Heat Transfer in Conical Fluidized Bed Combustor

Authors: Ik–Tae Im, H. M. Abdelmotalib, M. A. Youssef, S. B. Young

Abstract:

In this study, the flow characteristics and bed-to-wall heat transfer in a gas-solid conical fluidized bed combustor were investigated using both experimental and numerical methods. The computational fluid dynamics (CFD) simulations were carried out using the commercial software Fluent V6.3. A two-fluid Eulerian-Eulerian model was applied in order to simulate the gas-solid flow and heat transfer in a conical sand-air bed with a 30° cone angle and a 22 cm static bed height. The effects on flow and bed-to-wall heat transfer of the fluidizing number, varying in the range of 1.5-2.3, of the drag model (namely Syamlal-O’Brien and Gidaspow), and of the friction viscosity were analyzed. Both the bed pressure drop and the heat transfer coefficient increased with increasing inlet gas velocity. The Gidaspow drag model showed better agreement with the experimental results than the other drag model. The friction viscosity had no clear effect on either the hydrodynamics or the heat transfer.

Keywords: computational fluid dynamics, heat transfer coefficient, hydrodynamics, renewable energy

Procedia PDF Downloads 420
14809 Creativity as a National System: An Exploratory Model towards Enhance Innovation Ecosystems

Authors: Oscar Javier Montiel Mendez

Abstract:

The link between knowledge, creativity, innovation, and entrepreneurship is well established, and the importance of national innovation systems (NIS) is broadly emphasized as an approach stressing that the flow of information and technology among people, organizations, and institutions is key to the innovation process. Understanding the linkages among the actors involved in innovation is relevant to NIS. Creativity is supposed to fuel NIS, but it is mainly addressed at the personal, group, or organizational level, leaving aside the fourth one, the national system. It is suggested that NIS takes creativity for granted, as an ex-ante stage already solved through mechanisms such as programs for nurturing it at elementary and secondary schools and universities, or specific public/organizational programs. Or worse, it is assumed that the individual already has this competence and that the elements of the NIS will communicate with each other in a way that will lead to the creation of S-curves, with an impact on national systems and programs for entrepreneurship, clusters, and the economy. But creativity constantly appears at any time during NIS, being the key input. Based on an initial, exploratory, focused, and refined literature review drawing on Csikszentmihalyi’s systemic model, Amabile's componential theory, Kaufman and Beghetto’s 4C model, and the OECD’s (Organisation for Economic Co-operation and Development) expanded NIS model, a theoretical NCS model is elaborated. It is suggested that its implementation could become a significant factor in strengthening local, regional, and national economies. The results also suggest that establishing a national creativity system (NCS), something that appears not to have been previously addressed, as a strategic and vital companion to the NIS, installing it not only as a national education strategy but as its foundation, and managing it and measuring its impact on the NIS, entrepreneurship, and the rest of the ecosystem, could make public policies more effective. Likewise, it should have a beneficial impact on the efforts of all the stakeholders involved and should help prevent some of the possible failures that NIS present.

Keywords: national creativity system, national innovation system, entrepreneurship ecosystem, systemic creativity

Procedia PDF Downloads 439
14808 Numerical Simulation of Convective Flow of Nanofluids with an Oriented Magnetic Field in a Half Circular-Annulus

Authors: M. J. Uddin, M. M. Rahman

Abstract:

The unsteady convective heat transfer flow of nanofluids in a half circular-annulus shaped enclosure has been investigated numerically using a nonhomogeneous dynamic model. The round upper wall of the enclosure is maintained at a constant low temperature, whereas the bottom wall is heated by three different thermal conditions. The enclosure is permeated by a uniform magnetic field of variable orientation. The Brownian motion and thermophoresis of the nanoparticles are taken into account in the model construction. The governing nonlinear momentum, energy, and concentration equations are solved numerically using the Galerkin weighted residual finite element method. To identify the best performer, the average Nusselt number is presented for different types of nanofluids. The heat transfer rate is also exhibited for different flow parameters, positions of the annulus, thicknesses of the half circular-annulus, and thermal conditions.

Keywords: nanofluid, convection, semicircular-annulus, nonhomogeneous dynamic model, finite element method

Procedia PDF Downloads 224
14807 Methodology for Temporary Analysis of Production and Logistic Systems on the Basis of Distance Data

Authors: M. Mueller, M. Kuehn, M. Voelker

Abstract:

In small and medium-sized enterprises (SMEs), the challenge is to create a well-grounded and reliable basis for process analysis, optimization, and planning, due to a lack of data. SMEs have limited access to methods with which they can effectively and efficiently analyse processes and identify cause-and-effect relationships in order to generate the necessary database and derive optimization potential from it. The implementation of digitalization within the framework of Industry 4.0 thus becomes a particular necessity for SMEs. For these reasons, this abstract presents an analysis methodology whose objective is an SME-appropriate approach for efficient, temporarily deployable data collection and evaluation in flexible production and logistics systems as a basis for process analysis and optimization. The overall methodology focuses on retrospective, event-based tracing and analysis of material flow objects. The technological basis consists of Bluetooth Low Energy (BLE) transmitters, so-called beacons, and smart mobile devices (SMDs), e.g. smartphones, as receivers, between which distance data can be measured and motion profiles derived. The distance is determined using the Received Signal Strength Indicator (RSSI), which is a measure of the signal field strength between transmitter and receiver. The focus is the development of a software-based methodology for the interpretation of relative movements of transmitters and receivers based on distance data. The main research concerns the selection and implementation of pattern recognition methods for automatic process recognition as well as methods for the visualization of relative distance data. Since the database is already categorized by process type, classification methods (e.g. support vector machines) from the field of supervised learning are used. The necessary data quality requires the selection of suitable methods as well as filters for smoothing the signal variations of the RSSI, the integration of methods for determining correction factors depending on possible signal interference sources (columns, pallets), and the configuration of the technology used. The parameter settings on which the respective algorithms are based have a further significant influence on the result quality of the classification methods, correction models, and methods for visualizing the position profiles. The accuracy of classification algorithms can be improved by up to 30% through selected parameter variation; this has already been proven in studies. Similar potential can be observed with parameter variation of the methods and filters for signal smoothing. Thus, there is increased interest in obtaining detailed results on the influence of parameter and factor combinations on data quality in this area. The overall methodology is realized with a modular software architecture consisting of independent modules for data acquisition, data preparation, and data storage. The demonstrator for initialization and data acquisition is available as a mobile Java-based application. The data preparation, including methods for signal smoothing, is Python-based, with the possibility to vary parameter settings and store them in the database (SQLite). The evaluation is divided into two separate software modules with database connection: the automated assignment of defined process classes to distance data using selected classification algorithms, and the visualization and reporting via a graphical user interface (GUI).
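
As an illustration of the distance-data basis, RSSI values are typically converted to distance with a log-distance path-loss model after smoothing; a minimal sketch is shown below. The reference RSSI at 1 m, the path-loss exponent, and the smoothing constant are environment-dependent calibration values and are only placeholders here, not the project's configuration.

```python
def rssi_to_distance(rssi_dbm, rssi_at_1m=-59.0, path_loss_exponent=2.0):
    """Log-distance path-loss model: d = 10 ** ((RSSI_1m - RSSI) / (10 * n))."""
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10.0 * path_loss_exponent))

def smooth(values, alpha=0.3):
    """Simple exponential smoothing of the noisy RSSI stream before conversion."""
    out, s = [], values[0]
    for v in values:
        s = alpha * v + (1 - alpha) * s
        out.append(s)
    return out

raw_rssi = [-63, -70, -61, -66, -72, -64, -65]        # dBm samples from one beacon
distances = [rssi_to_distance(r) for r in smooth(raw_rssi)]
print([round(d, 2) for d in distances])               # metres, forming a motion profile
```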

Keywords: event-based tracing, machine learning, process classification, parameter settings, RSSI, signal smoothing

Procedia PDF Downloads 138
14806 Modeling of Building a Conceptual Scheme for Multimodal Freight Transportation Information System

Authors: Gia Surguladze, Nino Topuria, Lily Petriashvili, Giorgi Surguladze

Abstract:

The modeling of the building processes of a multimodal freight transportation support information system is discussed based on modern CASE technologies. The functional efficiencies of ports in the eastern part of the Black Sea are analyzed, taking into account their ecological, seasonal, and resource usage parameters. By resources, we mean the capacities of berths, cranes, and automotive transport, as well as work crews and neighbouring airports. For the purpose of designing the database of the computer support system for the managerial (logistics) function, the use of an Object-Role Modeling (ORM) tool (NORMA, Natural ORM Architecture) is proposed, after which the Entity Relationship Model (ERM) is generated in an automated process. The software is developed based on a process-oriented and service-oriented architecture, in the Visual Studio .NET environment.

Keywords: seaport resources, business-processes, multimodal transportation, CASE technology, object-role model, entity relationship model, SOA

Procedia PDF Downloads 434
14805 Modern Scotland Yard: Improving Surveillance Policies Using Adversarial Agent-Based Modelling and Reinforcement Learning

Authors: Olaf Visker, Arnout De Vries, Lambert Schomaker

Abstract:

Predictive policing refers to the use of analytical techniques to identify potential criminal activity. It has been widely implemented by various police departments. Being a relatively new area of research, there are, to the authors' knowledge, no tried-and-true methods, and existing approaches still exhibit a variety of potential problems. One of those problems is closely related to the lack of understanding of how acting on these predictions influences crime itself. The goal of law enforcement is ultimately crime reduction. As such, a policy needs to be established that best facilitates this goal. This research aims to find such a policy by using adversarial agent-based modeling in combination with modern reinforcement learning techniques. We present baseline models for both law enforcement and criminal agents and compare their performance to their respective reinforcement learning models. The experiments show that our smart law enforcement model is capable of reducing crime by making more deliberate choices regarding the locations of potential criminal activity. Furthermore, it is shown that the smart criminal model presents behavior consistent with popular crime theories and outperforms the baseline model in terms of crimes committed and time to capture. It does, however, still suffer from the difficulties of capturing long-term rewards and learning how to handle multiple opposing goals.
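
The abstract does not name the specific reinforcement learning algorithm; as one common tabular choice, a Q-learning agent would update its action values after each simulated step as

$$ Q(s,a) \leftarrow Q(s,a) + \alpha \left[ r + \gamma \max_{a'} Q(s',a') - Q(s,a) \right], $$

where $s$ and $a$ are the agent's state and action (e.g. a patrol location choice), $r$ the reward (e.g. crimes prevented or committed), $\alpha$ the learning rate, and $\gamma$ the discount factor. The difficulty with long-term rewards mentioned above corresponds to the choice of $\gamma$ and the sparsity of $r$.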

Keywords: adversarial, agent based modelling, predictive policing, reinforcement learning

Procedia PDF Downloads 152
14804 Delineating Floodplain along the Nasia River in Northern Ghana Using HAND Contour

Authors: Benjamin K. Ghansah, Richard K. Appoh, Iliya Nababa, Eric K. Forkuo

Abstract:

The Nasia River is an important source of water for domestic and agricultural purposes for the inhabitants of its catchment. Major farming activities take place within the floodplain of the river and its network of tributaries. The actual inundation extent of the river system is, however, unknown. Reasons for this lack of information include financial constraints and inadequate human resources, as flood modelling is becoming increasingly complex by the day. Knowledge of the inundation extent will help in assessing the risk posed by the annual flooding of the river and in planning flood recession agricultural activities. This study used a simple terrain-based algorithm, Height Above Nearest Drainage (HAND), to delineate the floodplain of the Nasia River and its tributaries. The HAND model is a drainage-normalized digital elevation model whose height reference is based on the local drainage network rather than the average mean sea level (AMSL). The underlying principle guiding the development of the HAND model is that hillslope flow paths behave differently when the reference gradient is to the local drainage network as compared to the seaward gradient. The new terrain model of the catchment was created using NASA’s SRTM Digital Elevation Model (DEM) at 30 m resolution as the only data input. Contours (HAND contours) were then generated from the normalized DEM. Based on a field flood inundation survey, historical information on flooding of the area, and satellite images, a HAND contour of 2 m was found to correlate best with the flood inundation extent of the river and its tributaries. A percentage accuracy of 75% was obtained when the surface area enclosed by the 2 m contour was compared with the surface area of the floodplain computed from a satellite image captured during the peak flooding season in September 2016. It was estimated that the flooding of the Nasia River and its tributaries created a floodplain area of 1011 km².
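
Conceptually, the HAND value of a cell is its elevation minus the elevation of the drainage cell it drains to, and the floodplain is the set of cells at or below the chosen contour (2 m here). The sketch below illustrates this on a tiny grid; for brevity it uses the Euclidean nearest drainage cell rather than tracing hillslope flow paths, and the DEM values and stream mask are made up.

```python
import numpy as np
from scipy import ndimage

# Tiny illustrative DEM (m) and a mask of cells belonging to the stream network.
dem = np.array([[12.0, 11.5, 11.0, 11.8],
                [11.2, 10.4, 10.1, 11.0],
                [10.8, 10.0,  9.8, 10.6],
                [11.5, 10.9, 10.2, 11.2]])
drainage = np.zeros_like(dem, dtype=bool)
drainage[2, 1:3] = True                      # stream cells

# Indices of the nearest drainage cell for every DEM cell (Euclidean approximation).
_, (ri, ci) = ndimage.distance_transform_edt(~drainage, return_indices=True)
hand = dem - dem[ri, ci]                     # height above nearest drainage
floodplain = hand <= 2.0                     # the 2 m HAND contour found in the study
print(np.round(hand, 1))
print(floodplain)
```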

Keywords: digital elevation model, floodplain, HAND contour, inundation extent, Nasia River

Procedia PDF Downloads 459
14803 The Latent Model of Linguistic Features in Korean College Students’ L2 Argumentative Writings: Syntactic Complexity, Lexical Complexity, and Fluency

Authors: Jiyoung Bae, Gyoomi Kim

Abstract:

This study explores a range of linguistic features used in Korean college students’ argumentative writings for the purpose of developing a model that identifies variables which predict writing proficiency. The study investigated the latent variable structure of L2 linguistic features, including syntactic complexity, lexical complexity, and fluency. One hundred forty-six university students in Korea participated in this study. The results of the study’s confirmatory factor analysis (CFA) showed that the indicators of linguistic features from this study provided a foundation for re-categorizing the indicators found in extant research on L2 Korean writers according to each latent variable of linguistic features. The CFA models indicated one measurement model relating L2 syntactic complexity and L2 learners’ writing proficiency; these two latent factors were correlated with each other. Based on the overall findings of the study, the integrated linguistic features of L2 writings suggest some pedagogical implications for L2 writing instruction.

Keywords: linguistic features, syntactic complexity, lexical complexity, fluency

Procedia PDF Downloads 174
14802 The System of Uniform Criteria for the Characterization and Evaluation of Elements of Economic Structure: The Territory, Infrastructure, Processes, Technological Chains, the End Products

Authors: Aleksandr A. Gajour, Vladimir G. Merzlikin, Vladimir I. Veselov

Abstract:

This paper refers to the analysis of the characteristics of the heat-energy objects of industrial and lifestyle facilities as part of the thermal envelope of the Earth's surface, for inclusion in databases for economic forecasting. An idealized model of the Earth's surface is discussed. This model gives the opportunity to obtain the energy equivalent for each element of the terrain and the world ocean. An energy efficiency criterion of comfortable human existence is introduced. The dynamics of this criterion offer the possibility of simulating possible technogenic catastrophes under the spontaneous industrial development of certain areas of the Earth. A calculated model is given, with a confirmed forecast of the Gulf Stream freezing in the polar regions in 2011 due to the disturbance of the heat-energy balance of the oil-polluted oceanic subsurface layer. Two opposing trends of human development, under limited and unlimited amounts of heat-energy resources, are analyzed.

Keywords: Earth's surface, heat-energy consumption, energy criteria, technogenic catastrophes

Procedia PDF Downloads 407
14801 Mapping Alternative Education in Italy: The Case of Popular and Second-Chance Schools and Interventions in Lombardy

Authors: Valeria Cotza

Abstract:

School drop-out is a multifactorial phenomenon that in Italy concerns all those underage students who, at different stages of school (up to 16 years old) or training (up to 18 years old), manifest educational difficulties ranging from dropping out of compulsory education without obtaining a qualification to grade repetition and absenteeism. From the 1980s to the 2000s, there was a progressive attenuation of the economic and social model in favour of a multifactorial reading of the phenomenon, and the European Commission noted the importance of learning about the phenomenon through approaches able to integrate large-scale quantitative surveys with qualitative analyses. It is not only a matter of identifying the contextual factors affecting the phenomenon but of problematising them by means of systemic and comprehensive in-depth analysis. So, a privileged point of observation and field of intervention are those schools that propose alternative models of teaching and learning to the traditional ones, such as popular and second-chance schools. Alternative schools and interventions have grown in recent years in Europe as well as in the US and Latin America, working in the direction of greater equity to create the conditions (often absent in conventional schools) for everyone to achieve educational goals. Despite extensive Anglo-Saxon and US literature on this topic, there is as yet no unambiguous definition of alternative education, especially in Europe, where second-chance education has been most studied. There is little literature on second-chance education in Italy and almost none on alternative education (with the exception of method schools, to which the concept of “alternative” is linked in Italy). This research aims to fill the gap by systematically surveying the alternative interventions in the area and beginning to explore some models of popular and second-chance schools and experiences through a mixed-methods approach. So, the main research objectives concern the spread of alternative education in the Lombardy region, the main characteristics of these schools and interventions, and their effectiveness in terms of students’ well-being and school results. This paper seeks to answer the first point by presenting the preliminary results of the first phase of the project, dedicated to mapping. Through the Google Forms platform, a questionnaire is being distributed to all schools in Lombardy and some schools in the rest of Italy to map the presence of alternative schools and interventions and their main characteristics. The distribution is also taking place thanks to the support of the Milan Territorial and Lombardy Regional School Offices. Moreover, other social realities outside the school system (such as cooperatives and cultural associations) can be surveyed. The schools and other realities to be surveyed outside Lombardy will also be identified with the support of INDIRE (Istituto Nazionale per Documentazione, Innovazione e Ricerca Educativa, “National Institute for Documentation, Innovation and Educational Research”) and on the basis of the existing literature and the indicators of the “Futura” Plan of the PNRR (Piano Nazionale di Ripresa e Resilienza, “National Recovery and Resilience Plan”). Mapping will be crucial and functional for the subsequent qualitative and quantitative phase, which will make use of statistical analysis and constructivist grounded theory.

Keywords: school drop-out, alternative education, popular and second-chance schools, map

Procedia PDF Downloads 92