Search results for: opting for the appropriate approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13875


11925 Studying Language of Immediacy and Language of Distance from a Corpus Linguistic Perspective: A Pilot Study of Evaluation Markers in French Television Weather Reports

Authors: Vince Liégeois

Abstract:

Language of immediacy and distance: Within their discourse theory, Koch & Oesterreicher establish a distinction between a language of immediacy and a language of distance. The former refers to those discourses which are oriented more towards a spoken norm, whereas the latter entails discourses oriented towards a written norm, regardless of whether they are realised phonically or graphically. This means that an utterance can be realised phonically but oriented more towards the written language norm (e.g., a scientific presentation or eulogy) or realised graphically but oriented towards a spoken norm (e.g., a scribble or chat messages). Research desiderata: The methodological approach from Koch & Oesterreicher has often been criticised for not providing a corpus-linguistic methodology, which makes it difficult to work with quantitative data or address large text collections within this research paradigm. Consequently, the Koch & Oesterreicher approach has difficulties gaining ground in those research areas which rely more on corpus linguistic research models, like text linguistics and LSP-research. A combinatory approach: Accordingly, we want to establish a combinatory approach with corpus-based linguistic methodology. To this end, we propose to (i) include data about the context of an utterance (e.g., monologicity/dialogicity, familiarity with the speaker) – which were called “conditions of communication” in the original work of Koch & Oesterreicher – and (ii) correlate the linguistic phenomenon at the centre of the inquiry (e.g., evaluation markers) to a group of linguistic phenomena deemed typical for either distance- or immediacy-language. Based on these two parameters, linguistic phenomena and texts could then be mapped on an immediacy-distance continuum. 
Pilot study: To illustrate the benefits of this approach, we conduct a pilot study on evaluation phenomena in French television weather reports, a form of domain-sensitive discourse which has often been cited as an example of a “text genre”. Within this text genre, we look at so-called “evaluation markers,” e.g., fixed strings like bad weather, stifling hot, and “no luck today!”. These evaluation markers help to communicate the coming weather situation to the lay audience but have not yet been studied within the Koch & Oesterreicher research paradigm. Accordingly, we want to determine whether said evaluation markers are more typical of those weather reports which tend towards immediacy or of those which tend towards distance. To this end, we collected a corpus of different kinds of television weather reports, e.g., as part of the news broadcast, including dialogue. The evaluation markers themselves are studied according to the methodology explained above, by correlating them to (i) metadata about the context and (ii) linguistic phenomena characterising immediacy-language: repetition, deixis (personal, spatial, and temporal), a freer choice of tense, and right-/left-dislocation. Results: Our results indicate that evaluation markers are more dominantly present in those weather reports inclining towards immediacy-language. Based on the methodology established above, we have gained more insight into the workings of evaluation markers in the domain-sensitive text genre of (television) weather reports. For future research, it will be interesting to determine whether said evaluation markers are also typical of immediacy-oriented language in other domain-sensitive discourses.

Keywords: corpus-based linguistics, evaluation markers, language of immediacy and distance, weather reports

Procedia PDF Downloads 217
11924 An Ant Colony Optimization Approach for the Pollution Routing Problem

Authors: P. Parthiban, Sonu Rajak, N. Kannan, R. Dhanalakshmi

Abstract:

This paper deals with the Vehicle Routing Problem (VRP) with environmental considerations, called the Pollution Routing Problem (PRP). The objective is to minimize operational and environmental costs. It consists of routing a number of vehicles to serve a set of customers and determining fuel consumption, driver wages and vehicle speed on each route segment, while respecting capacity constraints and time windows. In this context, we present an Ant Colony Optimization (ACO) approach combined with a Speed Optimization Algorithm (SOA) to solve the PRP. The proposed solution method consists of two stages. In stage one, a Vehicle Routing Problem with Time Windows (VRPTW) is solved using ACO, and in the second stage the SOA is run on the resulting VRPTW solutions. Given a vehicle route, the SOA finds the optimal speed on each arc of the route in order to minimize an objective function comprising fuel consumption costs and driver wages. The proposed algorithm was tested on benchmark problems; the preliminary results show that it is able to provide good solutions.
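
As a minimal sketch of the stage-two idea, the snippet below minimises fuel cost plus driver wages over a discrete speed grid for a single arc. The fuel model and every parameter value (fuel price, hourly wage, engine and drag coefficients, speed bounds) are illustrative assumptions, not the comprehensive emission model used in PRP studies.

```python
def arc_cost(dist_km, speed_kmh, fuel_price=1.4, wage_per_h=8.0,
             engine_k=110.0, drag_b=0.01):
    """Cost of traversing one arc: fuel cost plus driver wages.

    Hypothetical fuel model: litres/km = (engine_k / v + drag_b * v^2) / 1000,
    i.e. an engine term that dominates at low speed and a drag term that
    dominates at high speed.
    """
    hours = dist_km / speed_kmh
    litres = dist_km * (engine_k / speed_kmh + drag_b * speed_kmh ** 2) * 1e-3
    return fuel_price * litres + wage_per_h * hours

def optimal_arc_speed(dist_km, speeds=range(40, 101)):
    # Stage two of the method: pick the speed minimising cost on a fixed arc.
    return min(speeds, key=lambda v: arc_cost(dist_km, v))
```

With these toy coefficients the optimum is interior to the speed window: driving slower saves fuel drag losses but raises wage costs, so the minimiser balances the two.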

Keywords: ant colony optimization, CO2 emissions, combinatorial optimization, speed optimization, vehicle routing

Procedia PDF Downloads 321
11923 Fossil Health: Causes and Consequences of Hegemonic Health Paradigms

Authors: Laila Vivas

Abstract:

Fossil Health is proposed as a value-concept to describe the hegemonic health paradigms that underpin health enactment. Such representation is justified by Foucaldian and related ideas on biopower and biosocialities, calling for the politicization of health and signalling the importance of narratives. This approach, hence, enables contemplating health paradigms as reflexive or co-constitutive of health itself or, in other words, conceiving health as a verb. Fossil health is a symbolic representation, influenced by Andreas Malm’s concept of fossil capitalism, that integrates environment and health as non-dichotomic areas. Fossil Health sustains that current notions of human and non-human health revolve around fossil fuel dependencies. Moreover, addressing disequilibria from established health ideals involves fossil-fixes. Fossil Health, therefore, represents causes and consequences of a health conception that has the agency to contribute to the functioning of a particular structural eco-social model. Moreover, within current capitalist relations, Fossil Health expands its meaning to cover not only fossil implications but also other dominant paradigms of the capitalist system that are (re)produced through health paradigms, such as the burgeoning of technoscience and biomedicalization, privatization of health, expertization of health, or the imposing of standards of uniformity. Overall, Fossil Health is a comprehensive approach to environment and health, where understanding hegemonic health paradigms means understanding our (human-non-human) nature paradigms and the structuring effect these narratives convey.

Keywords: fossil health, environment, paradigm, capitalism

Procedia PDF Downloads 119
11922 Prediction of the Disability-Adjusted Burden of Mental Illness Using Machine Learning

Authors: S. R. M. Krishna, R. Santosh Kumar, V. Kamakshi Prasad

Abstract:

Machine learning techniques are applied to analyse the impact of mental illness on the burden of disease, which is calculated using the disability-adjusted life year (DALY). The DALYs for a disease are the sum of the years of life lost due to premature mortality (YLL) and the years of healthy life lost due to disability (YLD). A critical analysis is done based on the data sources, machine learning techniques and feature extraction methods, with the review drawing on the major databases. The extracted data are examined using statistical analysis, and machine learning techniques are then applied. Predicting the impact of mental illness on the population using machine learning is an alternative to the traditional strategies, which are time-consuming and may not be reliable. The approach requires comprehensive adoption, innovative algorithms, and an understanding of its limitations and challenges. The resulting prediction is a way of understanding the underlying impact of mental illness on people's health, and it enables us to estimate healthy life expectancy. The growing impact of mental illness, and the challenges associated with detecting and treating mental disorders, make it necessary to understand its complete effect on the majority of the population.
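
The DALY arithmetic stated in the abstract (DALY = YLL + YLD) can be sketched directly; the numbers in the usage note are invented for illustration, not figures from the study.

```python
def yll(deaths, life_expectancy_at_death):
    # Years of Life Lost: deaths x standard remaining life expectancy.
    return deaths * life_expectancy_at_death

def yld(cases, disability_weight, duration_years):
    # Years Lived with Disability: cases x severity weight x average duration.
    return cases * disability_weight * duration_years

def daly(deaths, life_expectancy_at_death, cases, disability_weight, duration_years):
    # DALY = YLL + YLD, as defined in the abstract.
    return (yll(deaths, life_expectancy_at_death)
            + yld(cases, disability_weight, duration_years))
```

For example, 10 deaths at 30 years of remaining life expectancy plus 1,000 cases with disability weight 0.2 lasting 5 years gives 300 + 1,000 = 1,300 DALYs.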

Keywords: machine learning, DALY, YLD, YLL

Procedia PDF Downloads 34
11921 Second Order Cone Optimization Approach to Two-stage Network DEA

Authors: K. Asanimoghadam, M. Salahi, A. Jamalian

Abstract:

Data envelopment analysis (DEA) is an approach to measuring the efficiency of decision-making units with multiple inputs and outputs. Many decision-making units also contain decision-making subunits that are not considered in most DEA models. Moreover, the inputs and outputs of decision-making units are usually assumed to be desirable, while in some real-world problems the nature of some inputs or outputs is undesirable. In this work, we study the efficiency evaluation of two-stage decision-making units where some outputs are undesirable, using two non-radial models, the SBM and the ASBM models. We formulate the nonlinear ASBM model as a second order cone optimization problem. Finally, we compare the two models under both external and internal evaluation approaches for two real-world examples in the presence of undesirable outputs. The results show that, in both external and internal evaluations, the overall efficiency of the ASBM model is greater than or equal to that of the SBM model, and that in internal evaluation the ASBM model is more flexible than the SBM model.

Keywords: network DEA, conic optimization, undesirable output, SBM

Procedia PDF Downloads 193
11920 A Weighted Group EI Incorporating Role Information for More Representative Group EI Measurement

Authors: Siyu Wang, Anthony Ward

Abstract:

Emotional intelligence (EI) is a well-established personal characteristic. It has been viewed as a critical factor that can influence an individual's academic achievement, ability to work and potential to succeed. When working in a group, EI is fundamentally connected to the group members' interaction and ability to work as a team. The ability of a group member to intelligently perceive and understand their own emotions (intrapersonal EI), to intelligently perceive and understand other members' emotions (interpersonal EI), and to intelligently perceive and understand emotions between different groups (cross-boundary EI) can be considered group emotional intelligence (Group EI). In this research, a more representative Group EI measurement approach is proposed, which incorporates information about the composition of a group and an individual's role in that group. To demonstrate the claim that this is a more representative Group EI measurement approach, the study adopts a multi-method research design, combining qualitative and quantitative techniques to establish a metric of Group EI. From the results, it can be concluded that by introducing a weight coefficient for each group member's contribution to group work into the measurement of Group EI, Group EI becomes more representative and more capable of capturing what happens during teamwork than previous approaches.
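
A role-weighted aggregation of the kind described can be sketched as below; the three-dimensional score tuple follows the intrapersonal/interpersonal/cross-boundary split from the abstract, while the scoring scale and the weighting scheme are illustrative assumptions, not the metric established in the study.

```python
def weighted_group_ei(member_scores, role_weights):
    """Aggregate per-member EI scores into a Group EI profile.

    member_scores: {name: (intrapersonal, interpersonal, cross_boundary)}
    role_weights:  {name: weight reflecting the member's role in the group}
    Returns the weighted mean of each EI dimension across members.
    """
    total = sum(role_weights.values())
    profile = [0.0, 0.0, 0.0]
    for name, scores in member_scores.items():
        share = role_weights[name] / total  # normalised role weight
        for i, s in enumerate(scores):
            profile[i] += share * s
    return profile
```

A member with a heavier role weight (e.g. a team lead) then pulls the group profile towards their individual scores, which is the intuition behind weighting by role.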

Keywords: case study, emotional intelligence, group EI, multi-method research

Procedia PDF Downloads 121
11919 An Interpretive Study of Entrepreneurial Experience towards Achieving Business Growth Using the Theory of Planned Behaviour as a Lens

Authors: Akunna Agunwah, Kevin Gallimore, Kathryn Kinmond

Abstract:

Entrepreneurship is widely seen as a vehicle for economic growth; however, different scholars have studied entrepreneurship from various perspectives, resulting in multiple definitions. Surprisingly, most definitions of entrepreneurship do not incorporate growth, even though economic growth is driven by the activities of entrepreneurs. The purpose of the present study is to explore the working practices of successful entrepreneurs towards achieving business growth by understanding their experiences, using the Theory of Planned Behaviour (TPB) as a lens. Ten successful entrepreneurs in the North West of England, from various business sectors, were interviewed using semi-structured interviews. The recorded audio interviews were transcribed and subsequently evaluated using a thematic deductive technique (a qualitative approach). The themes were examined through the Theory of Planned Behaviour to ascertain the presence of its three intentional antecedents (attitude, subjective norms, and perceived behavioural control). The findings are twofold. Firstly, the three intentional antecedents that make up the Theory of Planned Behaviour were evident in the transcripts. Secondly, the entrepreneurs are most concerned with achieving a state of freedom and realising their visions and ambitions; nevertheless, they employ these intentional antecedents to enhance business growth. In conclusion, the work presented here shows a novel way of understanding the working practices and experiences of entrepreneurs by applying the Theory of Planned Behaviour in a qualitative approach to business growth; few qualitative studies exist in entrepreneurship research.
In addition, this work applies a novel approach to studying the experience of entrepreneurs by examining the working practices of successful entrepreneurs in North-West England through the lens of the Theory of Planned Behaviour. Given the findings regarding TPB as a lens, the entrepreneur does not differentiate between the categories of the antecedents but rather sees them as processes that can be utilised to enhance business growth.

Keywords: business growth, experience, interpretive, theory of planned behaviour

Procedia PDF Downloads 213
11918 Lean Thinking and E-Commerce as New Opportunities to Improve Partnership in Supply Chain of Construction Industries

Authors: Kaustav Kundu, Alberto Portioli Staudacher

Abstract:

The construction industry plays a vital role in the world economy. However, due to high uncertainty and variability in the industry, its performance in terms of quality, lead times, productivity and costs is not as efficient as that of other industries. Moreover, there are continuous conflicts among the different actors in construction supply chains over profit sharing. Previous studies have suggested partnership as an important approach to promote cooperation among the different actors in construction supply chains and thereby improve overall performance. Construction practitioners have tried to focus on partnership as a way to enhance the performance of construction supply chains, but they are not fully aware of the different approaches and techniques for improving partnership. In this research, a systematic review of partnership in relation to construction supply chains is carried out to understand the different elements influencing partnership. The development of this research domain is analyzed by reviewing selected articles published from 1996 to 2015. Based on these papers, three major elements influencing partnership in construction supply chains are identified: “lean approach”, “relationship building” and “e-commerce applications”. This study analyses the contributions within each element and provides suggestions for future development of partnership in construction supply chains.

Keywords: partnership, construction, lean, SCM, supply chain management

Procedia PDF Downloads 433
11917 The Reality of Engineering Education in the Kingdom of Saudi Arabia and Its Suitability to the Requirements of the Labor Market

Authors: Hamad Albadr

Abstract:

Universities have shifted from a cognitive mission of preserving knowledge and the culture of the community towards the responsibility of preparing graduates to work according to the needs of community development; universities in today's world are the prime mover of the wheel of development in the community, finding appropriate solutions to the problems it faces and adapting to the demands of a changing environment. This paper reviews the reality of engineering education in the Kingdom of Saudi Arabia and its suitability to the requirements of the labor market. The university is treated as an educational system using the System Analysis Approach, one of the methods of modern management for analyzing the performance of organizations and institutions and for administrative and quality assessment. Under this approach, the system is treated as a set of subsystems divided into inputs, processes and outputs, together with the surrounding environment. A descriptive and analytical research method is also used to gather information and data and to analyze the answers of the study population, which consists of a random sample of about 500 professionals concerned with employment in the business sector, as beneficiaries of the services the universities provide.

Keywords: universities in Saudi Arabia, engineering education, labor market, administrative, quality assessment

Procedia PDF Downloads 339
11916 About Multi-Resolution Techniques for Large Eddy Simulation of Reactive Multi-Phase Flows

Authors: Giacomo Rossi, Bernardo Favini, Eugenio Giacomazzi, Franca Rita Picchia, Nunzio Maria Salvatore Arcidiacono

Abstract:

A numerical technique for mesh refinement in the HeaRT (Heat Release and Transfer) numerical code is presented. In the CFD framework, the Large Eddy Simulation (LES) approach is gaining importance as a tool for simulating turbulent combustion processes, although it has a high computational cost due to the complexity of the turbulence modeling and the high number of grid points necessary to obtain a good numerical solution. In particular, when a numerical simulation of a large domain is performed with a structured grid, the number of grid points can grow so much that the simulation becomes infeasible; this problem can be overcome with a mesh refinement technique. The mesh refinement technique developed for the HeaRT numerical code (a staggered finite difference code) is based on a high order reconstruction of the variables at the grid interfaces by means of a least squares quasi-ENO interpolation. The code is written in modern Fortran (the 2003 standard or newer) and is parallelized using domain decomposition and the Message Passing Interface (MPI) standard.

Keywords: LES, multi-resolution, ENO, fortran

Procedia PDF Downloads 364
11915 Ensemble of Deep CNN Architecture for Classifying the Source and Quality of Teff Cereal

Authors: Belayneh Matebie, Michael Melese

Abstract:

The study focuses on addressing the challenges in classifying and ensuring the quality of Eragrostis teff, a small, round grain that is the smallest of the cereal grains. Employing a traditional classification method is challenging because of its small size and the similarity of its environmental characteristics. To overcome this, this study employs a machine learning approach to develop a source and quality classification system for teff cereal. Data were collected from various production areas in the Amhara region, considering two quality levels of cereal (high and low) across eight classes. A total of 5,920 images were collected, 740 for each class. Image enhancement techniques, including scaling, data augmentation, histogram equalization, and noise removal, were applied to preprocess the data. A Convolutional Neural Network (CNN) was then used to extract relevant features and reduce dimensionality. The dataset was split into 80% for training and 20% for testing. Different classifiers, including FVGG16, FINCV3, QSCTC, EMQSCTC, SVM, and RF, were employed for classification, achieving accuracy rates ranging from 86.91% to 97.72%. The ensemble of FVGG16, FINCV3, and QSCTC using the max-voting approach outperforms the individual algorithms.
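
The max-voting fusion step can be sketched generically as below; the classifier names come from the abstract, but the per-sample labels in the usage note are invented, and this is the textbook majority-vote rule rather than the authors' exact implementation.

```python
from collections import Counter

def max_vote(model_predictions):
    """Fuse per-model label predictions by majority (max) voting.

    model_predictions: list of lists, one inner list of labels per model
    (e.g. one each for FVGG16, FINCV3 and QSCTC), all over the same samples.
    Returns one fused label per sample.
    """
    fused = []
    for sample_votes in zip(*model_predictions):
        # Counter.most_common(1) picks the label with the most votes.
        fused.append(Counter(sample_votes).most_common(1)[0][0])
    return fused
```

With three models, any label agreed on by at least two of them wins the vote for that sample.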

Keywords: Teff, ensemble learning, max-voting, CNN, SVM, RF

Procedia PDF Downloads 52
11914 Modelling Causal Effects from Complex Longitudinal Data via Point Effects of Treatments

Authors: Xiaoqin Wang, Li Yin

Abstract:

Background and purpose: In many applications, one estimates causal effects arising from a complex stochastic process in which a sequence of treatments is assigned to influence a certain outcome of interest, with time-dependent covariates between treatments. When covariates are plentiful and/or continuous, statistical modeling is needed to reduce the huge dimensionality of the problem and allow the estimation of causal effects. Recently, Wang and Yin (Annals of Statistics, 2020) derived a new general formula which expresses these causal effects in terms of the point effects of treatments in single-point causal inference. As a result, it is possible to conduct the modeling via point effects. The purpose of this work is to study the modeling of these causal effects via point effects. Challenges and solutions: The time-dependent covariates are often influenced by earlier treatments and in turn influence subsequent treatments. Consequently, the standard parameters (the mean of the outcome given all treatments and covariates) are essentially all different (the null paradox), and their dimension is huge (the curse of dimensionality). Therefore, it can be difficult to conduct the modeling in terms of standard parameters. Instead, we use the point effects of treatments to develop a likelihood-based parametric approach to the modeling of these causal effects, and we are able to model the causal effects of a sequence of treatments by modeling a small number of point effects of individual treatments. Achievements: We are able to conduct the modeling of causal effects from a sequence of treatments in the familiar framework of single-point causal inference. The simulation shows that our method achieves not only an unbiased estimate of the causal effect but also the nominal level of type I error and a low level of type II error in hypothesis testing.
We have applied this method to a longitudinal study of COVID-19 mortality among the Scandinavian countries and found that the Swedish approach performed far worse than the other countries' approaches for COVID-19 mortality, and that this poor performance was largely due to its early measures during the initial period of the pandemic.

Keywords: causal effect, point effect, statistical modelling, sequential causal inference

Procedia PDF Downloads 205
11913 Integrated Approach of Quality Function Deployment, Sensitivity Analysis and Multi-Objective Linear Programming for Business and Supply Chain Programs Selection

Authors: T. T. Tham

Abstract:

The aim of this study is to propose an integrated approach to determine the most suitable programs, based on Quality Function Deployment (QFD), Sensitivity Analysis (SA) and Multi-Objective Linear Programming model (MOLP). Firstly, QFD is used to determine business requirements and transform them into business and supply chain programs. From the QFD, technical scores of all programs are obtained. All programs are then evaluated through five criteria (productivity, quality, cost, technical score, and feasibility). Sets of weight of these criteria are built using Sensitivity Analysis. Multi-Objective Linear Programming model is applied to select suitable programs according to multiple conflicting objectives under a budget constraint. A case study from the Sai Gon-Mien Tay Beer Company is given to illustrate the proposed methodology. The outcome of the study provides a comprehensive picture for companies to select suitable programs to obtain the optimal solution according to their preference.
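
As a simplified illustration of the final selection step, the sketch below replaces the full multi-objective linear program with a single weighted-sum score maximised over program subsets under a budget constraint (brute-force, so only suitable for small program lists). All program names, scores, criterion weights and the budget are invented for the example, not data from the case study.

```python
from itertools import combinations

def select_programs(programs, weights, budget):
    """Pick the subset of programs maximising total weighted score within budget.

    programs: {name: {criterion: score, ..., 'cost': cost}}
    weights:  {criterion: weight} as produced e.g. by a sensitivity analysis.
    """
    def score(p):
        return sum(weights[c] * p[c] for c in weights)

    best, best_val = set(), float('-inf')
    names = list(programs)
    for r in range(len(names) + 1):
        for combo in combinations(names, r):
            cost = sum(programs[n]['cost'] for n in combo)
            if cost <= budget:  # budget constraint
                val = sum(score(programs[n]) for n in combo)
                if val > best_val:
                    best, best_val = set(combo), val
    return best, best_val
```

The weighted-sum scalarisation used here is only one way to collapse multiple objectives; the MOLP in the study handles the conflicting objectives directly.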

Keywords: business program, multi-objective linear programming model, quality function deployment, sensitivity analysis, supply chain management

Procedia PDF Downloads 122
11912 A TFETI Domain Decomposition Solver for von Mises Elastoplasticity Model with Combination of Linear Isotropic-Kinematic Hardening

Authors: Martin Cermak, Stanislav Sysala

Abstract:

In this paper we present an efficient parallel implementation for elastoplastic problems based on the TFETI (Total Finite Element Tearing and Interconnecting) domain decomposition method. This approach allows us to solve the problem in parallel on supercomputers, decreasing the solution time and enabling problems with millions of DOFs. We consider an associated elastoplastic model with the von Mises plastic criterion and a combination of linear isotropic and kinematic hardening laws. This model is discretized by the implicit Euler method in time and by the finite element method in space. The resulting system of nonlinear equations has a strongly semismooth and strongly monotone operator, and the semismooth Newton method is applied to solve it. The corresponding linearized problems arising in the Newton iterations are solved in parallel by the above-mentioned TFETI method. The implementation is realized in our in-house MatSol packages developed in MATLAB.

Keywords: isotropic-kinematic hardening, TFETI, domain decomposition, parallel solution

Procedia PDF Downloads 419
11911 Coupling Static Multiple Light Scattering Technique With the Hansen Approach to Optimize Dispersibility and Stability of Particle Dispersions

Authors: Guillaume Lemahieu, Matthias Sentis, Giovanni Brambilla, Gérard Meunier

Abstract:

Static Multiple Light Scattering (SMLS) has been shown to be a straightforward technique for the characterization of colloidal dispersions without dilution, as multiply scattered light in backscattered and transmitted mode is directly related to the concentration and size of scatterers present in the sample. In this view, the use of SMLS for stability measurement of various dispersion types has already been widely described in the literature. Indeed, starting from a homogeneous dispersion, the variation of backscattered or transmitted light can be attributed to destabilization phenomena, such as migration (sedimentation, creaming) or particle size variation (flocculation, aggregation). In a view to investigating more on the dispersibility of colloidal suspensions, an experimental set-up for “at the line” SMLS experiment has been developed to understand the impact of the formulation parameters on particle size and dispersibility. The SMLS experiment is performed with a high acquisition rate (up to 10 measurements per second), without dilution, and under direct agitation. Using such experimental device, SMLS detection can be combined with the Hansen approach to optimize the dispersing and stabilizing properties of TiO₂ particles. It appears that the dispersibility and the stability spheres generated are clearly separated, arguing that lower stability is not necessarily a consequence of poor dispersibility. Beyond this clarification, this combined SMLS-Hansen approach is a major step toward the optimization of dispersibility and stability of colloidal formulations by finding solvents having the best compromise between dispersing and stabilizing properties. 
Such studies can be used to find better dispersion media and greener, cheaper solvents to optimize particle suspensions, to reduce the content of costly stabilizing additives, or to satisfy evolving product regulatory requirements in the various industrial fields that use suspensions (paints & inks, coatings, cosmetics, energy).

Keywords: dispersibility, stability, Hansen parameters, particles, solvents

Procedia PDF Downloads 107
11910 A Dynamical Approach for Relating Energy Consumption to Hybrid Inventory Level in the Supply Chain

Authors: Benga Ebouele, Thomas Tengen

Abstract:

Due to long lead times, work-in-process (WIP) inventory can build up within the supply chain of most manufacturing systems. This implies that there are fewer finished goods on hand and more in process, because work remains in the factory too long and cannot be sold to customers. The supply chain of such a manufacturing system is then considered inefficient, as it takes so much time to produce the finished goods. The time consumed in each operation of the supply chain has an associated energy cost. This phenomenon can be harmful for a hybrid inventory system, because a lot of space may be needed to store these semi-finished goods and one is not sure about the final energy cost of producing, holding and delivering the goods to customers. A principle that reduces the waste of energy within the supply chain should therefore be available to all inventory managers in pursuit of profitability. Decision making by inventory managers in this situation is a modeling process, whereby a dynamical approach is used to depict, examine, specify and even operationalize the relationship between energy consumption and hybrid inventory level. The relationship between energy consumption and inventory level is established, and it indicates a poor level of control and hence a potential for energy savings.
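
A minimal dynamical sketch of the energy-inventory relationship might look like the following: inventory evolves period by period from production and demand, and cumulative energy accrues from processing and holding. The linear energy model and both coefficients (`e_hold`, `e_proc`) are illustrative assumptions, not the model from the paper.

```python
def simulate_energy(production, demand, e_hold=0.5, e_proc=2.0):
    """Track inventory level and cumulative energy use over periods.

    production, demand: per-period quantities (equal-length sequences).
    e_proc: energy per unit produced; e_hold: energy per unit held per period.
    Returns a list of (inventory, cumulative_energy) pairs.
    """
    inventory, energy = 0.0, 0.0
    trace = []
    for produced, sold in zip(production, demand):
        inventory = max(inventory + produced - sold, 0.0)  # no backorders
        energy += e_proc * produced + e_hold * inventory
        trace.append((inventory, energy))
    return trace
```

In this toy model, producing faster than demand makes inventory, and hence the holding-energy term, grow every period, which is the waste the abstract argues a dynamical analysis can expose.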

Keywords: dynamic modelling, energy used, hybrid inventory, supply chain

Procedia PDF Downloads 265
11909 Active Contours for Image Segmentation Based on Complex Domain Approach

Authors: Sajid Hussain

Abstract:

A complex domain approach for image segmentation based on active contours has been designed, which deforms step by step to partition an image into expedient regions. A novel region-based trigonometric complex pressure force function is proposed, which propagates around the region of interest using image forces. The signed trigonometric force function controls the propagation of the active contour, and the contour stops accurately on the exact edges of the object. The proposed model makes the level set function binary and uses a Gaussian smoothing kernel to regularize it and avoid the re-initialization procedure. The working principle of the proposed model is as follows: the real image data is transformed into complex data by multiplying it by iota (i), and the average of i times the horizontal and vertical components of the gradient of the image data is inserted into the model to capture the complex gradient of the image data. A simple finite difference technique has been used to implement the proposed model. The efficiency and robustness of the model have been verified and compared with other state-of-the-art models.

Keywords: image segmentation, active contour, level set, Mumford and Shah model

Procedia PDF Downloads 112
11908 High Performance Field Programmable Gate Array-Based Stochastic Low-Density Parity-Check Decoder Design for IEEE 802.3an Standard

Authors: Ghania Zerari, Abderrezak Guessoum, Rachid Beguenane

Abstract:

This paper introduces a high-performance architecture for a fully parallel stochastic Low-Density Parity-Check (LDPC) decoder on a field programmable gate array (FPGA). The new approach is designed to decrease the decoding latency and to reduce FPGA logic utilisation. To accomplish the targeted reduction in logic utilisation, the routing of the proposed sub-variable node (VN) internal memory is designed to utilise one slice of distributed RAM. Furthermore, VN initialization using the channel input probability is employed to enhance decoder convergence, without extra resources and without integrating output saturated counters. The Xilinx FPGA implementation of the IEEE 802.3an standard LDPC code shows that the proposed decoding approach attains high performance along with reduced FPGA logic utilisation.
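
The core idea behind stochastic decoding is that a probability is represented as a random bit stream whose density equals the probability, so probability multiplication reduces to a bitwise AND. The sketch below is a software illustration of that representation only, not the decoder architecture of the paper.

```python
import random

def prob_to_stream(p, n, rng):
    # Encode probability p as a length-n Bernoulli bit stream.
    return [1 if rng.random() < p else 0 for _ in range(n)]

def stream_and(a, b):
    # Multiplying probabilities becomes a bitwise AND of independent streams.
    return [x & y for x, y in zip(a, b)]

def stream_prob(stream):
    # Recover the probability estimate as the density of ones.
    return sum(stream) / len(stream)
```

For independent streams encoding p and q, the AND stream has density approximately p*q, which is why stochastic node hardware can be so compact.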

Keywords: low-density parity-check (LDPC) decoder, stochastic decoding, field programmable gate array (FPGA), IEEE 802.3an standard

Procedia PDF Downloads 295
11907 Use of Multistage Transition Regression Models for Credit Card Income Prediction

Authors: Denys Osipenko, Jonathan Crook

Abstract:

Because of the variety of card holders’ behaviour types and income sources, each consumer account can move between a variety of states: inactive, transactor, revolver, delinquent and defaulted, and each state requires an individual model for income prediction. Estimating the transition probabilities between states at the account level helps to avoid the memorylessness of the Markov chain approach. This paper investigates transition probability estimation approaches for credit card income prediction at the account level. The key empirical question is which approach gives more accurate results: multinomial logistic regression or multistage conditional logistic regression with a binary target. Both models have shown moderate predictive power. Prediction accuracy for the conditional logistic regression depends on the order of stages in the conditional binary logistic regressions. On the other hand, multinomial logistic regression is easier to use and gives integrated estimates for all states without prioritisation. Further investigations can therefore be concentrated on alternative modelling approaches such as discrete choice models.
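As a point of reference for the transition-probability idea, the pure-Python sketch below estimates an empirical first-order (memoryless) transition matrix over the account states named in the abstract. The account histories are invented for illustration; the paper's multistage conditional models exist precisely to improve on this memoryless baseline.

```python
STATES = ["inactive", "transactor", "revolver", "delinquent", "defaulted"]

def transition_matrix(sequences):
    """Empirical first-order estimate of P(next state | current state).
    This is the memoryless Markov baseline that account-level conditional
    models are intended to improve upon."""
    counts = {s: {t: 0 for t in STATES} for s in STATES}
    for seq in sequences:
        for cur, nxt in zip(seq, seq[1:]):
            counts[cur][nxt] += 1
    probs = {}
    for s in STATES:
        total = sum(counts[s].values())
        if total:  # keep only states actually observed as a current state
            probs[s] = {t: counts[s][t] / total for t in STATES}
    return probs

# invented monthly account histories, for illustration only
histories = [
    ["transactor", "transactor", "revolver", "revolver", "delinquent"],
    ["transactor", "revolver", "revolver", "transactor", "transactor"],
    ["inactive", "inactive", "transactor", "transactor", "revolver"],
]
P = transition_matrix(histories)
```

Each row of `P` sums to one; conditioning these rows on account covariates (via multinomial or staged binary logistic regressions) is what moves the model beyond memorylessness.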

Keywords: multinomial regression, conditional logistic regression, credit account state, transition probability

Procedia PDF Downloads 484
11906 Mid-Temperature Methane-Based Chemical Looping Reforming for Hydrogen Production via Iron-Based Oxygen Carrier Particles

Authors: Yang Li, Mingkai Liu, Qiong Rao, Zhongrui Gai, Ying Pan, Hongguang Jin

Abstract:

Hydrogen is an ideal and potential energy carrier due to its high energy efficiency and low pollution. An alternative and promising approach to hydrogen generation is the chemical looping steam reforming of methane (CL-SRM) over iron-based oxygen carriers. However, the process faces challenges such as a high reaction temperature (>850 ℃) and low methane conversion. We demonstrate that Ni-mixed Fe-based oxygen carrier particles significantly improve the methane conversion and hydrogen production rate in the range of 450-600 ℃ under atmospheric pressure. The effect of different Ni-based particle mass ratios on the reactivity of the oxygen carrier particles has been determined in a continuous unit. More than 85% methane conversion has been achieved at 600 ℃, and hydrogen can be produced in both the reduction and oxidation steps. Moreover, the iron-based oxygen carrier particles exhibited good cyclic performance over 150 consecutive redox cycles at 600 ℃. The mid-temperature iron-based oxygen carrier particles, integrated with a moving-bed chemical looping system, might provide a powerful approach toward more efficient and scalable hydrogen production.

Keywords: chemical looping, hydrogen production, mid-temperature, oxygen carrier particles

Procedia PDF Downloads 138
11905 Enhancing the Effectiveness of Air Defense Systems through Simulation Analysis

Authors: F. Felipe

Abstract:

Air Defense Systems contain high-value assets that are expected to fulfill their mission for several years, in many cases even decades, while operating in a fast-changing, technology-driven environment. Thus, it is paramount that decision-makers can assess how effective an Air Defense System is in the face of newly developing threats, as well as identify the bottlenecks that could jeopardize the security of a country's airspace. Given the broad extent of activities and the great variety of assets necessary to achieve the strategic objectives, a systems approach was taken to delineate the core requirements and the physical architecture of an Air Defense System. Value-focused thinking then helped in the definition of the measures of effectiveness, and analytical methods were applied to create a formal structure that preliminarily assesses such measures. To validate the proposed methodology, a simulation was also used to determine the measures of effectiveness, now in more complex environments that incorporate both uncertainty and multiple interactions between the entities. The results regarding the validity of this methodology suggest that the approach can support decisions aimed at enhancing the capabilities of Air Defense Systems. In conclusion, this paper sheds some light on how consolidated approaches from Systems Engineering and Operations Research can be used as valid techniques for solving problems regarding a complex and yet vital matter.
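The paper's simulation model is not described in detail, but the flavour of a simulation-derived measure of effectiveness can be conveyed with a toy Monte Carlo sketch: a batch of incoming threats is engaged by a limited interceptor stock under a shoot-look-shoot doctrine, and the MOE is the fraction of threats destroyed. All numbers (kill probability, salvo size, stock) are invented for illustration.

```python
import random

def effectiveness(n_threats, n_interceptors, p_kill, trials=20000, seed=42):
    """Monte Carlo estimate of a simple measure of effectiveness: the
    fraction of incoming threats destroyed when each threat is engaged
    with up to two shots while the interceptor stock lasts.  A toy
    stand-in for the richer simulation described in the abstract."""
    rng = random.Random(seed)
    destroyed = 0
    for _ in range(trials):
        stock = n_interceptors
        for _ in range(n_threats):
            for _ in range(2):            # shoot-look-shoot, max two shots
                if stock == 0:
                    break
                stock -= 1
                if rng.random() < p_kill:
                    destroyed += 1
                    break
    return destroyed / (trials * n_threats)

moe = effectiveness(n_threats=10, n_interceptors=16, p_kill=0.7)
```

With unlimited stock the per-threat kill probability would be 1 - 0.3² = 0.91; the simulation quantifies how the limited stock erodes that figure, which is the kind of bottleneck the methodology is meant to surface.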

Keywords: air defense, effectiveness, system, simulation, decision-support

Procedia PDF Downloads 155
11904 System Identification of Building Structures with Continuous Modeling

Authors: Ruichong Zhang, Fadi Sawaged, Lotfi Gargab

Abstract:

This paper introduces a wave-based approach for system identification of high-rise building structures with a pair of seismic recordings, which can be used to evaluate structural integrity and detect damage in post-earthquake structural condition assessment. The approach is founded on wave features of generalized impulse and frequency response functions (GIRF and GFRF), i.e., wave responses at one structural location to an impulsive motion at another reference location in the time and frequency domains respectively. With a pair of seismic recordings at the two locations, GFRF is obtainable as the Fourier spectral ratio of the two recordings, and GIRF is then found by inverse Fourier transformation of GFRF. With an appropriate continuous model for the structure, a closed-form solution of GFRF, and subsequently GIRF, can also be found in terms of wave transmission and reflection coefficients, which are related to the structural physical properties above the impulse location. Matching the two sets of GFRF and/or GIRF from recordings and from the model helps identify structural parameters such as wave velocity or shear modulus. For illustration, this study examines the ten-story Millikan Library in Pasadena, California with recordings of the Yorba Linda earthquake of September 3, 2002. The building is modelled as piecewise continuous layers, with which GFRF is derived as a function of such building parameters as impedance, cross-sectional area, and damping. GIRF can then be found in closed form for some special cases and numerically in general. Not only does this study reveal the influential building parameters in the wave features of GIRF and GFRF, it also shows some system-identification results, which are consistent with other vibration- and wave-based results. Finally, this paper discusses the effectiveness of the proposed model in system identification.
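The spectral-ratio computation of GFRF and GIRF described above can be sketched directly in NumPy. Here a synthetic pair of "recordings" (the roof motion is a circularly delayed copy of the basement motion) stands in for the real seismic data, so the GIRF should peak at the wave travel time between the two locations.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic recordings: roof motion = basement motion delayed by a few
# samples, a crude stand-in for a wave travelling up the building.
n, delay = 512, 5
basement = rng.standard_normal(n)
roof = np.roll(basement, delay)

# GFRF: Fourier spectral ratio of the two recordings.  A tiny epsilon
# guards against division by near-zero spectral values.
B = np.fft.rfft(basement)
R = np.fft.rfft(roof)
gfrf = R / (B + 1e-12)

# GIRF: inverse Fourier transform of the GFRF.  For a pure delay it is an
# impulse at the delay, i.e. the wave travel time in samples.
girf = np.fft.irfft(gfrf, n=n)
travel_time = int(np.argmax(np.abs(girf)))
```

Dividing by the travel time (times the sampling interval) into the known inter-storey height yields the wave velocity, one of the structural parameters the paper identifies.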

Keywords: wave-based approach, seismic responses of buildings, wave propagation in structures, construction

Procedia PDF Downloads 232
11903 Determining the Width and Depths of Cut in Milling on the Basis of a Multi-Dexel Model

Authors: Jens Friedrich, Matthias A. Gebele, Armin Lechler, Alexander Verl

Abstract:

Chatter vibrations and process instabilities are the most important factors limiting the productivity of the milling process. Chatter can lead to damage of the tool, the part or the machine tool. Therefore, the estimation and prediction of process stability is very important. The process stability depends on the spindle speed, the depth of cut and the width of cut. In milling, the process conditions are defined in the NC-program. While the spindle speed is directly coded in the NC-program, the depth and width of cut are unknown. This paper presents a new simulation-based approach for the prediction of the depth and width of cut of a milling process. The prediction is based on a material removal simulation with an analytically represented tool shape and a multi-dexel approach for the workpiece. The new calculation method allows the direct estimation of the depth and width of cut, which are the influencing parameters of process stability, instead of the removed volume as in existing approaches. This knowledge can be used to predict the stability of new, unknown parts. Moreover, with an additional vibration sensor, the stability lobe diagram of a milling process can be estimated and improved based on the estimated depth and width of cut.
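A minimal single-dexel-per-cell sketch (a deliberate simplification of the paper's multi-dexel model) shows how the depth and width of cut fall directly out of a material removal step with an analytically represented cylindrical tool. Grid size, spacing and tool numbers below are invented for illustration.

```python
import numpy as np

# A 2.5-D dexel field: one material height per (x, y) grid cell, a
# simplification of the full multi-dexel workpiece model.
nx, ny, spacing = 40, 40, 0.5          # grid size and dexel spacing in mm
stock = np.full((nx, ny), 10.0)        # 10 mm tall block of stock

def mill(stock, cx, cy, radius, z_tool):
    """Trim every dexel under the (analytically represented) cylindrical
    tool down to the tool tip height and report the engagement."""
    xs = (np.arange(nx) - cx) * spacing
    ys = (np.arange(ny) - cy) * spacing
    inside = xs[:, None] ** 2 + ys[None, :] ** 2 <= radius ** 2
    removal = np.maximum(stock - z_tool, 0.0) * inside
    engaged = removal > 0
    depth_of_cut = removal.max()                        # axial depth a_p
    width_of_cut = engaged.any(axis=0).sum() * spacing  # radial span a_e
    stock[engaged] = z_tool
    return depth_of_cut, width_of_cut

ap, ae = mill(stock, cx=20, cy=20, radius=3.0, z_tool=8.0)
```

The pair (a_p, a_e), together with the spindle speed from the NC-program, is exactly what a stability lobe diagram needs, which is the paper's point: estimate the engagement, not just the removed volume.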

Keywords: dexel, process stability, material removal, milling

Procedia PDF Downloads 524
11902 Top-Down, Middle-Out, Bottom-Up: A Design Approach to Transforming Prison

Authors: Roland F. Karthaus, Rachel S. O'Brien

Abstract:

Over the past decade, the authors have undertaken applied research aimed at enabling transformation within the prison service to improve conditions and outcomes for those living, working and visiting in prisons in the UK and the communities they serve. The research has taken place against a context of reducing resources and public discontent at increasing levels of violence, deteriorating conditions and persistently high levels of re-offending. Top-down governmental policies have been mainly ineffectual and in some cases counter-productive. The prison service is characterised by hierarchical organisation, and the research has applied design thinking at multiple levels to challenge and precipitate change: top-down, middle-out and bottom-up. The research employs three distinct but related approaches. System design (top-down): working at the national policy level to analyse the changing policy context, identifying opportunities and challenges, and engaging with Ministry of Justice commissioners and sector organisations to facilitate debate, introduce new evidence and provoke creative thinking. Place-based design (middle-out): working with individual prison establishments as pilots to illustrate and test the potential for local empowerment, creative change and improved architecture within place-specific contexts and organisational hierarchies. Everyday design (bottom-up): working with individuals in the system to explore the potential for localised, significant demonstrator changes, including collaborative design, capacity building and empowerment in skills, employment, communication, training and other activities. The research spans a series of projects, through which the methodological approach has developed responsively.
The projects include a place-based model for the re-purposing of Ministry of Justice land assets for the purposes of rehabilitation; an evidence-based guide to improving prison design for health and well-being; and a capacity-based employment, skills and self-build project as a template for future open prisons. The overarching research has enabled knowledge to be developed and disseminated through policy and academic networks. Whilst the research remains live and continuing, key findings are emerging as a basis for a new methodological approach to effecting change in the UK prison service. An interdisciplinary approach is necessary to overcome the barriers between distinct areas of the prison service. Sometimes referred to as total environments, prisons encompass entire social and physical environments which are themselves orchestrated by institutional arms of government, resulting in complex systems that cannot be meaningfully engaged through narrow disciplinary lenses. A scalar approach is necessary to connect strategic policies with individual experiences and potential, through the medium of individual prison establishments operating as discrete entities within the system. A reflexive process is necessary to connect research with action in a responsive mode, learning to adapt as the system itself changes. The role of individuals in the system, their latent knowledge and experience, and their ability to engage and become agents of change are essential. Whilst the specific characteristics of the UK prison system are unique, the approach is internationally applicable.

Keywords: architecture, design, policy, prison, system, transformation

Procedia PDF Downloads 133
11901 Sustainable Mitigation of Urban Stormwater Runoff: The Applicability of Green Infrastructure Approach in Finnish Climate

Authors: Rima Almalla

Abstract:

The purpose of this research project in geography is to evaluate the applicability of the urban green infrastructure approach in the Finnish climate. The key focus is on the operation and efficiency of green infrastructure in urban stormwater management. The green infrastructure approach refers to the employment of sufficient green cover as a modern and smart environmental solution to improve the quality of urban environments. Green infrastructure provides a wide variety of micro-scale ecosystem services, such as stormwater runoff management, regulation of extreme air temperatures and reduction of energy consumption, plus a variety of social benefits for human health and wellbeing. However, the cold climate of Finland, with seasonal ground frost, snow cover and a relatively short growing season, raises the question of whether green infrastructure works as efficiently as expected. To tackle this question, green infrastructure solutions will be studied and analysed with multiple methods: stakeholder perspectives regarding existing and planned GI solutions will be collected by web-based questionnaires, semi-structured interviews and group discussions, and analysed with both qualitative and quantitative methods. Targeted empirical field campaigns will be conducted on selected sites. A systematic literature review with a global perspective will support the analyses. The findings will be collected, compiled and analysed using geographic information systems (GIS). The findings of the research will improve our understanding of the functioning of green infrastructure in the Finnish environment in urban stormwater management, as a landscape element for citizens’ wellbeing, and in climate change mitigation and adaptation. The acquired information will be shared with stakeholders in interactive co-design workshops. As green cover is in great demand and has potential globally, the conclusions will be relevant to other cool-climate regions and may support Finnish business in the green infrastructure sector.

Keywords: climate change adaptation, climate change, green infrastructure, stormwater

Procedia PDF Downloads 165
11900 Creating Entrepreneurial Universities: The Swedish Approach of Transformation

Authors: Fawaz Saad, Hamid Alalwany

Abstract:

Sweden has succeeded in maintaining a high level of growth and development and has sustained a highly ranked position among the world’s developed countries. In this regard, Swedish universities play a vital role in supporting innovation and entrepreneurship at all levels and in developing the Swedish knowledge economy. This paper aims to draw on the experiences of two leading Swedish universities, addressing their transformation approach to creating entrepreneurial universities and fulfilling their objectives in the era of the knowledge economy. The objectives of the paper are to: (1) introduce Swedish higher education and its characteristics; (2) examine the infrastructure elements for innovation and entrepreneurship at two Swedish entrepreneurial universities; (3) address the key aspects of the support systems in the initiatives of both Chalmers and Gothenburg universities to support innovation and advance entrepreneurial practices. The paper will contribute to two discourses: (1) examining the relationship between support systems for innovation and entrepreneurship and the universities’ policies and practices; (2) offering lessons for university leaders to assist the development and implementation of effective innovation and entrepreneurship policies and practices.

Keywords: Entrepreneurial University, Chalmers University, Gothenburg University, innovation and entrepreneurship policies, entrepreneurial transformation

Procedia PDF Downloads 505
11899 Spatial Patterns and Temporal Evolution of Octopus Abundance in the Mauritanian Zone

Authors: Dedah Ahmed Babou, Nicolas Bez

Abstract:

The min/max autocorrelation factor (MAF) approach makes it possible to express spatiotemporal observations in a space formed by spatially independent factors, ordered in decreasing order of spatial autocorrelation. The original observations are thus expressed in the space formed by these factors with temporal coordinates, and each vector of temporal coefficients expresses the temporal evolution of the weight of the corresponding factor. Applying this approach has enabled us to achieve the following results: (i) define a spatially orthogonal space in which the projections of the raw data are determined; (ii) define a threshold retaining the factors with the strongest structures, in order to analyse the weight and the temporal evolution of these structures; (iii) study the correlation between the temporal evolution of the persistent spatial structures and that of the observed average abundance; (iv) propose prototypes of campaigns reflecting high vs. low abundance; (v) propose a classification of campaigns that highlights seasonal and/or temporal similarities. These results were obtained by analysing octopus yields during the scientific campaigns of the oceanographic vessel Al Awam over the period 1989-2017 in the Mauritanian exclusive economic zone.
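For readers unfamiliar with MAF, the standard construction (whiten the data, then eigendecompose the covariance of the increments) can be sketched compactly in NumPy. For simplicity the lag below is temporal and the data synthetic, whereas the paper applies spatial shifts to survey maps; this is an illustration of the technique, not the paper's implementation.

```python
import numpy as np

def maf(Z):
    """Min/max autocorrelation factors of a multivariate series Z (T x p):
    solve the generalized eigenproblem Cov(dZ) a = lambda Cov(Z) a via
    whitening.  Small lambda corresponds to high one-lag autocorrelation,
    so columns of the result are ordered from smoothest to noisiest."""
    Zc = Z - Z.mean(axis=0)
    C = np.cov(Zc, rowvar=False)
    w, V = np.linalg.eigh(C)
    W = V @ np.diag(1.0 / np.sqrt(w)) @ V.T      # whitening transform
    lam, U = np.linalg.eigh(np.cov(np.diff(Zc @ W, axis=0), rowvar=False))
    A = W @ U                                    # columns: MAF loadings
    return Zc @ A, lam

rng = np.random.default_rng(0)
T = 400
smooth = np.cumsum(rng.standard_normal(T))       # strongly autocorrelated
n1, n2 = rng.standard_normal(T), rng.standard_normal(T)  # white noise
Z = np.column_stack([smooth + 0.1 * n1, n1, n2])
factors, lam = maf(Z)

def lag1(x):
    """Lag-1 autocorrelation of a series."""
    return float(np.corrcoef(x[:-1], x[1:])[0, 1])
```

The first factor recovers the persistent (smooth) structure and the last is essentially white, mirroring how the paper separates persistent spatial structures of octopus abundance from noise.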

Keywords: spatiotemporal, autocorrelation, kriging, variogram, Octopus vulgaris

Procedia PDF Downloads 145
11898 3D Printing for Maritime Cultural Heritage: A Design for All Approach to Public Interpretation

Authors: Anne Eugenia Wright

Abstract:

This study examines issues of accessibility to maritime cultural heritage. Using the Pillar Dollar Wreck in Biscayne National Park, Florida, this study presents an approach to public outreach based on the concept of Design for All. Design for All advocates creating products that are accessible and functional for all users, including those with visual, hearing, learning, mobility, or economic impairments. As part of this study, a small exhibit was created that uses 3D products to bring maritime cultural heritage to the public; it was presented at East Carolina University’s Joyner Library. Additionally, this study presents a methodology for 3D printing scaled photogrammetry models of archaeological sites in full color. This methodology can be used to present a realistic depiction of underwater archaeological sites to those who are unable to access them in the water, as well as sites that are inaccessible to the public due to conditions such as visibility, depth, or protected status. This study presents a practical use for 3D photogrammetry models, as well as an accessibility strategy to expand the outreach potential of maritime archaeology.

Keywords: underwater archaeology, 3D printing, photogrammetry, Design for All

Procedia PDF Downloads 138
11897 A New Approach of Preprocessing with SVM Optimization Based on PSO for Bearing Fault Diagnosis

Authors: Tawfik Thelaidjia, Salah Chenikher

Abstract:

Bearing fault diagnosis has attracted significant attention over the past few decades. It consists of two major parts: vibration signal feature extraction and condition classification based on the extracted features. In this paper, feature extraction from faulty bearing vibration signals is performed by combining the signal’s kurtosis with features obtained by preprocessing the vibration signal samples using the db2 discrete wavelet transform at the fifth level of decomposition. In this way, a 7-dimensional feature vector of the vibration signal is obtained. After feature extraction, a support vector machine (SVM) is applied to automate the fault diagnosis procedure. To improve the classification accuracy for bearing fault prediction, particle swarm optimization (PSO) is employed to simultaneously optimize the SVM kernel function parameter and the penalty parameter. The results have shown the feasibility and effectiveness of the proposed approach.
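The PSO component can be sketched compactly. The code below is a minimal particle swarm minimising a stand-in objective over two parameters playing the role of the SVM penalty parameter C and the kernel parameter; in the paper the objective would be the cross-validated classification error. The constants (inertia, acceleration coefficients, bounds) are conventional textbook choices, not the paper's.

```python
import random

def pso(objective, bounds, n_particles=20, iters=60, seed=0):
    """Minimal particle swarm optimization: each particle tracks its own
    best position (pbest) and is pulled toward the swarm best (gbest)."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    gbest = min(zip(pbest_val, pbest))[1][:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest_val[i], pbest[i] = val, pos[i][:]
                if val < objective(gbest):
                    gbest = pos[i][:]
    return gbest, objective(gbest)

# Stand-in objective: a bowl with its minimum at C=10, gamma=0.5.  In a
# real run this would be the SVM's cross-validation error over (C, gamma).
obj = lambda p: (p[0] - 10.0) ** 2 + (p[1] - 0.5) ** 2
best, best_val = pso(obj, bounds=[(0.1, 100.0), (1e-3, 1.0)])
```

Swapping the stand-in objective for an SVM cross-validation loop reproduces the joint (C, kernel-parameter) search the abstract describes.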

Keywords: condition monitoring, discrete wavelet transform, fault diagnosis, kurtosis, machine learning, particle swarm optimization, roller bearing, rotating machines, support vector machine, vibration measurement

Procedia PDF Downloads 436
11896 The Reduction of CO2 Emissions Level in Malaysian Transportation Sector: An Optimization Approach

Authors: Siti Indati Mustapa, Hussain Ali Bekhet

Abstract:

The transportation sector represents more than 40% of total energy consumption in Malaysia. This sector is a major user of fossil fuels and is increasingly being highlighted as the sector that contributes least to CO2 emission reduction targets. Considering this fact, this paper investigates the problem of reducing CO2 emissions using a linear programming approach. An optimization model is presented to investigate the optimal level of CO2 emission reduction in the road transport sector. Five scenarios are used to demonstrate the emission reduction model: (1) utilising alternative fuels, (2) improving fuel efficiency, (3) removing the fuel subsidy, (4) reducing travel demand, and (5) an optimal combined scenario. This study finds that fuel balancing alone can reduce CO2 emissions by up to 3%. Beyond 3%, more stringent measures that include fuel switching, fuel efficiency improvement, travel demand reduction and combinations of mitigation measures have to be employed. The model reveals that CO2 emissions from road transportation can be reduced by 38.3% in the optimal scenario.
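A miniature version of such a linear programme can be sketched in pure Python. The toy model below chooses petrol and diesel volumes minimising CO2 subject to an energy demand and supply caps, solving the two-variable LP by enumerating vertices of the feasible polygon. The emission factors are typical published per-litre values; every other number is invented for illustration and is not the paper's data.

```python
from itertools import combinations

# kg CO2 per litre: typical factors for petrol and diesel (assumed)
EMISSIONS = (2.31, 2.68)

# Constraints in the form a*xp + b*xd >= c (xp = petrol, xd = diesel)
constraints = [
    ( 1.0,  1.1, 100.0),   # energy demand (diesel slightly denser)
    (-1.0,  0.0, -80.0),   # petrol supply cap: xp <= 80
    ( 0.0, -1.0, -60.0),   # diesel supply cap: xd <= 60
    ( 1.0,  0.0,   0.0),   # xp >= 0
    ( 0.0,  1.0,   0.0),   # xd >= 0
]

def feasible(x, eps=1e-9):
    return all(a * x[0] + b * x[1] >= c - eps for a, b, c in constraints)

# A linear programme attains its optimum at a vertex of the feasible
# region, so intersect every pair of constraint boundaries (Cramer's
# rule) and keep the feasible intersection points.
vertices = []
for (a1, b1, c1), (a2, b2, c2) in combinations(constraints, 2):
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        continue               # parallel boundaries, no intersection
    x = ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
    if feasible(x):
        vertices.append(x)

best = min(vertices, key=lambda x: EMISSIONS[0] * x[0] + EMISSIONS[1] * x[1])
best_co2 = EMISSIONS[0] * best[0] + EMISSIONS[1] * best[1]
```

The optimum fills the lower-emission-per-energy fuel first, which is the "fuel balancing" lever the abstract credits with the first few percent of reduction; the paper's full model adds fuel switching, efficiency and demand measures as further decision variables.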

Keywords: CO2 emission, fuel consumption, optimization, linear programming, transportation sector, Malaysia

Procedia PDF Downloads 421