Search results for: disruptive approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13997

11927 The Reality of Engineering Education in the Kingdom of Saudi Arabia and Its Suitability to the Requirements of the Labor Market

Authors: Hamad Albadr

Abstract:

With the shift that has occurred in the orientation of universities, from the cognitive responsibility of maintaining the culture of the community to the responsibility of forming graduates to work according to the needs of community development, universities in today's world represent the prime motivator of the wheel of development in the community, finding appropriate solutions to the problems it faces and adapting to the demands of a changing environment. This paper reviews the reality of engineering education in the Kingdom of Saudi Arabia and its suitability to the requirements of the labor market. The university is examined as an educational administrative system using the System Analysis Approach, one of the methods of modern management for analyzing the performance of organizations and institutions and for administrative and quality assessment. According to this approach, the system is treated as a set of subsystems divided into its main components: inputs, processes, outputs, and the surrounding environment. The descriptive and analytical research method is also used to gather information and data and to analyze the answers of the study population, which consists of a random sample of the beneficiaries of the services that the universities provide: about 500 professionals involved in employment in the business sector.

Keywords: universities in Saudi Arabia, engineering education, labor market, administrative, quality assessment

Procedia PDF Downloads 341
11926 About Multi-Resolution Techniques for Large Eddy Simulation of Reactive Multi-Phase Flows

Authors: Giacomo Rossi, Bernardo Favini, Eugenio Giacomazzi, Franca Rita Picchia, Nunzio Maria Salvatore Arcidiacono

Abstract:

A numerical technique for mesh refinement in the HeaRT (Heat Release and Transfer) numerical code is presented. In the CFD framework, the Large Eddy Simulation (LES) approach is gaining importance as a tool for simulating turbulent combustion processes, although it has a high computational cost due to the complexity of turbulence modeling and the large number of grid points necessary to obtain a good numerical solution. In particular, when a numerical simulation of a large domain is performed with a structured grid, the number of grid points can increase so much that the simulation becomes impossible: this problem can be overcome with a mesh refinement technique. The mesh refinement technique developed for the HeaRT numerical code (a staggered finite-difference code) is based on a high-order reconstruction of the variables at the grid interfaces by means of a least-squares quasi-ENO interpolation. The numerical code is written in modern Fortran (2003 standard or newer) and is parallelized using domain decomposition and the Message Passing Interface (MPI) standard.
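
As a rough illustration of the kind of interface reconstruction involved (not the HeaRT implementation, whose quasi-ENO weighting is not reproduced here), the Python sketch below fits a least-squares polynomial to neighbouring cell values and evaluates it at a refined-grid interface; the stencil, polynomial degree and data are placeholder choices.

```python
import numpy as np

def interface_value(x_cells, u_cells, x_face, degree=2):
    """Least-squares polynomial reconstruction of a cell-centred variable
    at a grid interface (illustrative only; no quasi-ENO weighting)."""
    # Fit a low-order polynomial through the local stencil in a
    # least-squares sense (stencil wider than degree + 1 points).
    coeffs = np.polyfit(x_cells, u_cells, degree)
    return np.polyval(coeffs, x_face)

# Coarse-cell centres and values around a fine-grid interface at x = 0.25
x_cells = np.array([-0.5, 0.0, 0.5, 1.0])
u_cells = np.sin(np.pi * x_cells)
print(interface_value(x_cells, u_cells, x_face=0.25))
```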

Keywords: LES, multi-resolution, ENO, fortran

Procedia PDF Downloads 366
11925 Ensemble of Deep CNN Architecture for Classifying the Source and Quality of Teff Cereal

Authors: Belayneh Matebie, Michael Melese

Abstract:

The study focuses on addressing the challenges in classifying and ensuring the quality of Eragrostis teff, a small, round grain that is the smallest of the cereal grains. Employing a traditional classification method is challenging because of its small size and the similarity of its environmental characteristics. To overcome this, this study employs a machine learning approach to develop a source and quality classification system for Teff cereal. Data are collected from various production areas in the Amhara region, considering two quality types of cereal (high and low quality) across eight classes. A total of 5,920 images are collected, with 740 images for each class. Image enhancement techniques, including scaling, data augmentation, histogram equalization, and noise removal, are applied to preprocess the data. A Convolutional Neural Network (CNN) is then used to extract relevant features and reduce dimensionality. The dataset is split into 80% for training and 20% for testing. Different classifiers, including FVGG16, FINCV3, QSCTC, EMQSCTC, SVM, and RF, are employed for classification, achieving accuracy rates ranging from 86.91% to 97.72%. The ensemble of FVGG16, FINCV3, and QSCTC using the max-voting approach outperforms the individual algorithms.
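
The max-voting ensemble step can be illustrated with a minimal sketch: several already-trained classifiers each predict a class label and the most frequent label wins. The per-model predictions and class names below are placeholders, not outputs of the paper's FVGG16, FINCV3 and QSCTC models.

```python
import numpy as np
from collections import Counter

def max_vote(predictions):
    """Return the majority label among per-model predictions for one sample."""
    return Counter(predictions).most_common(1)[0][0]

# Hypothetical per-model predictions for 4 samples (stand-ins for
# FVGG16, FINCV3 and QSCTC outputs).
preds_model_a = np.array(["high_q_A", "low_q_B", "high_q_A", "low_q_C"])
preds_model_b = np.array(["high_q_A", "low_q_B", "low_q_B", "low_q_C"])
preds_model_c = np.array(["high_q_A", "high_q_A", "high_q_A", "low_q_C"])

ensemble = [max_vote(p) for p in zip(preds_model_a, preds_model_b, preds_model_c)]
print(ensemble)  # majority label per sample
```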

Keywords: Teff, ensemble learning, max-voting, CNN, SVM, RF

Procedia PDF Downloads 54
11924 Modelling Causal Effects from Complex Longitudinal Data via Point Effects of Treatments

Authors: Xiaoqin Wang, Li Yin

Abstract:

Background and purpose: In many practices, one estimates causal effects arising from a complex stochastic process, where a sequence of treatments is assigned to influence a certain outcome of interest and time-dependent covariates exist between treatments. When covariates are plentiful and/or continuous, statistical modeling is needed to reduce the huge dimensionality of the problem and allow for the estimation of causal effects. Recently, Wang and Yin (Annals of Statistics, 2020) derived a new general formula, which expresses these causal effects in terms of the point effects of treatments in single-point causal inference. As a result, it is possible to conduct the modeling via point effects. The purpose of this work is to study the modeling of these causal effects via point effects. Challenges and solutions: The time-dependent covariates often receive influences from earlier treatments as well as exerting influences on subsequent treatments. Consequently, the standard parameters, i.e., the means of the outcome given all treatments and covariates, are essentially all different (null paradox). Furthermore, the dimension of the parameters is huge (curse of dimensionality). Therefore, it can be difficult to conduct the modeling in terms of standard parameters. Instead of standard parameters, we use point effects of treatments to develop a likelihood-based parametric approach to the modeling of these causal effects, and we are able to model the causal effects of a sequence of treatments by modeling a small number of point effects of individual treatments. Achievements: We are able to conduct the modeling of the causal effects from a sequence of treatments in the familiar framework of single-point causal inference. The simulation shows that our method achieves not only an unbiased estimate of the causal effect but also the nominal level of type I error and a low level of type II error in hypothesis testing. We have applied this method to a longitudinal study of COVID-19 mortality among Scandinavian countries and found that the Swedish approach performed far worse than the other countries' approaches for COVID-19 mortality, and that the poor performance was largely due to its early measures during the initial period of the pandemic.

Keywords: causal effect, point effect, statistical modelling, sequential causal inference

Procedia PDF Downloads 205
11923 Artificial Intelligence Impact on the Australian Government Public Sector

Authors: Jessica Ho

Abstract:

AI has helped governments, businesses and industries transform the way they do things. AI is used to automate tasks to improve decision-making and efficiency. AI is embedded in sensors and used in automation to help save time and eliminate human errors in repetitive tasks. Today, we see AI growing through the collection of vast amounts of data to forecast with greater accuracy, inform decision-making, adapt to changing market conditions and offer more personalised services based on consumer habits and preferences. Governments around the world share the opportunity to leverage these disruptive technologies to improve productivity while reducing costs. In addition, these intelligent solutions can also help streamline government processes to deliver more seamless and intuitive user experiences for employees and citizens. This is a critical challenge for the NSW Government, as we are unable to determine the risk brought by the unprecedented pace of adoption of AI solutions in government. Government agencies must ensure that their use of AI complies with relevant laws and regulatory requirements, including those related to data privacy and security. Furthermore, there will always be ethical concerns surrounding the use of AI, such as the potential for bias, intellectual property rights and its impact on job security. Within NSW's public sector, agencies are already testing AI for crowd control, infrastructure management, fraud compliance, public safety, transport, and police surveillance. Citizens are also attracted to the ease of use and accessibility of AI solutions that do not require specialised technical skills. This increased accessibility also comes with a higher risk and exposure to the health and safety of citizens. On the other side, public agencies struggle to keep up with this pace while minimising risks, but the low entry cost and open-source nature of generative AI have led to a rapid, organic increase in the development of AI-powered apps: "There is an AI for That" in government. Other challenges include the fact that there appear to be no legislative provisions that expressly authorise the NSW Government to use AI to make decisions. On the global stage, there are too many actors in the regulatory space, and a sovereign response is needed to minimise multiplicity and regulatory burden. Therefore, traditional corporate risk and governance frameworks and regulation and legislation frameworks will need to be evaluated for AI's unique challenges due to their rapidly evolving nature, ethical considerations, and heightened regulatory scrutiny impacting the safety of consumers and increasing risks for government. Creating an effective, efficient governance regime for the NSW Government, adapted to the range of different approaches to the application of AI, is not a mere matter of overcoming technical challenges. Technologies have a wide range of social effects on our surroundings and behaviours. There is compelling evidence to show that Australia's sustained social and economic advancement depends on AI's ability to spur economic growth, boost productivity, and address a wide range of societal and political issues. AI may also inflict significant damage. If such harm is not addressed, the public's confidence in this kind of innovation will be weakened. This paper suggests several AI regulatory approaches for consideration that are forward-looking and agile while simultaneously fostering innovation and human rights. The anticipated outcome is to ensure that the NSW Government matches the rising levels of innovation in AI technologies with appropriate and balanced innovation in AI governance.

Keywords: artificial intelligence, machine learning, rules, governance, government

Procedia PDF Downloads 70
11922 Integrated Approach of Quality Function Deployment, Sensitivity Analysis and Multi-Objective Linear Programming for Business and Supply Chain Programs Selection

Authors: T. T. Tham

Abstract:

The aim of this study is to propose an integrated approach to determining the most suitable programs, based on Quality Function Deployment (QFD), Sensitivity Analysis (SA) and a Multi-Objective Linear Programming (MOLP) model. Firstly, QFD is used to determine business requirements and transform them into business and supply chain programs. From the QFD, technical scores of all programs are obtained. All programs are then evaluated through five criteria (productivity, quality, cost, technical score, and feasibility). Sets of weights for these criteria are built using Sensitivity Analysis. The Multi-Objective Linear Programming model is applied to select suitable programs according to multiple conflicting objectives under a budget constraint. A case study from the Sai Gon-Mien Tay Beer Company is given to illustrate the proposed methodology. The outcome of the study provides a comprehensive picture for companies to select suitable programs and obtain the optimal solution according to their preferences.
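
A minimal sketch of the selection step, with entirely hypothetical scores, costs and weights (not the company's data): the conflicting criteria are folded into a weighted sum for one weight set from the sensitivity analysis and maximised under the budget constraint with a relaxed 0-1 linear program.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data for 4 candidate programs (columns: productivity,
# quality, technical score from QFD, feasibility), plus costs and weights.
scores = np.array([[0.7, 0.9, 0.6, 0.8],
                   [0.8, 0.6, 0.9, 0.7],
                   [0.5, 0.8, 0.7, 0.9],
                   [0.9, 0.7, 0.8, 0.6]])
weights = np.array([0.3, 0.3, 0.2, 0.2])   # one weight set from the SA step
costs = np.array([120.0, 90.0, 60.0, 150.0])
budget = 250.0

value = scores @ weights                    # aggregated value per program
res = linprog(c=-value,                     # maximise => minimise the negative
              A_ub=costs.reshape(1, -1), b_ub=[budget],
              bounds=[(0, 1)] * len(value), method="highs")
print(np.round(res.x, 2))                   # relaxed 0-1 selection levels
```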

Keywords: business program, multi-objective linear programming model, quality function deployment, sensitivity analysis, supply chain management

Procedia PDF Downloads 123
11921 A TFETI Domain Decomposition Solver for von Mises Elastoplasticity Model with Combination of Linear Isotropic-Kinematic Hardening

Authors: Martin Cermak, Stanislav Sysala

Abstract:

In this paper, we present an efficient parallel implementation of elastoplastic problems based on the TFETI (Total Finite Element Tearing and Interconnecting) domain decomposition method. This approach allows us to use a parallel solution, compute this nonlinear problem on supercomputers, decrease the solution time, and handle problems with millions of DOFs. In our approach, we consider an associated elastoplastic model with the von Mises plastic criterion and a combination of linear isotropic and kinematic hardening laws. This model is discretized by the implicit Euler method in time and by the finite element method in space. We consider a system of nonlinear equations with a strongly semismooth and strongly monotone operator. The semismooth Newton method is applied to solve this nonlinear system. The corresponding linearized problems arising in the Newton iterations are solved in parallel by the above-mentioned TFETI method. The implementation of this problem is realized in our in-house MatSol package developed in MATLAB.

Keywords: isotropic-kinematic hardening, TFETI, domain decomposition, parallel solution

Procedia PDF Downloads 420
11920 Coupling Static Multiple Light Scattering Technique With the Hansen Approach to Optimize Dispersibility and Stability of Particle Dispersions

Authors: Guillaume Lemahieu, Matthias Sentis, Giovanni Brambilla, Gérard Meunier

Abstract:

Static Multiple Light Scattering (SMLS) has been shown to be a straightforward technique for the characterization of colloidal dispersions without dilution, as multiply scattered light in backscattered and transmitted modes is directly related to the concentration and size of the scatterers present in the sample. In this view, the use of SMLS for stability measurement of various dispersion types has already been widely described in the literature. Indeed, starting from a homogeneous dispersion, the variation of backscattered or transmitted light can be attributed to destabilization phenomena, such as migration (sedimentation, creaming) or particle size variation (flocculation, aggregation). With a view to investigating the dispersibility of colloidal suspensions further, an experimental set-up for an "at-line" SMLS experiment has been developed to understand the impact of formulation parameters on particle size and dispersibility. The SMLS experiment is performed with a high acquisition rate (up to 10 measurements per second), without dilution, and under direct agitation. Using such an experimental device, SMLS detection can be combined with the Hansen approach to optimize the dispersing and stabilizing properties of TiO₂ particles. It appears that the dispersibility and stability spheres generated are clearly separated, arguing that lower stability is not necessarily a consequence of poor dispersibility. Beyond this clarification, this combined SMLS-Hansen approach is a major step toward the optimization of the dispersibility and stability of colloidal formulations by finding solvents having the best compromise between dispersing and stabilizing properties. Such a study can be used to find better dispersion media, greener and cheaper solvents to optimize particle suspensions, reduce the content of costly stabilizing additives, or satisfy evolving product regulatory requirements in various industrial fields using suspensions (paints & inks, coatings, cosmetics, energy).
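
For readers unfamiliar with the Hansen construction, the sketch below computes the standard Hansen distance Ra between a fitted sphere centre and a candidate solvent, and the relative energy difference RED = Ra/R0; the numerical values are illustrative placeholders, not the TiO₂ results of this work.

```python
import math

def hansen_distance(p1, p2):
    """Standard Hansen distance Ra between two (dD, dP, dH) parameter sets."""
    dD1, dP1, dH1 = p1
    dD2, dP2, dH2 = p2
    return math.sqrt(4 * (dD1 - dD2) ** 2 + (dP1 - dP2) ** 2 + (dH1 - dH2) ** 2)

# Hypothetical fitted centre of a dispersibility sphere and its radius R0
particle_centre, R0 = (17.0, 8.0, 9.0), 6.0
solvent = (15.8, 8.8, 19.4)            # placeholder solvent parameters

Ra = hansen_distance(particle_centre, solvent)
print(Ra, "RED =", Ra / R0)            # RED < 1 means inside the sphere
```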

Keywords: dispersibility, stability, Hansen parameters, particles, solvents

Procedia PDF Downloads 110
11919 A Dynamical Approach for Relating Energy Consumption to Hybrid Inventory Level in the Supply Chain

Authors: Benga Ebouele, Thomas Tengen

Abstract:

Due to long lead times, work-in-process (WIP) inventory can build up within the supply chain of most manufacturing systems. This implies that there are fewer finished goods on hand and more goods still in process, because the work remains in the factory too long and cannot be sold to customers. The supply chain of such a manufacturing system is then considered inefficient, as it takes too much time to produce the finished goods. Time consumed in each operation of the supply chain has an associated energy cost. Such phenomena can be harmful for a hybrid inventory system, because a lot of space may be needed to store these semi-finished goods and one is not sure about the final energy cost of producing, holding and delivering the goods to customers. A principle that reduces the waste of energy within the supply chain of manufacturing firms should therefore be available to all inventory managers in pursuit of profitability. Decision making by inventory managers in this condition is a modeling process, whereby a dynamical approach is used to depict, examine, specify and even operationalize the relationship between energy consumption and hybrid inventory level. The relationship between energy consumption and inventory level is established, which indicates a poor level of control and hence a potential for energy savings.
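
A purely illustrative discrete-time sketch of the kind of relation studied (the abstract does not give the actual model, so the production, demand and energy coefficients below are made up): inventory evolves with production and demand, and energy accumulates as processing energy per unit produced plus holding energy per unit stored per period.

```python
# Illustrative sketch only; profiles and coefficients are placeholders,
# not the paper's dynamical model.
production = [60, 60, 60, 60]     # units produced per period
demand     = [40, 55, 70, 50]     # units sold per period
e_process, e_hold = 2.0, 0.3      # kWh per unit produced / held per period

inventory, energy = 0.0, 0.0
for p, d in zip(production, demand):
    inventory = max(inventory + p - d, 0.0)       # hybrid (WIP + finished) level
    energy += e_process * p + e_hold * inventory  # energy tied to inventory level
    print(f"inventory={inventory:5.1f}  cumulative energy={energy:7.1f} kWh")
```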

Keywords: dynamic modelling, energy used, hybrid inventory, supply chain

Procedia PDF Downloads 268
11918 Active Contours for Image Segmentation Based on Complex Domain Approach

Authors: Sajid Hussain

Abstract:

A complex domain approach for image segmentation based on an active contour has been designed, which deforms step by step to partition an image into numerous expedient regions. A novel region-based trigonometric complex pressure force function is proposed, which propagates around the region of interest using image forces. The signed trigonometric force function controls the propagation of the active contour, and the active contour stops accurately on the exact edges of the object. The proposed model makes the level set function binary and uses a Gaussian smoothing kernel to regularize it and avoid the re-initialization procedure. The working principle of the proposed model is as follows: the real image data are transformed into complex data by multiplying the image data by iota (i), and the average of i times the horizontal and vertical components of the gradient of the image data is inserted into the proposed model to capture the complex gradient of the image data. A simple finite difference mathematical technique has been used to implement the proposed model. The efficiency and robustness of the proposed model have been verified and compared with other state-of-the-art models.

Keywords: image segmentation, active contour, level set, Mumford and Shah model

Procedia PDF Downloads 114
11917 High Performance Field Programmable Gate Array-Based Stochastic Low-Density Parity-Check Decoder Design for IEEE 802.3an Standard

Authors: Ghania Zerari, Abderrezak Guessoum, Rachid Beguenane

Abstract:

This paper introduces a high-performance architecture for a fully parallel stochastic Low-Density Parity-Check (LDPC) decoder based on a field programmable gate array (FPGA). The new approach is designed to decrease the decoding latency and to reduce the FPGA logic utilisation. To accomplish the targeted logic utilisation reduction, the routing of the proposed sub-variable node (VN) internal memory is designed to utilize one slice of distributed RAM. Furthermore, a VN initialization, using the channel input probability, is performed to enhance decoder convergence, without extra resources and without integrating the output saturated counters. The Xilinx FPGA implementation of the IEEE 802.3an standard LDPC code shows that the proposed decoding approach attains high performance along with a reduction in FPGA logic utilisation.

Keywords: low-density parity-check (LDPC) decoder, stochastic decoding, field programmable gate array (FPGA), IEEE 802.3an standard

Procedia PDF Downloads 297
11916 Use of Multistage Transition Regression Models for Credit Card Income Prediction

Authors: Denys Osipenko, Jonathan Crook

Abstract:

Because of the variety of card holders' behaviour types and income sources, each consumer account can be transferred to a variety of states. Each consumer account can be inactive, transactor, revolver, delinquent or defaulted, and requires an individual model for income prediction. The estimation of transition probabilities between statuses at the account level helps to avoid the memorylessness of the Markov chain approach. This paper investigates transition probability estimation approaches for credit card income prediction at the account level. The key question of the empirical research is which approach gives more accurate results: multinomial logistic regression or multistage conditional logistic regression with a binary target. Both models have shown moderate predictive power. Prediction accuracy for conditional logistic regression depends on the order of stages for the conditional binary logistic regression. On the other hand, multinomial logistic regression is easier to use and gives integrated estimates for all states without prioritization. Thus, further investigations can be concentrated on alternative modeling approaches such as discrete choice models.
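
A minimal sketch of the multinomial variant on synthetic data (the features and states below are placeholders, not the paper's dataset): a multinomial logistic regression estimates, for each account record, the probability of moving to each next state.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

states = ["inactive", "transactor", "revolver", "delinquent", "defaulted"]
rng = np.random.default_rng(0)

# Synthetic account-month records: [balance/limit, payments/balance, months on book]
X = rng.random((500, 3))
y = rng.choice(states, size=500)      # placeholder next-month states

# With the default lbfgs solver, the multi-class fit is multinomial,
# giving one set of transition probabilities per record.
model = LogisticRegression(max_iter=1000).fit(X, y)
print(dict(zip(model.classes_, np.round(model.predict_proba(X[:1])[0], 3))))
```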

Keywords: multinomial regression, conditional logistic regression, credit account state, transition probability

Procedia PDF Downloads 487
11915 Mid-Temperature Methane-Based Chemical Looping Reforming for Hydrogen Production via Iron-Based Oxygen Carrier Particles

Authors: Yang Li, Mingkai Liu, Qiong Rao, Zhongrui Gai, Ying Pan, Hongguang Jin

Abstract:

Hydrogen is an ideal and promising energy carrier due to its high energy efficiency and low pollution. An alternative and promising approach to hydrogen generation is the chemical looping steam reforming of methane (CL-SRM) over iron-based oxygen carriers. However, the process faces challenges such as high reaction temperature (>850 ℃) and low methane conversion. We demonstrate that Ni-mixed Fe-based oxygen carrier particles significantly improve the methane conversion and hydrogen production rate in the range of 450-600 ℃ under atmospheric pressure. The effect of different Ni-based particle mass ratios on the reactivity of the oxygen carrier particles has been determined in a continuous unit. More than 85% methane conversion has been achieved at 600 ℃, and hydrogen can be produced in both the reduction and oxidation steps. Moreover, the iron-based oxygen carrier particles exhibited good cyclic performance during 150 consecutive redox cycles at 600 ℃. The mid-temperature iron-based oxygen carrier particles, integrated with a moving-bed chemical looping system, might provide a powerful approach toward more efficient and scalable hydrogen production.

Keywords: chemical looping, hydrogen production, mid-temperature, oxygen carrier particles

Procedia PDF Downloads 143
11914 Enhancing the Effectiveness of Air Defense Systems through Simulation Analysis

Authors: F. Felipe

Abstract:

Air Defense Systems contain high-value assets that are expected to fulfill their mission for several years - in many cases, even decades - while operating in a fast-changing, technology-driven environment. Thus, it is paramount that decision-makers can assess how effective an Air Defense System is in the face of new developing threats, as well as to identify the bottlenecks that could jeopardize the security of the airspace of a country. Given the broad extent of activities and the great variety of assets necessary to achieve the strategic objectives, a systems approach was taken in order to delineate the core requirements and the physical architecture of an Air Defense System. Then, value-focused thinking helped in the definition of the measures of effectiveness. Furthermore, analytical methods were applied to create a formal structure that preliminarily assesses such measures. To validate the proposed methodology, a powerful simulation was also used to determine the measures of effectiveness, now in more complex environments that incorporate both uncertainty and multiple interactions of the entities. The results regarding the validity of this methodology suggest that the approach can support decisions aimed at enhancing the capabilities of Air Defense Systems. In conclusion, this paper sheds some light on how consolidated approaches of Systems Engineering and Operations Research can be used as valid techniques for solving problems regarding a complex and yet vital matter.

Keywords: air defense, effectiveness, system, simulation, decision-support

Procedia PDF Downloads 156
11913 System Identification of Building Structures with Continuous Modeling

Authors: Ruichong Zhang, Fadi Sawaged, Lotfi Gargab

Abstract:

This paper introduces a wave-based approach for system identification of high-rise building structures with a pair of seismic recordings, which can be used to evaluate structural integrity and detect damage in post-earthquake structural condition assessment. The fundamental of the approach is based on wave features of generalized impulse and frequency response functions (GIRF and GFRF), i.e., wave responses at one structural location to an impulsive motion at another reference location in the time and frequency domains, respectively. With a pair of seismic recordings at the two locations, GFRF is obtainable as the Fourier spectral ratio of the two recordings, and GIRF is then found with the inverse Fourier transformation of GFRF. With an appropriate continuous model for the structure, a closed-form solution of GFRF, and subsequently GIRF, can also be found in terms of wave transmission and reflection coefficients, which are related to the structural physical properties above the impulse location. Matching the two sets of GFRF and/or GIRF from the recordings and the model helps identify structural parameters such as wave velocity or shear modulus. For illustration, this study examines the ten-story Millikan Library in Pasadena, California, with recordings of the Yorba Linda earthquake of September 3, 2002. The building is modelled as piecewise continuous layers, with which GFRF is derived as a function of such building parameters as impedance, cross-sectional area, and damping. GIRF can then be found in closed form for some special cases and numerically in general. Not only does this study reveal the influence of building parameters on the wave features of GIRF and GFRF, it also shows some system-identification results, which are consistent with other vibration- and wave-based results. Finally, this paper discusses the effectiveness of the proposed model in system identification.
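
The core wave-feature computation described above can be sketched directly: GFRF as the Fourier spectral ratio of the response recording to the reference recording, and GIRF as its inverse Fourier transform. The synthetic signals and the small water-level regularisation term are assumptions for illustration, not the Millikan Library data or the authors' processing.

```python
import numpy as np

fs = 100.0                                   # sampling rate (Hz), placeholder
t = np.arange(0, 20, 1 / fs)
ref = np.random.default_rng(1).standard_normal(t.size)   # recording at reference level
resp = np.roll(ref, 8) * 0.8                 # toy "response" at another level

REF, RESP = np.fft.rfft(ref), np.fft.rfft(resp)
eps = 1e-3 * np.max(np.abs(REF))             # water-level regularisation (assumption)
gfrf = RESP * np.conj(REF) / (np.abs(REF) ** 2 + eps ** 2)   # spectral ratio
girf = np.fft.irfft(gfrf, n=t.size)          # generalized impulse response

print(np.argmax(np.abs(girf)) / fs, "s  <- apparent wave travel time")
```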

Keywords: wave-based approach, seismic responses of buildings, wave propagation in structures, construction

Procedia PDF Downloads 233
11912 Determining the Width and Depths of Cut in Milling on the Basis of a Multi-Dexel Model

Authors: Jens Friedrich, Matthias A. Gebele, Armin Lechler, Alexander Verl

Abstract:

Chatter vibrations and process instabilities are the most important factors limiting the productivity of the milling process. Chatter can lead to damage of the tool, the part or the machine tool. Therefore, the estimation and prediction of process stability is very important. The process stability depends on the spindle speed, the depth of cut and the width of cut. In milling, the process conditions are defined in the NC program. While the spindle speed is directly coded in the NC program, the depth and width of cut are unknown. This paper presents a new simulation-based approach for the prediction of the depth and width of cut of a milling process. The prediction is based on a material removal simulation with an analytically represented tool shape and a multi-dexel approach for the workpiece. The new calculation method allows the direct estimation of the depth and width of cut, which are the influencing parameters of process stability, instead of the removed volume as existing approaches do. This knowledge can be used to predict the stability of new, unknown parts. Moreover, with an additional vibration sensor, the stability lobe diagram of a milling process can be estimated and improved based on the estimated depth and width of cut.
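
A much-simplified, one-directional dexel sketch (placeholder geometry, not the paper's simulation): the workpiece is a row of dexel heights, one tool pass is applied, the maximum material removed per dexel gives the depth of cut, and the lateral extent of engaged dexels gives the width of cut.

```python
import numpy as np

# Workpiece as a 1D row of dexels (top heights in mm); flat stock at 10 mm.
dexel_x = np.arange(0.0, 20.0, 0.5)
heights = np.full_like(dexel_x, 10.0)

# Flat end mill of radius 4 mm, centre at x = 8 mm, cutting down to z = 8.5 mm.
tool_x, tool_r, tool_z = 8.0, 4.0, 8.5           # placeholder tool pass
engaged = np.abs(dexel_x - tool_x) <= tool_r      # dexels under the tool
removal = np.clip(heights - tool_z, 0.0, None) * engaged

depth_of_cut = removal.max()                      # max material removed (mm)
width_of_cut = engaged.sum() * 0.5                # engaged dexels * spacing (mm)
heights -= removal                                # update workpiece after the pass
print(depth_of_cut, width_of_cut)
```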

Keywords: dexel, process stability, material removal, milling

Procedia PDF Downloads 525
11911 Top-Down, Middle-Out, Bottom-Up: A Design Approach to Transforming Prison

Authors: Roland F. Karthaus, Rachel S. O'Brien

Abstract:

Over the past decade, the authors have undertaken applied research aimed at enabling transformation within the prison service to improve conditions and outcomes for those living, working and visiting in prisons in the UK and the communities they serve. The research has taken place against a context of reducing resources and public discontent at increasing levels of violence, deteriorating conditions and persistently high levels of re-offending. Top-down governmental policies have been mainly ineffectual and in some cases counter-productive. The prison service is characterised by hierarchical organisation, and the research has applied design thinking at multiple levels to challenge and precipitate change: top-down, middle-out and bottom-up. The research employs three distinct but related approaches. System design (top-down): working at the national policy level to analyse the changing policy context, identifying opportunities and challenges; engaging with Ministry of Justice commissioners and sector organisations to facilitate debate, introducing new evidence and provoking creative thinking. Place-based design (middle-out): working with individual prison establishments as pilots to illustrate and test the potential for local empowerment, creative change, and improved architecture within place-specific contexts and organisational hierarchies. Everyday design (bottom-up): working with individuals in the system to explore the potential for localised, significant, demonstrator changes, including collaborative design, capacity building and empowerment in skills, employment, communication, training, and other activities. The research spans a series of projects, through which the methodological approach has developed responsively. The projects include a place-based model for the re-purposing of Ministry of Justice land assets for the purposes of rehabilitation; an evidence-based guide to improve prison design for health and well-being; and a capacity-based employment, skills and self-build project as a template for future open prisons. The overarching research has enabled knowledge to be developed and disseminated through policy and academic networks. Whilst the research remains live and continuing, key findings are emerging as a basis for a new methodological approach to effecting change in the UK prison service. An interdisciplinary approach is necessary to overcome the barriers between distinct areas of the prison service. Sometimes referred to as total environments, prisons encompass entire social and physical environments which are themselves orchestrated by institutional arms of government, resulting in complex systems that cannot be meaningfully engaged through narrow disciplinary lenses. A scalar approach is necessary to connect strategic policies with individual experiences and potential, through the medium of individual prison establishments operating as discrete entities within the system. A reflexive process is necessary to connect research with action in a responsive mode, learning to adapt as the system itself is changing. The role of individuals in the system, their latent knowledge and experience and their ability to engage and become agents of change are essential. Whilst the specific characteristics of the UK prison system are unique, the approach is internationally applicable.

Keywords: architecture, design, policy, prison, system, transformation

Procedia PDF Downloads 133
11910 Sustainable Mitigation of Urban Stormwater Runoff: The Applicability of Green Infrastructure Approach in Finnish Climate

Authors: Rima Almalla

Abstract:

The purpose of the research project in Geography is to evaluate the applicability of urban green infrastructure approach in Finnish climate. The key focus will be on the operation and efficiency of green infrastructure on urban stormwater management. Green infrastructure approach refers to the employment of sufficient green covers as a modern and smart environmental solution to improve the quality of urban environments. Green infrastructure provides a wide variety of micro-scale ecosystem services, such as stormwater runoff management, regulation of extreme air temperatures, reduction of energy consumption, plus a variety of social benefits and human health and wellbeing. However, the cold climate of Finland with seasonal ground frost, snow cover and relatively short growing season bring about questions of whether green infrastructure works as efficiently as expected. To tackle this question, green infrastructure solutions will be studied and analyzed with manifold methods: stakeholder perspectives regarding existing and planned GI solutions will be collected by web based questionnaires, semi structured interviews and group discussions, and analyzed in both qualitative and quantitative methods. Targeted empirical field campaigns will be conducted on selected sites. A systematic literature review with global perspective will support the analyses. The findings will be collected, compiled and analyzed using geographic information systems (GIS). The findings of the research will improve our understanding of the functioning of green infrastructure in the Finnish environment in urban stormwater management, as a landscape element for citizens’ wellbeing, and in climate change mitigation and adaptation. The acquired information will be shared with stakeholders in interactive co-design workshops. As green covers have great demand and potential globally, the conclusions will have relevance in other cool climate regions and may support Finnish business in green infrastructure sector.

Keywords: climate change adaptation, climate change, green infrastructure, stormwater

Procedia PDF Downloads 167
11909 Creating Entrepreneurial Universities: The Swedish Approach of Transformation

Authors: Fawaz Saad, Hamid Alalwany

Abstract:

Sweden has succeeded in maintaining a high level of growth and development and has sustained a highly ranked position among the world's developed countries. In this regard, Swedish universities are playing a vital role in supporting innovation and entrepreneurship at all levels and in developing the Swedish knowledge economy. This paper aims to draw on the experiences of two leading Swedish universities, addressing their transformation approach to creating entrepreneurial universities and fulfilling their objectives in the era of the knowledge economy. The objectives of the paper include: (1) introducing Swedish higher education and its characteristics; (2) examining the infrastructure elements for innovation and entrepreneurship at two of the Swedish entrepreneurial universities; (3) addressing the key aspects of support systems in the initiatives of both Chalmers and Gothenburg universities to support innovation and advance entrepreneurial practices. The paper will contribute to two discourses: (1) examining the relationship between support systems for innovation and entrepreneurship and the universities' policies and practices; (2) lessons for university leaders to assist the development and implementation of effective innovation and entrepreneurship policies and practices.

Keywords: Entrepreneurial University, Chalmers University, Gothenburg University, innovation and entrepreneurship policies, entrepreneurial transformation

Procedia PDF Downloads 507
11908 Spatial Patterns and Temporal Evolution of Octopus Abundance in the Mauritanian Zone

Authors: Dedah Ahmed Babou, Nicolas Bez

Abstract:

The Min-Max autocorrelation factor (MAF) approach makes it possible to express spatiotemporal observations in a space formed by spatially independent factors. These factors are ordered in decreasing order of spatial autocorrelation. The original observations are thus expressed in the space formed by these factors according to temporal coordinates. Each vector of temporal coefficients expresses the temporal evolution of the weight of the corresponding factor. Applying this approach has enabled us to achieve the following results: (i) define a spatially orthogonal space in which the projections of the raw data are determined; (ii) define a limit threshold for the factors with the strongest structures in order to analyze the weight and the temporal evolution of these different structures; (iii) study the correlation between the temporal evolution of the persistent spatial structures and that of the observed average abundance; (iv) propose prototypes of campaigns reflecting high versus low abundance; (v) propose a classification of campaigns that highlights seasonal and/or temporal similarities. These results were obtained by analyzing the octopus yield during the scientific campaigns of the oceanographic vessel Al Awam over the period 1989-2017 in the Mauritanian exclusive economic zone.

Keywords: spatiotemporal, autocorrelation, kriging, variogram, Octopus vulgaris

Procedia PDF Downloads 147
11907 3D Printing for Maritime Cultural Heritage: A Design for All Approach to Public Interpretation

Authors: Anne Eugenia Wright

Abstract:

This study examines issues in accessibility to maritime cultural heritage. Using the Pillar Dollar Wreck in Biscayne National Park, Florida, this study presents an approach to public outreach based on the concept of Design for All. Design for All advocates creating products that are accessible and functional for all users, including those with visual, hearing, learning, mobility, or economic impairments. As a part of this study, a small exhibit was created that uses 3D products as a way to bring maritime cultural heritage to the public. It was presented to the public at East Carolina University’s Joyner Library. Additionally, this study presents a methodology for 3D printing scaled photogrammetry models of archaeological sites in full color. This methodology can be used to present a realistic depiction of underwater archaeological sites to those who are incapable of accessing them in the water. Additionally, this methodology can be used to present underwater archaeological sites that are inaccessible to the public due to conditions such as visibility, depth, or protected status. This study presents a practical use for 3D photogrammetry models, as well as an accessibility strategy to expand the outreach potential for maritime archaeology.

Keywords: Underwater Archaeology, 3D Printing, Photogrammetry, Design for All

Procedia PDF Downloads 138
11906 A New Approach of Preprocessing with SVM Optimization Based on PSO for Bearing Fault Diagnosis

Authors: Tawfik Thelaidjia, Salah Chenikher

Abstract:

Bearing fault diagnosis has attracted significant attention over the past few decades. It consists of two major parts: vibration signal feature extraction and condition classification for the extracted features. In this paper, feature extraction from faulty bearing vibration signals is performed by a combination of the signal's kurtosis and features obtained through the preprocessing of the vibration signal samples using the Db2 discrete wavelet transform at the fifth level of decomposition. In this way, a 7-dimensional vector of vibration signal features is obtained. After feature extraction from the vibration signal, a support vector machine (SVM) is applied to automate the fault diagnosis procedure. To improve the classification accuracy for bearing fault prediction, particle swarm optimization (PSO) is employed to simultaneously optimize the SVM kernel function parameter and the penalty parameter. The results have shown the feasibility and effectiveness of the proposed approach.
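
A compact sketch of the feature-extraction and classification chain on synthetic signals (one reading of the 7-dimensional vector is assumed: signal kurtosis plus the energies of the six db2 sub-bands at level 5); a small grid search over the SVM parameters stands in for the PSO of the paper.

```python
import numpy as np
import pywt
from scipy.stats import kurtosis
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def features(signal):
    """Kurtosis + energies of the 6 db2 wavelet sub-bands at level 5
    -> 7-dimensional feature vector (interpretation assumed)."""
    coeffs = pywt.wavedec(signal, "db2", level=5)
    return np.array([kurtosis(signal)] + [np.sum(c ** 2) for c in coeffs])

# Synthetic "healthy" vs "faulty" vibration snippets (placeholders).
rng = np.random.default_rng(0)
healthy = rng.standard_normal((40, 1024))
faulty = rng.standard_normal((40, 1024)) + np.sin(np.linspace(0, 300, 1024))
X = np.array([features(s) for s in np.vstack([healthy, faulty])])
y = np.array([0] * 40 + [1] * 40)

# Simple grid search over (C, gamma) stands in for the paper's PSO.
best = max(((c, g, cross_val_score(SVC(C=c, gamma=g), X, y, cv=5).mean())
            for c in [1, 10, 100] for g in [1e-3, 1e-2, 1e-1]), key=lambda t: t[2])
print("C=%g gamma=%g acc=%.2f" % best)
```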

Keywords: condition monitoring, discrete wavelet transform, fault diagnosis, kurtosis, machine learning, particle swarm optimization, roller bearing, rotating machines, support vector machine, vibration measurement

Procedia PDF Downloads 437
11905 The Reduction of CO2 Emissions Level in Malaysian Transportation Sector: An Optimization Approach

Authors: Siti Indati Mustapa, Hussain Ali Bekhet

Abstract:

The transportation sector represents more than 40% of total energy consumption in Malaysia. This sector is a major user of fossil-based fuels, and it is increasingly being highlighted as the sector which contributes least to CO2 emission reduction targets. Considering this fact, this paper attempts to investigate the problem of reducing CO2 emissions using a linear programming approach. An optimization model which is used to investigate the optimal level of CO2 emission reduction in the road transport sector is presented. In this paper, scenarios have been used to demonstrate the emission reduction model: (1) utilising alternative fuels, (2) improving fuel efficiency, (3) removing fuel subsidies, (4) reducing travel demand, and (5) an optimal scenario. This study finds that fuel balancing can contribute to a reduction in CO2 emissions of up to 3%. Beyond 3% emission reductions, more stringent measures that include fuel switching, fuel efficiency improvement, travel demand reduction and a combination of mitigation measures have to be employed. The model revealed that CO2 emissions in road transportation can be reduced by 38.3% in the optimal scenario.
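
A toy version of the optimisation with entirely hypothetical emission factors, costs and demand (not the paper's Malaysian data): choose how much transport energy to supply from each fuel so that demand is met, the fuel budget is respected and total CO2 is minimised.

```python
from scipy.optimize import linprog

fuels = ["petrol", "diesel", "natural gas", "biofuel"]
co2 = [2.3, 2.7, 1.9, 0.6]          # kg CO2 per litre-equivalent (placeholders)
cost = [0.9, 0.8, 0.6, 1.2]         # $ per litre-equivalent (placeholders)
demand, budget = 1000.0, 950.0      # litre-equivalents needed, $ available

res = linprog(c=co2,                                # minimise total CO2
              A_ub=[cost], b_ub=[budget],           # stay within fuel budget
              A_eq=[[1, 1, 1, 1]], b_eq=[demand],   # meet travel energy demand
              bounds=[(0, 700)] * 4, method="highs")
print(dict(zip(fuels, res.x.round(1))), "CO2:", round(res.fun, 1), "kg")
```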

Keywords: CO2 emission, fuel consumption, optimization, linear programming, transportation sector, Malaysia

Procedia PDF Downloads 423
11904 Non-Conformance Clearance through an Intensified Mentorship towards ISO 15189 Accreditation: The Case of Jimma and Hawassa Hospital Microbiology Laboratories, Ethiopia

Authors: Dawit Assefa, Kassaye Tekie, Gebrie Alebachew, Degefu Beyene, Bikila Alemu, Naji Mohammed, Asnakech Agegnehu, Seble Tsehay, Geremew Tasew

Abstract:

Background: Implementation of a Laboratory Quality Management System (LQMS) is critical to ensure accurate, reliable, and efficient laboratory testing of antimicrobial resistance (AMR). However, LQMS implementation and progress toward accreditation in the AMR surveillance laboratory testing setting remain limited in Ethiopia. By addressing non-conformances (NCs) and working towards accreditation, microbiology laboratories can improve the quality of their services, increase staff competence, and contribute to mitigating the spread of AMR. Methods: Using standard ISO 15189 horizontal and vertical assessment checklists, certified assessors identified NCs at the Hawassa and Jimma Hospital microbiology laboratories. The Ethiopian Public Health Institute AMR mentors and IDDS staff prioritized closing the NCs through the implementation of an intensified mentorship program that included ISO 15189 orientation training, resource allocation, and action plan development. Results: For the two facilities to clear their NCs, an intensified mentorship approach was adopted by providing ISO 15189 orientation training, provision of buffer reagents, controls, standards, and auxiliary equipment, and facilitating equipment maintenance and calibration. Method verification and competency assessment were also conducted, along with the implementation of standard operating procedures and recommended corrective actions. This approach enhanced the laboratories' readiness for accreditation. After addressing their NCs, the two laboratories applied to Ethiopian Accreditation Services for ISO 15189 accreditation. Conclusions: Clearing NCs through the implementation of intensified mentorship was crucial in preparing the two laboratories for accreditation and improving the quality of laboratory test results. This approach can guide other microbiology laboratories' efforts to attain accreditation.

Keywords: non-conformance clearance, intensified mentorship, accreditation, ISO 15189

Procedia PDF Downloads 92
11903 Statistical Manufacturing Cell/Process Qualification Sample Size Optimization

Authors: Angad Arora

Abstract:

In production operations/manufacturing, a cell or line is typically a group of similar machines (computer numerical control (CNC) machines, advanced cutting, 3D printing or special-purpose machines). To qualify a typical manufacturing line/cell/new process, ideally we need a sample of parts that can be flown through the process so that we can then make a judgment on the health of the line/cell. However, with huge volumes and mass-production scope, such as in the mobile phone industry, for example, the actual cells or lines can number in the thousands, and qualifying each one of them with statistical confidence means using samples that are very large, which adds to product/manufacturing cost and creates huge waste if the parts are not intended to be shipped to customers. To solve this, we come up with a two-step statistical approach. We start with a small sample size and then objectively evaluate whether the process needs additional samples or not. For example, if a process is producing bad parts and we see those bad samples early, then there is a high chance that the process will not meet the desired yield, and there is no point in adding more samples. We used this hypothesis and came up with a two-step binomial testing approach. Further, we also show through results that we can achieve an 18-25% reduction in samples while keeping the same statistical confidence.
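
One possible reading of the two-step scheme, sketched with made-up thresholds: a small first sample is drawn; if the defect count already makes the target yield statistically implausible the cell fails early (and a clean first sample passes early); otherwise a second sample is pooled in and a standard binomial test makes the final call. Sample sizes, yield target and significance level are placeholders.

```python
from scipy.stats import binom

def two_step_qualification(defects1, n1, defects2_fn,
                           n2=80, p_target=0.95, alpha=0.05):
    """Illustrative two-step binomial screen (thresholds are assumptions)."""
    p_bad = 1 - p_target
    # Step 1: probability of seeing >= defects1 bad parts if the true
    # defect rate were acceptable (p_bad). Very small => fail early.
    if binom.sf(defects1 - 1, n1, p_bad) < alpha:
        return "fail early (no need for more samples)"
    if defects1 == 0:
        return "pass early"
    # Step 2: pool with a second sample and test the combined counts.
    defects2 = defects2_fn(n2)
    if binom.sf(defects1 + defects2 - 1, n1 + n2, p_bad) < alpha:
        return "fail"
    return "pass"

print(two_step_qualification(defects1=2, n1=20,
                             defects2_fn=lambda n: 1))   # toy second sample
```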

Keywords: statistics, data science, manufacturing process qualification, production planning

Procedia PDF Downloads 96
11902 An Overview of Bioclimatic Design Strategies for Energy Efficient Buildings: A Case Study of Semi-Arid Climate, Lahore

Authors: Beenish Mujahid, Sana Malik

Abstract:

Bioclimatic design strategies play a dynamic role in the construction of sustainable buildings. This approach leads to a reduction in the mechanical cooling of buildings, which provides comfort to the occupants in a sustainable manner. Such bioclimatic measures provide a complete framework for building design by responding to the climatic features of a particular site. The featured passive cooling techniques for hot climatic regions provide comfortable indoor temperatures with ecological and financial benefits. The study is based on highlighting this approach to produce energy-efficient buildings for a semi-arid climate like that of Lahore, Pakistan. As Lahore is part of a developing country, energy savings in the city would help the power sector and contribute to addressing the global issues of global warming and ozone layer depletion. This article reviews bioclimatic design strategies and critically analyses them to derive guidelines for sustainable buildings in Lahore. The study shows that the demand for mechanical cooling systems, including air conditioning, fans, and air coolers, can be reduced through regional climatic design.

Keywords: bioclimatic design, buildings, comfort, energy efficient, Lahore

Procedia PDF Downloads 271
11901 Psychotherapeutic Narratives and the Importance of Truth

Authors: Spencer Jay Knafelc

Abstract:

Some mental health practitioners and theorists have suggested that we approach remedying psychological problems by centering and intervening upon patients’ narrations. Such theorists and their corresponding therapeutic approaches see persons as narrators of their lives, where the stories they tell constitute and reflect their sense-making of the world. Psychological problems, according to these approaches to therapy, are often the result of problematic narratives. The solution is the construction of more salubrious narratives through therapy. There is trouble lurking within the history of these narrative approaches. These thinkers tend to denigrate the importance of truth, insisting that narratives are not to be thought of as aiming at truth, and thus the truth of our self-narratives is not important. There are multiple motivations for the tendency to eschew truth’s importance within the tradition of narrative approaches to therapy. The most plausible and interesting motivation comes from the observation that, in general, all dominant approaches to therapy are equally effective. The theoretical commitments of each approach are quite different and are often ostensibly incompatible (psychodynamic therapists see psychological problems as resulting from unconscious conflict and repressed desires, Cognitive-Behavioral approaches see them as resulting from distorted cognitions). This strongly suggests that there must be some cases in which therapeutic efficacy does not depend on truth and that insisting that patient’s therapeutic narratives be true in all instances is a mistake. Lewis’ solution is to suggest that narratives are metaphors. Lewis’ account appreciates that there are many ways to tell a story and that many different approaches to mental health treatment can be appropriate without committing us to any contradictions, providing us with an ostensibly coherent way to treat narratives as non-literal, instead of seeing them as tools that can be more or less apt. Here, it is argued that Lewis’ metaphor approach fails. Narratives do not have the right kind of structure to be metaphors. Still, another way to understand Lewis’ view might be that self-narratives, especially when articulated in the language of any specific approach, should not be taken literally. This is an idea at the core of the narrative theorists’ tendency to eschew the importance of the ordinary understanding of truth. This very tendency will be critiqued. The view defended in this paper more accurately captures the nature of self-narratives. The truth of one’s self-narrative is important. Not only do people care about having the right conception of their abilities, who they are, and the way the world is, but self-narratives are composed of beliefs, and the nature of belief is to aim at truth. This view also allows the recognition of the importance of developing accurate representations of oneself and reality for one’s psychological well-being. It is also argued that in many cases, truth factors in as a mechanism of change over the course of therapy. Therapeutic benefit can be achieved by coming to have a better understanding of the nature of oneself and the world. Finally, the view defended here allows for the recognition of the nature of the tension between values: truth and efficacy. It is better to recognize this tension and develop strategies to navigate it as opposed to insisting that it doesn’t exist.

Keywords: philosophy, narrative, psychotherapy, truth

Procedia PDF Downloads 104
11900 Determinants of Life Satisfaction in Canada: A Causal Modelling Approach

Authors: Rose Branch-Allen, John Jayachandran

Abstract:

Background and purpose: Canada is a pluralistic, multicultural society with an ethno-cultural composition that has been shaped over time by immigrants and their descendants. Although Canada welcomes these immigrants, many will endure hardship and assimilation difficulties. Despite these life hurdles, surveys consistently disclose high life satisfaction for all Canadians. Most research studies on life satisfaction/subjective well-being (SWB) have focused on one main determinant and a variety of socio-demographic variables to delineate the determinants of life satisfaction. However, very few research studies examine life satisfaction from a holistic approach. In addition, we need to understand the causal pathways leading to life satisfaction and develop theories that explain why certain variables differentially influence the different components of SWB. The aim of this study was to utilize a holistic approach to construct a causal model and identify major determinants of life satisfaction. Data and measures: This study utilized data from the General Social Survey, with a sample size of 19,597. The exogenous concepts included age, gender, marital status, household size, socioeconomic status, ethnicity, location, immigration status, religiosity, and neighborhood. The intervening concepts included health, social contact, leisure, enjoyment, work-family balance, quality time, domestic labor, and sense of belonging. The endogenous concept, life satisfaction, was measured by multiple indicators (Cronbach's alpha = .83). Analysis: Several multiple regression models were run sequentially to estimate path coefficients for the causal model. Results: Overall, above-average satisfaction with life was reported for respondents with specific socio-economic, demographic and lifestyle characteristics. With regard to exogenous factors, respondents who were female, younger, married, from a high socioeconomic status background, born in Canada, very religious, and who demonstrated a high level of neighborhood interaction had greater satisfaction with life. Similarly, the intervening concepts suggested that respondents had greater life satisfaction if they had better health, more social contact, less time on passive leisure activities and more time on active leisure activities, more time with family and friends, more enjoyment of volunteer activities, less time on domestic labor and a greater sense of belonging to the community. Conclusions and implications: Our results suggest that a holistic approach is necessary for establishing the determinants of life satisfaction, and that life satisfaction is not merely comprised of positive or negative affect but rather requires understanding the causal process that produces it. Even though most of our findings are consistent with previous studies, a significant number of causal connections contradict some of the findings in the literature today. We have provided possible explanations for these anomalies that researchers encounter in studying life satisfaction, and we discuss policy implications.
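
A stripped-down sketch of the estimation strategy described above, on synthetic data rather than the General Social Survey: each intervening variable is regressed on the exogenous variables, then life satisfaction is regressed on both, and the standardised coefficients from this sequence serve as path coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
ses = rng.standard_normal(n)                 # exogenous: socioeconomic status
religiosity = rng.standard_normal(n)         # exogenous: religiosity
health = 0.4 * ses + rng.standard_normal(n)  # intervening variable
satisfaction = 0.3 * ses + 0.2 * religiosity + 0.5 * health + rng.standard_normal(n)

def std_betas(y, X):
    """Standardised regression coefficients (toy path coefficients)."""
    Xs = (X - X.mean(0)) / X.std(0)
    ys = (y - y.mean()) / y.std()
    design = np.column_stack([np.ones(len(ys)), Xs])
    return np.linalg.lstsq(design, ys, rcond=None)[0][1:]

# Stage 1: intervening variable on exogenous; Stage 2: outcome on everything.
print("SES, religiosity -> health:", std_betas(health, np.column_stack([ses, religiosity])))
print("paths -> satisfaction:", std_betas(satisfaction,
      np.column_stack([ses, religiosity, health])))
```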

Keywords: causal model, holistic approach, life satisfaction, socio-demographic variables, subjective well-being

Procedia PDF Downloads 357
11899 Contextualizing Torture in Closed Institutions

Authors: Erinda Bllaca Ndroqi

Abstract:

The dilemma facing monitoring professionals in today's reality is whether to accept that prisons all over the world constitute a place where not all rights are respected (the ethical approach), or to widen the scope of monitoring by prioritizing the special needs of people deprived of their liberty (the human rights approach), despite the context and the level of improved prison conditions, staff profiling, and services oriented towards rehabilitation instead of punishment. Such a dilemma becomes a concern when taking into consideration the fact that prisoners, due to their powerlessness and their lives being 'at the hands of the state', are constantly under the threat of abuse of power and neglect, which, in the Albanian case, has never been classified as torture. Scientific research in twenty-four (24) Albanian prisons shows that for some rights, prisoners belonging to 'vulnerable groups', such as those with mental illness, HIV-positive status, particular sexual orientations, or terminal illness, remain quite challenged, and the current criminal justice system does not ensure that their basic rights are being met (despite recommendations set forward to prison authorities by the European Committee for the Prevention of Torture and Inhuman or Degrading Treatment or Punishment (CPT)). The research invites more discussion about policy and strategic recommendations, which would need a thorough assessment of the impact of rehabilitation on special categories of prisoners, including recidivists.

Keywords: prisons, rehabilitation, torture, vulnerability

Procedia PDF Downloads 129
11898 Uneven Habitat Characterisation by Using Geo-Gebra Software in the Lacewings (Insecta: Neuroptera), Knowing When to Calculate the Habitat: Creating More Informative Ecological Experiments

Authors: Hakan Bozdoğan

Abstract:

A wide variety of traditional methodologies has been developed for characterising smooth habitats in order to address different environmental objectives. In this study, the habitats were characterised based on size and shape using GeoGebra software, an innovative approach to habitat characterisation in lacewing species. This approach is demonstrated using the example of 'surface area' as an analytical concept, wherein the goal was to increase clarity for researchers and to improve the quality of research in the survey area. In conclusion, habitat characterisation using the mathematical programme provides a unique potential to collect more comprehensible and analytical information about shapeless areas beyond the range of direct observation methods. This research contributes a new perspective for assessing the structure of habitat, providing a novel mathematical tool for the research and management of such habitats and environments. Further surveys should be undertaken at additional sites within the Amanos Mountains for a comprehensive assessment of lacewing habitat characterisation in an analytical plane. This paper is supported by the Ahi Evran University Scientific Research Projects Coordination Unit, Projects No. TBY.E2.17.001 and TBY.A4.16.001.
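
As a plain-code analogue of the surface-area measurement performed in GeoGebra (an illustration only, not the software used in the study), the shoelace formula below returns the area of an irregular habitat outline from digitised boundary coordinates; the coordinates are placeholders.

```python
def polygon_area(vertices):
    """Shoelace formula: area of an irregular (non-self-intersecting) polygon."""
    area = 0.0
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

# Hypothetical digitised outline of an uneven habitat patch (coordinates in m)
outline = [(0, 0), (40, 5), (55, 30), (35, 60), (10, 45), (-5, 20)]
print(polygon_area(outline), "square metres")
```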

Keywords: uneven habitat shape, habitat assessment, lacewings, Geo-Gebra Software

Procedia PDF Downloads 284