Search results for: Fairclough’s approach
11944 A Risk-Based Modeling Approach for Successful Adoption of CAATTs in Audits: An Exploratory Study Applied to Israeli Accountancy Firms
Authors: Alon Cohen, Jeffrey Kantor, Shalom Levy
Abstract:
Technology adoption models are extensively used in the literature to explore the drivers and inhibitors affecting the adoption of Computer Assisted Audit Techniques and Tools (CAATTs). Recent studies have suggested additional factors that may affect technology adoption by CPA firms. However, the adoption of CAATTs by financial auditors differs from the adoption of technologies in other industries. This is a result of the unique characteristics of the auditing process, which are expressed in the audit risk elements and the risk-based auditing approach, as encoded in the auditing standards. Since these audit risk factors are not part of the existing models used to explain technology adoption, these models do not fully correspond to the specific needs and requirements of the auditing domain. The overarching objective of this qualitative research is to fill the gap in the literature that exists as a result of using generic technology adoption models. Following a pretest, and based on semi-structured in-depth interviews with 16 Israeli CPA firms of different sizes, this study aims to reveal determinants related to audit risk factors that influence the adoption of CAATTs in audits and proposes a new modeling approach for the successful adoption of CAATTs. The findings emphasize several important aspects: (1) while large CPA firms have developed their own internal guidelines to assess the audit risk components, other CPA firms do not follow a formal and validated methodology to evaluate these risks; (2) large firms incorporate a variety of CAATTs, including self-developed advanced tools, whereas small and mid-sized CPA firms incorporate standard CAATTs and still need to catch up to better understand what CAATTs can offer and how they can contribute to the quality of the audit; (3) the top management of mid-sized and small CPA firms should be more proactive and better informed about the capabilities of CAATTs and their contributions to audits; and (4) all CPA firms consider professionalism a major challenge that must be constantly managed to ensure optimal CAATTs operation. The study extends the existing knowledge of CAATTs adoption by looking at it from a risk-based auditing approach. It suggests a new model for CAATTs adoption by incorporating the influencing audit risk factors that auditors should examine when considering CAATTs adoption. Since the model can be used in various audit scenarios and supports strategic, risk-based decisions, it maximizes the considerable potential of CAATTs to improve the quality of audits. The results and insights can be useful to CPA firms, internal auditors, CAATTs developers and regulators. Moreover, they may motivate audit standard-setters to issue updated guidelines regarding CAATTs adoption in audits.
Keywords: audit risk, CAATTs, financial auditing, information technology, technology adoption models
Procedia PDF Downloads 65
11943 Teaching Swahili as a Foreign Language to Young People in South Africa
Authors: Elizabeth Mahenge
Abstract:
Unemployment is a problem that faces many graduates all over the world. Every year, universities in many parts of the world produce graduates who are looking for employment. Swahili, a Bantu language that originated on the East African coast, can serve as an avenue for youth employment in South Africa. This paper informs young people about the job opportunities available through teaching the Swahili language. The objective of this paper is to build the capacity of young people to become teachers of Swahili and be ready to compete in the marketplace. The methodology was a two-week online training course on how to teach Swahili as a foreign language, using the communicative approach and the task-based approach. Participants were recruited through a WhatsApp group advertisement for a “short training for Swahili teachers for foreigners”. A total of 30 participants registered, but only 11 attended the training, which was delivered online via Zoom. The contribution of this paper is to show that fluency in Swahili opens up teaching job opportunities anywhere in the world. Hence, the problem of unemployment among young people would be reduced, as they could employ themselves or be employed by academic institutions anywhere in the world. The paper calls on young people in South Africa to opt for Swahili language courses, to be trained and to become experts in teaching Swahili as a foreign language.
Keywords: foreign language, linguistic market, Swahili, employment
Procedia PDF Downloads 73
11942 Implementing Action Research in EFL/ESL Classrooms: A Systematic Review of Literature 2010-2019
Authors: Amira D. Ali
Abstract:
Action research studies in education often address learners’ needs and empower practitioner-researchers to effectively change instructional practices and school communities. This paper presents a systematic review of action research (AR) studies undertaken in EFL/ESL settings, analyzing empirical studies published within a ten-year period (between 2010 and 2019). The review also aimed to investigate the focal strategies for teaching language skills at the school level and to evaluate the overall quality of AR studies in terms of focus, purpose, methodology and contribution. Inclusion criteria were established, and the 41 studies that fit them were selected for the systematic review. Garrard’s (2007) Matrix Method was used to structure and synthesize the literature. Results showed significant diversity in teaching strategies and in the implementation of the AR model. Almost a quarter of the studies focused on improving writing skills at the elementary school level. In addition, findings revealed that 44% of the studies used a mixed-methods approach, followed by a qualitative approach (41%), whereas only 15% employed a quantitative methodology. Research gaps for future action research on developing language skills are pointed out, and recommendations are offered.
Keywords: action research, EFL/ESL context, language skills, systematic review
Procedia PDF Downloads 137
11941 A Robust Model Predictive Control for a Photovoltaic Pumping System Subject to Actuator Saturation Nonlinearity and Parameter Uncertainties: A Linear Matrix Inequality Approach
Authors: Sofiane Bououden, Ilyes Boulkaibet
Abstract:
In this paper, a robust model predictive controller (RMPC) for an uncertain nonlinear system under actuator saturation is designed to control a DC-DC buck converter in a PV pumping application. The considered nonlinear system contains a linear constant part perturbed by an additive state-dependent nonlinear term. Based on the saturating actuator property, an appropriate linear feedback control law is constructed and used to minimize an infinite-horizon cost function within the framework of linear matrix inequalities. The proposed approach successfully provides a solution to the optimization problem and stabilizes the nonlinear plant. Furthermore, sufficient conditions for the existence of the proposed controller guarantee the robust stability of the system in the presence of polytopic uncertainties. In addition, the simulation results demonstrate the efficiency of the proposed control scheme.
Keywords: PV pumping system, DC-DC buck converter, robust model predictive controller, nonlinear system, actuator saturation, linear matrix inequality
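The abstract's controller is synthesized via linear matrix inequalities, which cannot be reproduced from the abstract alone. As a loose illustration only, the sketch below shows the receding-horizon idea under actuator saturation for a toy scalar plant; the plant numbers (a = 1.2, b = 1), the cost weights, and the brute-force grid search over constant input sequences are all invented stand-ins for the LMI-based synthesis described above.

```python
def saturate(u, u_max):
    """Actuator saturation: clamp the control input to [-u_max, u_max]."""
    return max(-u_max, min(u_max, u))

def mpc_step(x, a, b, u_max, horizon=5, q=1.0, r=0.1, grid=41):
    """One receding-horizon step for x+ = a*x + b*u with a saturated input.

    Brute-force search over a grid of constant input sequences; returns the
    input of the cheapest sequence (a crude stand-in for LMI synthesis).
    """
    candidates = [-u_max + 2.0 * u_max * i / (grid - 1) for i in range(grid)]

    def cost(u):
        xi, total = x, 0.0
        for _ in range(horizon):
            xi = a * xi + b * saturate(u, u_max)
            total += q * xi * xi + r * u * u
        return total

    return min(candidates, key=cost)

# Closed loop: an unstable open-loop plant (a = 1.2) driven toward the
# origin even though the input is limited to [-1, 1].
x, a, b, u_max = 2.0, 1.2, 1.0, 1.0
for _ in range(30):
    x = a * x + b * saturate(mpc_step(x, a, b, u_max), u_max)
print(round(abs(x), 3))
```

Recomputing the input at every step is what makes the scheme predictive; here the saturation bound is respected by construction rather than by clipping an unconstrained design.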
Procedia PDF Downloads 179
11940 Basic Business-Forces behind the Surviving and Sustainable Organizations: The Case of Medium Scale Contractors in South Africa
Authors: Iruka C. Anugwo, Winston M. Shakantu
Abstract:
The objective of this study is to uncover the basic business-forces that underpin the survival and sustainable performance of medium scale contractors in the South African construction market. This study is essential, as it sets out to contribute towards long-term strategic solutions for combating the incessant failure of start-up construction organizations within South Africa. The study used a qualitative research methodology, as the most appropriate approach to elicit, understand and uncover the phenomena that constitute the basic business-forces for active contractors in the market. The study also adopted a phenomenological approach, and in-depth interviews were conducted with 20 medium scale contractors in Port Elizabeth, South Africa, between August and October 2015. This allowed for an in-depth understanding of the critical and basic business-forces that influenced their survival and performance beyond the first five years of business operation. The findings showed that for potential contractors (start-ups) to survive in a competitive business environment such as the construction industry, they must possess the basic business-forces. These forces are educational knowledge in construction and business management related disciplines, adequate industry experience, the competencies and capabilities to deliver excellent services and products, and the spirit of entrepreneurship. It can therefore be concluded that, as a strategic approach to minimizing the persistent failure of start-up construction businesses, potential construction contractors must endeavor to acquire basic education, training and qualifications, together with industry experience and the required competencies, capabilities and entrepreneurial acumen.
Without these basic business-forces, as discovered in this study, the majority of contractors entering the market will find it difficult to develop and grow a competitive and sustainable construction organization in South Africa.
Keywords: basic business-forces, medium scale contractors, South Africa, sustainable organisations
Procedia PDF Downloads 290
11939 Implementation of Building Information Modeling in Turkish Government Sector Projects
Authors: Mohammad Lemar Zalmai, Mustafa Nabi Kocakaya, Cemil Akcay, Ekrem Manisali
Abstract:
In recent years, the Building Information Modeling (BIM) approach has developed expeditiously. As people have seen the benefits of this approach, it has begun to be used widely in construction projects, and some countries have made it mandatory in order to derive more benefit from it. To promote the implementation of BIM in construction projects, it is helpful to gather relevant information from surveys and interviews. The purpose of this study is to investigate the current adoption and implementation of BIM in public projects in Turkey. This study identifies the challenges of BIM implementation in Turkey and proposes some solutions to overcome them. In this context, the challenges for BIM implementation and the factors that affect BIM usage are determined based on previous academic research and expert opinions, by conducting interviews and questionnaire surveys. Several methods are used to process the information in order to obtain the weights of the different factors that would make BIM widespread in Turkey. This study summarizes the outcomes of the interviews and questionnaire surveys and proposes some suggestions to promote the implementation of BIM in Turkey. We believe the research findings will be a good reference for boosting BIM implementation in Turkey.
Keywords: building information modelling, BIM implementations, Turkish construction industry, Turkish government sector projects
Procedia PDF Downloads 137
11938 Bubble Point Pressures of CO2+Ethyl Palmitate by a Cubic Equation of State and the Wong-Sandler Mixing Rule
Authors: M. A. Sedghamiz, S. Raeissi
Abstract:
This study presents three different approaches to estimate bubble point pressures for the binary system of CO2 and the fatty acid ethyl ester ethyl palmitate. The first method involves the Peng-Robinson (PR) equation of state (EoS) with the conventional Van der Waals mixing rule. The second approach involves the PR EoS together with the Wong-Sandler (WS) mixing rule, coupled with the UNIQUAC GE model. In order to model the bubble point pressures with this approach, the volume and area parameters for ethyl palmitate were estimated by the Hansen group contribution method. The last method involves the PR EoS combined with the WS mixing rule, but using NRTL as the GE model. Results using the Van der Waals mixing rule clearly indicated that this method has the largest errors among the three, in the range of 3.96–6.22%. The PR-WS-UNIQUAC method exhibited small errors, with average absolute deviations between 0.95% and 1.97%. The PR-WS-NRTL method led to the smallest errors, with average absolute deviations between 0.65% and 1.7%.
Keywords: bubble pressure, Gibbs excess energy model, mixing rule, CO2 solubility, ethyl palmitate
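The PR EoS pure-component parameters underlying all three approaches follow standard textbook expressions; the sketch below evaluates them for CO2 as an illustration. The critical constants and acentric factor used here are common literature values, not taken from the abstract, and the mixing rules themselves are not implemented.

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def pr_parameters(Tc, Pc, omega, T):
    """Peng-Robinson pure-component parameters a*alpha(T) and b.

    Tc [K], Pc [Pa], omega is the acentric factor, T [K].
    """
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega ** 2
    alpha = (1.0 + kappa * (1.0 - math.sqrt(T / Tc))) ** 2
    a = 0.45724 * R ** 2 * Tc ** 2 / Pc
    b = 0.07780 * R * Tc / Pc
    return a * alpha, b

def pr_pressure(T, v, a_alpha, b):
    """PR EoS: P = RT/(v - b) - a*alpha/(v^2 + 2bv - b^2), v in m^3/mol."""
    return R * T / (v - b) - a_alpha / (v * v + 2.0 * b * v - b * b)

# CO2 at 310 K: Tc = 304.13 K, Pc = 7.377 MPa, omega = 0.225 (literature values).
a_alpha, b = pr_parameters(304.13, 7.377e6, 0.225, 310.0)
```

At large molar volume the attractive and covolume corrections vanish, so the predicted pressure approaches the ideal-gas value, which makes for a quick sanity check on the implementation.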
Procedia PDF Downloads 473
11937 A Location-Based Search Approach According to Users’ Application Scenario
Authors: Shih-Ting Yang, Chih-Yun Lin, Ming-Yu Li, Jhong-Ting Syue, Wei-Ming Huang
Abstract:
The Global Positioning System (GPS) has become increasingly precise in recent years, and location-based services (LBS) have developed rapidly. Take the example of finding a parking lot (as in parking apps): a location-based service can offer immediate information about a nearby parking lot, including the number of remaining parking spaces. However, it cannot provide search results tailored to the user's situation and requirements. For that reason, this paper develops a “Location-based Search Approach according to Users’ Application Scenario”, combining location-based search with demand determination to help users obtain information consistent with their requirements. The approach consists of one mechanism and three kernel modules. First, in the Information Pre-processing Mechanism (IPM), the cosine theorem is used to categorize the locations of users. Then, in the Information Category Evaluation Module (ICEM), kNN (k-Nearest Neighbor) classification is employed to classify the browsing records of users. After that, in the Information Volume Level Determination Module (IVLDM), the number of users clicking the information at different locations is compared with the average number of users clicking the information at a specific location, so as to evaluate the urgency of demand; a two-dimensional space is then used to estimate the application scenarios of users. Finally, in the Location-based Search Module (LBSM), all search results are compared against the average number of characters of the search results, categorized using the Manhattan distance, and selected according to the user's application scenario. Additionally, a Web-based system is developed according to this methodology to demonstrate its practical application.
The scenario-based estimation and the location-based search are used to evaluate the type and abundance of the information expected by the public at a specific location, so that information demanders can obtain information consistent with their application situations at that location.
Keywords: data mining, knowledge management, location-based service, user application scenario
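The abstract names kNN classification (in the ICEM) and the Manhattan distance (in the LBSM) without giving details, so the following is only a generic sketch of those two building blocks; the toy browsing-record features and labels are invented for illustration and do not reflect the paper's actual feature set.

```python
from collections import Counter

def manhattan(p, q):
    """Manhattan (L1) distance between two feature vectors."""
    return sum(abs(a - b) for a, b in zip(p, q))

def knn_classify(query, records, k=3):
    """Classify `query` by majority vote among its k nearest records.

    `records` is a list of (feature_vector, label) pairs.
    """
    nearest = sorted(records, key=lambda rec: manhattan(query, rec[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Hypothetical browsing records: (clicks on parking info, clicks on food info).
records = [((9, 1), "parking"), ((8, 2), "parking"), ((7, 3), "parking"),
           ((1, 9), "food"), ((2, 8), "food")]
print(knn_classify((6, 2), records))  # parking
```

The same Manhattan metric could serve both for the kNN votes and for grouping search results, which may be why the abstract uses it in two modules.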
Procedia PDF Downloads 123
11936 Image Based Landing Solutions for Large Passenger Aircraft
Authors: Thierry Sammour Sawaya, Heikki Deschacht
Abstract:
In commercial aircraft operations, almost half of all accidents happen during the approach or landing phases. Automatic guidance and automatic landing have proven to add significant safety value in this challenging landing phase. This is why Airbus and ScioTeq have decided to work together to explore the capability of image-based landing solutions as additional landing aids, to further expand the possibility of performing automatic approaches and landings on runways where the current guidance systems are either not fitted or not optimal. Current systems for automated landing often depend on radio signals provided by ground infrastructure at the airport or on satellite coverage. Moreover, these radio signals may not always be available with the integrity and performance required for safe automatic landing. Being independent of these radio signals would widen the operational possibilities and increase the number of automated landings. Airbus and ScioTeq are joining their expertise in the field of computer vision in the European programme Clean Sky 2 Large Passenger Aircraft, in which they are leading the IMBALS (IMage BAsed Landing Solutions) project. The ultimate goal of this project is to demonstrate, develop, validate and verify a certifiable automatic landing system that guides an airplane during the approach and landing phases based on an onboard camera system capturing images, enabling automatic landing independent of radio signals and without precision landing instruments. Within this project, ScioTeq is responsible for the development of the Image Processing Platform (IPP), while Airbus is responsible for defining the functional and system requirements as well as for testing and integrating the developed equipment in an environment representative of a large passenger aircraft.
The aim of this paper is to describe the system as well as the associated methods and tools developed for validation and verification.
Keywords: aircraft landing system, aircraft safety, autoland, avionic system, computer vision, image processing
Procedia PDF Downloads 99
11935 Second Harmonic Generation of Higher-Order Gaussian Laser Beam in Density Rippled Plasma
Authors: Jyoti Wadhwa, Arvinder Singh
Abstract:
This work presents a theoretical investigation of enhanced second-harmonic generation of a higher-order Gaussian laser beam in plasma with a density ramp. The mechanism responsible for the self-focusing of a laser beam in plasma is taken to be the relativistic mass variation of plasma electrons under the effect of a highly intense laser beam. Using the moment theory approach and the Wentzel-Kramers-Brillouin approximation for the nonlinear Schrodinger wave equation, the differential equation governing the spot size of the higher-order Gaussian laser beam in plasma is derived. The nonlinearity induced by the laser beam creates a density gradient in the background plasma electrons, which is responsible for the excitation of the electron plasma wave. The large-amplitude electron plasma wave interacts with the fundamental beam, producing coherent radiation at double the frequency of the incident beam. The analysis shows the important role of the different modes of the higher-order Gaussian laser beam and of the density ramp in the efficiency of the generated harmonics.
Keywords: density rippled plasma, higher order Gaussian laser beam, moment theory approach, second harmonic generation
Procedia PDF Downloads 176
11934 Approach to Formulate Intuitionistic Fuzzy Regression Models
Authors: Liang-Hsuan Chen, Sheng-Shing Nien
Abstract:
This study develops approaches to formulate intuitionistic fuzzy regression (IFR) models for decision-making applications in fuzzy environments using intuitionistic fuzzy observations. Intuitionistic fuzzy numbers (IFNs) are used to characterize the fuzzy input and output variables in the IFR formulation process. A mathematical programming problem (MPP) is built to optimally determine the IFR parameters. Each parameter in the MPP is defined as a pair of alternative numerical variables with opposite signs, and an intuitionistic fuzzy error term is added to the MPP to characterize the uncertainty of the model. The IFR model is formulated based on a distance measure, minimizing the total distance error between estimated and observed intuitionistic fuzzy responses in the MPP resolution process. The proposed approaches are simple and efficient in the formulation and resolution processes; the signs of the parameters are determined within the model, so the problem of predetermining them is avoided. Furthermore, the proposed approach has the advantage that the spread of the predicted IFN response is not over-inflated, since the parameters in the established IFR model are crisp. The performance of the obtained models is evaluated and compared with existing approaches.
Keywords: fuzzy sets, intuitionistic fuzzy number, intuitionistic fuzzy regression, mathematical programming method
Procedia PDF Downloads 138
11933 Robust Fractional Order Controllers for Minimum and Non-Minimum Phase Systems – Studies on Design and Development
Authors: Anand Kishore Kola, G. Uday Bhaskar Babu, Kotturi Ajay Kumar
Abstract:
The modern dynamic systems used in industry are complex in nature, and fractional order controllers have therefore been contemplated as a fresh approach to control system design that takes this complexity into account. Traditional integer order controllers use integer derivatives and integrals to control systems, whereas fractional order controllers use fractional derivatives and integrals to capture memory and non-local behavior. This study provides a method, based on the maximum-sensitivity (Ms) methodology, to discover all robust fractional-filter internal model control proportional-integral-derivative (IMC-PID) controllers that stabilize the closed-loop system and deliver the highest performance for a time-delay system with a Smith predictor configuration. Additionally, it helps to enlarge the set of PID controllers that stabilize the system. This study also evaluates the effectiveness of the suggested controller approach for minimum-phase systems in comparison with approaches currently in use, which are based on the Integral of Absolute Error (IAE) and Total Variation (TV).
Keywords: modern dynamic systems, fractional order controllers, maximum-sensitivity, IMC-PID controllers, Smith predictor, IAE and TV
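The fractional derivatives and integrals mentioned above are commonly discretized with the Grünwald-Letnikov formula; the sketch below shows that standard approximation only, not the authors' IMC-PID design, whose details the abstract does not give.

```python
def gl_fractional_derivative(f_samples, alpha, h):
    """Grünwald-Letnikov approximation of the order-alpha derivative at the
    last sample of f_samples, taken on a uniform grid with step h:

        D^alpha f(t_N) ~ h**(-alpha) * sum_{k=0..N} c_k * f(t_N - k*h),

    where c_0 = 1 and c_k = c_{k-1} * (1 - (alpha + 1)/k).
    """
    c, total = 1.0, f_samples[-1]
    for k in range(1, len(f_samples)):
        c *= 1.0 - (alpha + 1.0) / k
        total += c * f_samples[-1 - k]
    return total / h ** alpha

# Sanity check: for alpha = 1 the weights reduce to a backward difference,
# so the derivative of f(t) = t should be 1.
h = 1e-3
samples = [i * h for i in range(1001)]  # f(t) = t on [0, 1]
print(round(gl_fractional_derivative(samples, 1.0, h), 6))  # 1.0
```

For non-integer alpha the weights never vanish, which is exactly the "memory" property the abstract refers to: for example, the half derivative of f(t) = t at t = 1 comes out close to the analytical value 2/sqrt(pi) ≈ 1.128.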
Procedia PDF Downloads 63
11932 Studying Language of Immediacy and Language of Distance from a Corpus Linguistic Perspective: A Pilot Study of Evaluation Markers in French Television Weather Reports
Authors: Vince Liégeois
Abstract:
Language of immediacy and distance: Within their discourse theory, Koch & Oesterreicher establish a distinction between a language of immediacy and a language of distance. The former refers to discourses oriented more towards a spoken norm, whereas the latter entails discourses oriented towards a written norm, regardless of whether they are realised phonically or graphically. This means that an utterance can be realised phonically but oriented more towards the written language norm (e.g., a scientific presentation or a eulogy), or realised graphically but oriented towards a spoken norm (e.g., a scribble or chat messages). Research desiderata: The Koch & Oesterreicher approach has often been criticised for not providing a corpus-linguistic methodology, which makes it difficult to work with quantitative data or to address large text collections within this research paradigm. Consequently, the approach has had difficulty gaining ground in those research areas which rely more on corpus-linguistic research models, such as text linguistics and LSP research. A combinatory approach: Accordingly, we want to establish a combinatory approach with corpus-based linguistic methodology. To this end, we propose to (i) include data about the context of an utterance (e.g., monologicity/dialogicity, familiarity with the speaker) – the “conditions of communication” in the original work of Koch & Oesterreicher – and (ii) correlate the linguistic phenomenon at the centre of the inquiry (e.g., evaluation markers) with a group of linguistic phenomena deemed typical of either distance- or immediacy-language. Based on these two parameters, linguistic phenomena and texts can then be mapped onto an immediacy-distance continuum.
Pilot study: To illustrate the benefits of this approach, we conduct a pilot study on evaluation phenomena in French television weather reports, a form of domain-sensitive discourse that has often been cited as an example of a “text genre”. Within this text genre, we look at so-called “evaluation markers”, e.g., fixed strings like bad weather, stifling hot, and “no luck today!”. These evaluation markers help to communicate the coming weather situation to a lay audience but have not yet been studied within the Koch & Oesterreicher research paradigm. Accordingly, we want to find out whether these evaluation markers are more typical of those weather reports which tend towards immediacy-language or of those which tend towards distance-language. To this end, we collected a corpus of different kinds of television weather reports, e.g., as part of the news broadcast, including dialogue. The evaluation markers themselves are studied according to the methodology explained above, by correlating them with (i) metadata about the context and (ii) linguistic phenomena characterising immediacy-language: repetition, deixis (personal, spatial and temporal), a freer choice of tense, and right-/left-dislocation. Results: Our results indicate that evaluation markers are more prominently present in those weather reports tending towards immediacy-language. Based on the methodology established above, we have gained more insight into the workings of evaluation markers in the domain-sensitive text genre of (television) weather reports. For future research, it will be interesting to determine whether these evaluation markers are also typical of immediacy-language-oriented texts in other domain-sensitive discourses.
Keywords: corpus-based linguistics, evaluation markers, language of immediacy and distance, weather reports
Procedia PDF Downloads 217
11931 An Ant Colony Optimization Approach for the Pollution Routing Problem
Authors: P. Parthiban, Sonu Rajak, N. Kannan, R. Dhanalakshmi
Abstract:
This paper deals with the Vehicle Routing Problem (VRP) with environmental considerations, known as the Pollution Routing Problem (PRP). The objective is to minimize operational and environmental costs. It consists of routing a number of vehicles to serve a set of customers and determining the fuel consumption, driver wages and vehicle speed on each route segment, while respecting capacity constraints and time windows. In this context, we present an Ant Colony Optimization (ACO) approach, combined with a Speed Optimization Algorithm (SOA), to solve the PRP. The proposed solution method consists of two stages. Stage one solves a Vehicle Routing Problem with Time Windows (VRPTW) using ACO, and in the second stage the SOA is run on the resulting VRPTW solutions. Given a vehicle route, the SOA finds the optimal speed on each arc of the route in order to minimize an objective function comprising fuel consumption costs and driver wages. The proposed algorithm was tested on benchmark problems; the preliminary results show that it is able to provide good solutions.
Keywords: ant colony optimization, CO2 emissions, combinatorial optimization, speed optimization, vehicle routing
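The speed-optimization stage trades fuel cost (which grows with speed) against driver wages (which grow with travel time). The sketch below illustrates that trade-off on a single arc with an entirely made-up cost model; the coefficients, the quadratic fuel term and the exhaustive scan are illustrative assumptions, not the paper's SOA.

```python
def arc_cost(v, dist, fuel_coeff=1e-5, fuel_price=1.4, wage=10.0):
    """Cost of traversing one arc of length `dist` km at constant speed v km/h.

    Fuel burn is modelled as growing with v**2 (drag-dominated, a toy
    assumption); wage cost is proportional to travel time.
    """
    fuel_litres = fuel_coeff * v ** 2 * dist
    hours = dist / v
    return fuel_litres * fuel_price + wage * hours

def best_speed(dist, v_min=40, v_max=110):
    """Scan the feasible integer speeds and return the cheapest one."""
    return min(range(v_min, v_max + 1), key=lambda v: arc_cost(v, dist))

# An interior optimum emerges: fast enough to cut wage hours,
# slow enough to limit fuel burn.
print(best_speed(100.0))
```

Because the two cost terms pull in opposite directions, the cheapest speed sits strictly between the legal limits for these coefficients, which is the effect the SOA exploits on every arc of a route.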
Procedia PDF Downloads 321
11930 Fossil Health: Causes and Consequences of Hegemonic Health Paradigms
Authors: Laila Vivas
Abstract:
Fossil Health is proposed as a value-concept to describe the hegemonic health paradigms that underpin health enactment. This representation is justified by Foucauldian and related ideas on biopower and biosocialities, calling for the politicization of health and signalling the importance of narratives. This approach, hence, enables contemplating health paradigms as reflexive or co-constitutive of health itself or, in other words, conceiving health as a verb. Fossil Health is a symbolic representation, influenced by Andreas Malm’s concept of fossil capitalism, that integrates environment and health as non-dichotomous areas. Fossil Health holds that current notions of human and non-human health revolve around fossil fuel dependencies, and that addressing departures from established health ideals involves fossil-fixes. Fossil Health therefore represents the causes and consequences of a health conception that has the agency to contribute to the functioning of a particular structural eco-social model. Moreover, within current capitalist relations, Fossil Health expands its meaning to cover not only fossil implications but also other dominant paradigms of the capitalist system that are (re)produced through health paradigms, such as the burgeoning of technoscience and biomedicalization, the privatization of health, the expertization of health, and the imposition of standards of uniformity. Overall, Fossil Health is a comprehensive approach to environment and health, in which understanding hegemonic health paradigms means understanding our (human-non-human) nature paradigms and the structuring effect these narratives convey.
Keywords: fossil health, environment, paradigm, capitalism
Procedia PDF Downloads 119
11929 Prediction of Disability-Adjustment Mental Illness Using Machine Learning
Authors: S. R. M. Krishna, R. Santosh Kumar, V. Kamakshi Prasad
Abstract:
Machine learning techniques are applied to analyze the impact of mental illness on the burden of disease, which is calculated using the disability-adjusted life year (DALY). The DALYs for a disease are the sum of the years of life lost due to premature mortality (YLL) and the years of healthy life lost due to disability (YLD): DALY = YLL + YLD. The critical analysis is based on the data sources, machine learning techniques and feature extraction methods. The review draws on major databases. The extracted data were examined using statistical analysis, and machine learning techniques were applied. Predicting the impact of mental illness on the population using machine learning techniques is an alternative to traditional strategies, which are time-consuming and may not be reliable. The approach calls for comprehensive adoption, innovative algorithms, and an understanding of the limitations and challenges. The resulting prediction is a way of understanding the underlying impact of mental illness on the health of the population, and it enables us to estimate healthy life expectancy. The growing impact of mental illness, and the challenges associated with detecting and treating mental disorders, make it necessary to understand its full effect on the majority of the population.
Procedia PDF Downloads 34
11928 Second Order Cone Optimization Approach to Two-stage Network DEA
Authors: K. Asanimoghadam, M. Salahi, A. Jamalian
Abstract:
Data envelopment analysis is an approach to measuring the efficiency of decision-making units with multiple inputs and outputs. Many decision-making units also contain decision-making subunits that are not considered in most data envelopment analysis models. Moreover, the inputs and outputs of decision-making units are usually considered desirable, while in some real-world problems the nature of some inputs or outputs is undesirable. In this study, we evaluate the efficiency of two-stage decision-making units where some outputs are undesirable, using two non-radial models, the SBM and ASBM models. We formulate the nonlinear ASBM model as a second order cone optimization problem. Finally, we compare the two models under both external and internal evaluation approaches on two real-world examples in the presence of undesirable outputs. The results show that, in both external and internal evaluations, the overall efficiency of the ASBM model is greater than or equal to that of the SBM model, and that in internal evaluation the ASBM model is more flexible than the SBM model.
Keywords: network DEA, conic optimization, undesirable output, SBM
Procedia PDF Downloads 193
11927 A Weighted Group EI Incorporating Role Information for More Representative Group EI Measurement
Authors: Siyu Wang, Anthony Ward
Abstract:
Emotional intelligence (EI) is a well-established personal characteristic. It has been viewed as a critical factor that can influence an individual's academic achievement, ability to work and potential to succeed. When working in a group, EI is fundamentally connected to the group members' interaction and ability to work as a team. The ability of a group member to intelligently perceive and understand their own emotions (intrapersonal EI), to intelligently perceive and understand other members' emotions (interpersonal EI), and to intelligently perceive and understand emotions between different groups (cross-boundary EI) can be considered group emotional intelligence (Group EI). In this research, a more representative Group EI measurement approach is proposed, which incorporates information on the composition of a group and an individual's role in that group. To demonstrate that this approach is indeed more representative, the study adopts a multi-method research design, combining qualitative and quantitative techniques to establish a metric of Group EI. From the results, it can be concluded that by introducing a weight coefficient for each group member's contribution to group work into the measurement of Group EI, the measure becomes more representative and more capable of capturing what happens during teamwork than previous approaches.
Keywords: case study, emotional intelligence, group EI, multi-method research
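The core idea of weighting each member's EI score by a role-based coefficient can be sketched as follows; the weights and scores here are invented for illustration and do not reflect the study's actual metric:

```python
# Hypothetical sketch of a role-weighted Group EI score: each member's EI
# score is weighted by a coefficient reflecting their role's share of the
# group's work. All numbers below are invented for illustration.

def weighted_group_ei(members):
    """members: list of (ei_score, role_weight) pairs; weights need not sum to 1."""
    total_w = sum(w for _, w in members)
    return sum(ei * w for ei, w in members) / total_w

team = [(85, 0.5),   # leader: larger share of the group work
        (70, 0.3),   # coordinator
        (60, 0.2)]   # contributor
unweighted = sum(ei for ei, _ in team) / len(team)   # plain mean: ~71.67
weighted = weighted_group_ei(team)                   # role-weighted: 75.5
```

The weighted score shifts towards the members carrying more of the group's work, which is the sense in which the abstract calls the measure "more representative".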
Procedia PDF Downloads 121
11926 An Interpretive Study of Entrepreneurial Experience towards Achieving Business Growth Using the Theory of Planned Behaviour as a Lens
Authors: Akunna Agunwah, Kevin Gallimore, Kathryn Kinmond
Abstract:
Entrepreneurship is widely seen as a vehicle for economic growth; however, scholars have studied entrepreneurship from various perspectives, resulting in multiple definitions. Surprisingly, most definitions of entrepreneurship do not incorporate growth, even though economic growth is driven by the activities of entrepreneurs. The purpose of the present interpretive study is to explore the working practices of successful entrepreneurs towards achieving business growth by understanding their experiences through the lens of the Theory of Planned Behaviour (TPB). Ten successful entrepreneurs in the North West of England, across various business sectors, were interviewed using a semi-structured interview method. The recorded audio interviews were transcribed and subsequently evaluated using a thematic deductive technique (a qualitative approach). The themes were examined using the Theory of Planned Behaviour to ascertain the presence of its three intentional antecedents (attitude, subjective norms, and perceived behavioural control). The findings are twofold. Firstly, the three intentional antecedents that make up the Theory of Planned Behaviour were evident in the transcripts. Secondly, the entrepreneurs are most concerned with achieving a state of freedom and realising their visions and ambitions; nevertheless, they employ these intentional antecedents to enhance business growth. In conclusion, the work presented here shows a novel way of understanding the working practices and experiences of entrepreneurs, applying the Theory of Planned Behaviour in a qualitative approach to business growth; few such qualitative studies exist in entrepreneurship research.
In addition, this work takes a novel approach to studying entrepreneurial experience by examining the working practices of successful entrepreneurs in North-West England through the lens of the Theory of Planned Behaviour. Regarding TPB as a lens, the findings suggest that the entrepreneur does not differentiate between the categories of antecedents but rather sees them as processes that can be utilised to enhance business growth.
Keywords: business growth, experience, interpretive, theory of planned behaviour
Procedia PDF Downloads 213
11925 Lean Thinking and E-Commerce as New Opportunities to Improve Partnership in Supply Chain of Construction Industries
Authors: Kaustav Kundu, Alberto Portioli Staudacher
Abstract:
The construction industry plays a vital role in the world economy. However, owing to high uncertainty and variability, its performance is not as efficient as that of other industries in terms of quality, lead times, productivity and costs. Moreover, there are continuous conflicts among the different actors in construction supply chains over profit sharing. Previous studies have suggested partnership as an important approach to promoting cooperation among the different actors in construction supply chains, thereby improving overall performance. Construction practitioners have tried to focus on partnership as a way to enhance the performance of construction supply chains, but they are not fully aware of the different approaches and techniques for improving it. In this research, a systematic review of partnership in relation to construction supply chains is carried out to understand the different elements influencing partnership. The development of this research domain is analysed by reviewing selected articles published from 1996 to 2015. Based on these papers, three major elements influencing partnership in construction supply chains are identified: the lean approach, relationship building, and e-commerce applications. This study analyses the contributions in the areas within each element and provides suggestions for future developments of partnership in construction supply chains.
Keywords: partnership, construction, lean, SCM, supply chain management
Procedia PDF Downloads 433
11924 The Reality of Engineering Education in the Kingdom of Saudi Arabia and Its Suitability to the Requirements of the Labor Market
Authors: Hamad Albadr
Abstract:
Universities have shifted from a purely cognitive mission of preserving the community's culture to a responsibility for preparing graduates to work according to the needs of community development. In today's world, universities are the prime mover of development in the community, finding appropriate solutions to the problems it faces and adapting to the demands of a changing environment. This paper reviews the reality of engineering education in the Kingdom of Saudi Arabia and its suitability to the requirements of the labor market. The university is regarded as an educational management system and analysed using the System Analysis Approach, one of the modern management methods for analysing the performance of organisations and institutions and for administrative and quality assessment. Under this approach, the system is treated as a set of subsystems divided into inputs, processes, outputs and the surrounding environment. Descriptive and analytical research methods are used to gather information and data and to analyse the answers of the study population, which consists of a random sample of about 500 professionals concerned with employment in the business sector who benefit from the services the universities provide.
Keywords: universities in Saudi Arabia, engineering education, labor market, administrative, quality assessment
Procedia PDF Downloads 339
11923 About Multi-Resolution Techniques for Large Eddy Simulation of Reactive Multi-Phase Flows
Authors: Giacomo Rossi, Bernardo Favini, Eugenio Giacomazzi, Franca Rita Picchia, Nunzio Maria Salvatore Arcidiacono
Abstract:
A numerical technique for mesh refinement in the HeaRT (Heat Release and Transfer) numerical code is presented. In the CFD framework, the Large Eddy Simulation (LES) approach is gaining importance as a tool for simulating turbulent combustion processes, although it has a high computational cost due to the complexity of the turbulence modelling and the large number of grid points necessary to obtain a good numerical solution. In particular, when a numerical simulation of a large domain is performed on a structured grid, the number of grid points can grow until the simulation becomes infeasible; this problem can be overcome with a mesh refinement technique. The mesh refinement technique developed for the HeaRT numerical code (a staggered finite-difference code) is based on a high-order reconstruction of the variables at the grid interfaces by means of a least-squares quasi-ENO interpolation. The code is written in modern Fortran (the 2003 standard or newer) and is parallelised using domain decomposition and the message passing interface (MPI) standard.
Keywords: LES, multi-resolution, ENO, fortran
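The idea of least-squares reconstruction at a grid interface can be sketched in one dimension: fit a polynomial through neighbouring cell-centre values and evaluate it at the interface. This is only a minimal illustration of the principle, not the quasi-ENO scheme of the HeaRT code (which is a parallel Fortran implementation):

```python
# Minimal 1D illustration of least-squares reconstruction at a grid
# interface: fit a quadratic to nearby cell-centre values and evaluate
# it at the interface position. A sketch of the idea only, not the
# quasi-ENO interpolation actually used in HeaRT.
import numpy as np

def reconstruct_at_interface(x_centres, values, x_face, degree=2):
    """Least-squares polynomial fit of cell-centre values, evaluated at x_face."""
    coeffs = np.polyfit(x_centres, values, degree)
    return np.polyval(coeffs, x_face)

# Cell centres around the interface x = 0.5 (non-uniform spacing allowed)
xc = np.array([-0.5, 0.0, 1.0, 1.5])
u = xc ** 2                                      # smooth test field u(x) = x^2
u_face = reconstruct_at_interface(xc, u, 0.5)    # exact for quadratics: 0.25
```

Because the fit is least-squares over more points than the polynomial degree requires, the stencil can be biased away from discontinuities, which is the essential ingredient of ENO-type reconstructions.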
Procedia PDF Downloads 364
11922 Ensemble of Deep CNN Architecture for Classifying the Source and Quality of Teff Cereal
Authors: Belayneh Matebie, Michael Melese
Abstract:
The study focuses on addressing the challenges in classifying and ensuring the quality of Eragrostis teff, a small, round grain that is the smallest of the cereal grains. Traditional classification methods are difficult to employ because of the grain's small size and the similarity of its environmental characteristics. To overcome this, the study employs a machine learning approach to develop a source and quality classification system for teff cereal. Data were collected from various production areas in the Amhara region, considering two quality levels (high and low) across eight classes. A total of 5,920 images were collected, 740 for each class. Image enhancement techniques, including scaling, data augmentation, histogram equalization, and noise removal, were applied to preprocess the data. A Convolutional Neural Network (CNN) was then used to extract relevant features and reduce dimensionality. The dataset was split into 80% for training and 20% for testing. Different classifiers, including FVGG16, FINCV3, QSCTC, EMQSCTC, SVM, and RF, were employed for classification, achieving accuracy rates ranging from 86.91% to 97.72%. An ensemble of FVGG16, FINCV3, and QSCTC using the max-voting approach outperforms the individual algorithms.
Keywords: Teff, ensemble learning, max-voting, CNN, SVM, RF
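Max-voting simply takes, for each sample, the class label predicted by the majority of the models. A generic sketch (the model outputs and class names below are placeholders, not real predictions from the study):

```python
# Generic max-voting (majority vote) over per-model class predictions, as
# used to combine the three CNN classifiers in the abstract. The predicted
# labels below are invented placeholders.
from collections import Counter

def max_vote(predictions_per_model):
    """predictions_per_model: one list of per-sample labels per model."""
    ensemble = []
    for sample_preds in zip(*predictions_per_model):
        ensemble.append(Counter(sample_preds).most_common(1)[0][0])
    return ensemble

fvgg16 = ["high_q1", "low_q3", "high_q2"]
fincv3 = ["high_q1", "low_q4", "high_q2"]
qsctc  = ["high_q1", "low_q3", "low_q1"]
final = max_vote([fvgg16, fincv3, qsctc])   # ['high_q1', 'low_q3', 'high_q2']
```

Each ensemble prediction is at least as reliable as the weakest voter on samples where two of the three models agree, which is why the ensemble can outperform its individual members.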
Procedia PDF Downloads 52
11921 Modelling Causal Effects from Complex Longitudinal Data via Point Effects of Treatments
Authors: Xiaoqin Wang, Li Yin
Abstract:
Background and purpose: In many practices, one estimates causal effects arising from a complex stochastic process in which a sequence of treatments is assigned to influence a certain outcome of interest and time-dependent covariates exist between treatments. When covariates are plentiful and/or continuous, statistical modelling is needed to reduce the huge dimensionality of the problem and allow for the estimation of causal effects. Recently, Wang and Yin (Annals of Statistics, 2020) derived a new general formula that expresses these causal effects in terms of the point effects of treatments in single-point causal inference. As a result, it is possible to conduct the modelling via point effects. The purpose of this work is to study the modelling of these causal effects via point effects. Challenges and solutions: The time-dependent covariates are often influenced by earlier treatments and in turn influence subsequent treatments. Consequently, the standard parameters, i.e., the means of the outcome given all treatments and covariates, are essentially all different (the null paradox). Furthermore, the dimension of the parameters is huge (the curse of dimensionality). Therefore, it can be difficult to conduct the modelling in terms of standard parameters. Instead, we use the point effects of treatments to develop a likelihood-based parametric approach to the modelling of these causal effects, and we are able to model the causal effects of a sequence of treatments by modelling a small number of point effects of individual treatments. Achievements: We are able to conduct the modelling of the causal effects of a sequence of treatments in the familiar framework of single-point causal inference. Simulation shows that our method achieves not only an unbiased estimate of the causal effect but also the nominal level of type I error and a low level of type II error in hypothesis testing.
We have applied this method to a longitudinal study of COVID-19 mortality among the Scandinavian countries and found that the Swedish approach performed far worse than the other countries' approaches, with the poor performance largely due to its early measures during the initial period of the pandemic.
Keywords: causal effect, point effect, statistical modelling, sequential causal inference
Procedia PDF Downloads 205
11920 Integrated Approach of Quality Function Deployment, Sensitivity Analysis and Multi-Objective Linear Programming for Business and Supply Chain Programs Selection
Authors: T. T. Tham
Abstract:
The aim of this study is to propose an integrated approach to determining the most suitable programs, based on Quality Function Deployment (QFD), Sensitivity Analysis (SA) and a Multi-Objective Linear Programming (MOLP) model. First, QFD is used to determine business requirements and transform them into business and supply chain programs; technical scores for all programs are obtained from the QFD. All programs are then evaluated against five criteria (productivity, quality, cost, technical score, and feasibility). Sets of weights for these criteria are built using Sensitivity Analysis. The Multi-Objective Linear Programming model is applied to select suitable programs according to multiple conflicting objectives under a budget constraint. A case study from the Sai Gon-Mien Tay Beer Company is given to illustrate the proposed methodology. The outcome of the study provides a comprehensive picture for companies seeking to select suitable programs and obtain the optimal solution according to their preferences.
Keywords: business program, multi-objective linear programming model, quality function deployment, sensitivity analysis, supply chain management
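The selection step can be illustrated with a much-simplified stand-in for the MOLP model: a single weighted score per program, maximised over subsets under a budget constraint by brute force. The program names, scores and costs below are invented, and a real MOLP would handle several objectives with an LP solver rather than enumeration:

```python
# Simplified stand-in for the program-selection step: instead of the full
# multi-objective linear programme, pick the subset of programs maximising
# one aggregated weighted score under a budget constraint, by brute force.
# All program names, scores and costs are hypothetical.
from itertools import combinations

programs = {        # name: (aggregated weighted score, cost)
    "supplier_dev":   (8.0, 40.0),
    "quality_circle": (6.0, 25.0),
    "e_procurement":  (5.0, 30.0),
    "lean_training":  (4.0, 15.0),
}

def best_selection(programs, budget):
    names = list(programs)
    best, best_score = (), 0.0
    for r in range(1, len(names) + 1):
        for subset in combinations(names, r):
            cost = sum(programs[p][1] for p in subset)
            score = sum(programs[p][0] for p in subset)
            if cost <= budget and score > best_score:
                best, best_score = subset, score
    return set(best), best_score

chosen, score = best_selection(programs, budget=80.0)
# -> {'supplier_dev', 'quality_circle', 'lean_training'}, score 18.0
```

Brute force is exponential in the number of programs; for realistic problem sizes this is exactly why the study formulates the selection as a linear programme instead.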
Procedia PDF Downloads 122
11919 A TFETI Domain Decomposition Solver for von Mises Elastoplasticity Model with Combination of Linear Isotropic-Kinematic Hardening
Authors: Martin Cermak, Stanislav Sysala
Abstract:
In this paper we present an efficient parallel implementation of elastoplastic problems based on the TFETI (Total Finite Element Tearing and Interconnecting) domain decomposition method. This approach allows us to solve the nonlinear problem in parallel on supercomputers, decreasing the solution time and enabling problems with millions of DOFs. We consider an associated elastoplastic model with the von Mises plastic criterion and a combination of linear isotropic and kinematic hardening laws. The model is discretised by the implicit Euler method in time and by the finite element method in space, which yields a system of nonlinear equations with a strongly semismooth and strongly monotone operator. The semismooth Newton method is applied to solve this nonlinear system, and the linearised problems arising in the Newton iterations are solved in parallel by the above-mentioned TFETI method. The implementation is realised in our in-house MatSol package developed in MATLAB.
Keywords: isotropic-kinematic hardening, TFETI, domain decomposition, parallel solution
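The local ingredient of such a solver, the implicit-Euler stress update with combined hardening, can be sketched in one dimension, where the return mapping has a closed form. This is only the pointwise stress update for a 1D analogue with hypothetical material constants, not the 3D von Mises model, the semismooth Newton solver or the TFETI machinery of the paper:

```python
# 1D sketch of the implicit-Euler return-mapping step for elastoplasticity
# with combined linear isotropic and kinematic hardening. Material constants
# are hypothetical; the paper's solver works on the full 3D von Mises model.

def return_map(eps, state, E=210e3, H_iso=10e3, H_kin=5e3, sigma_y=250.0):
    """One strain-driven update. state = (plastic strain, back stress, alpha)."""
    eps_p, back, alpha = state
    sigma_tr = E * (eps - eps_p)                 # elastic trial stress
    xi = sigma_tr - back                         # relative (shifted) stress
    f = abs(xi) - (sigma_y + H_iso * alpha)      # yield function at trial state
    if f <= 0.0:                                 # elastic step: accept trial
        return sigma_tr, (eps_p, back, alpha)
    n = 1.0 if xi > 0 else -1.0                  # flow direction
    dgamma = f / (E + H_iso + H_kin)             # plastic multiplier (1D closed form)
    sigma = sigma_tr - E * dgamma * n            # plastic corrector
    return sigma, (eps_p + dgamma * n, back + H_kin * dgamma * n, alpha + dgamma)

state = (0.0, 0.0, 0.0)
sigma_el, state = return_map(0.001, state)   # below yield: purely elastic
sigma_pl, state = return_map(0.002, state)   # beyond yield: stress returned to surface
```

After a plastic step the updated stress satisfies the hardened yield condition |sigma - back| = sigma_y + H_iso * alpha, which is the consistency condition the implicit Euler discretisation enforces.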
Procedia PDF Downloads 419
11918 Coupling Static Multiple Light Scattering Technique With the Hansen Approach to Optimize Dispersibility and Stability of Particle Dispersions
Authors: Guillaume Lemahieu, Matthias Sentis, Giovanni Brambilla, Gérard Meunier
Abstract:
Static Multiple Light Scattering (SMLS) has been shown to be a straightforward technique for the characterisation of colloidal dispersions without dilution, as multiply scattered light in backscattered and transmitted modes is directly related to the concentration and size of the scatterers present in the sample. Accordingly, the use of SMLS for stability measurement of various dispersion types has already been widely described in the literature: starting from a homogeneous dispersion, the variation of backscattered or transmitted light can be attributed to destabilisation phenomena such as migration (sedimentation, creaming) or particle size variation (flocculation, aggregation). With a view to further investigating the dispersibility of colloidal suspensions, an experimental set-up for an "at the line" SMLS experiment has been developed to understand the impact of formulation parameters on particle size and dispersibility. The SMLS experiment is performed at a high acquisition rate (up to 10 measurements per second), without dilution, and under direct agitation. Using such an experimental device, SMLS detection can be combined with the Hansen approach to optimise the dispersing and stabilising properties of TiO₂ particles. It appears that the dispersibility and stability spheres generated are clearly separated, arguing that lower stability is not necessarily a consequence of poor dispersibility. Beyond this clarification, the combined SMLS-Hansen approach is a major step towards optimising the dispersibility and stability of colloidal formulations by finding solvents that offer the best compromise between dispersing and stabilising properties.
Such a study can help identify better dispersion media and greener, cheaper solvents to optimise particle suspensions, reduce the content of costly stabilising additives, or satisfy evolving product regulatory requirements in the various industrial fields that use suspensions (paints and inks, coatings, cosmetics, energy).
Keywords: dispersibility, stability, Hansen parameters, particles, solvents
Procedia PDF Downloads 107
11917 A Dynamical Approach for Relating Energy Consumption to Hybrid Inventory Level in the Supply Chain
Authors: Benga Ebouele, Thomas Tengen
Abstract:
Due to long lead times, work-in-process (WIP) inventory can build up within the supply chain of most manufacturing systems. This implies that there are fewer finished goods on hand and more goods in process, because the work remains in the factory too long and cannot be sold to customers. The supply chain of such a manufacturing system is then considered inefficient, as it takes so much time to produce the finished goods. The time consumed in each operation of the supply chain has an associated energy cost. This phenomenon can be harmful for a hybrid inventory system, because considerable space may be needed to store the semi-finished goods, and the final energy cost of producing, holding and delivering the goods to customers is uncertain. A principle that reduces the waste of energy within the supply chain of a manufacturing firm should therefore be available to all inventory managers in pursuit of profitability. Decision-making by inventory managers in this setting is a modelling process, whereby a dynamical approach is used to depict, examine, specify and operationalise the relationship between energy consumption and hybrid inventory level. This relationship is established here and indicates a poor level of control, and hence a potential for energy savings.
Keywords: dynamic modelling, energy used, hybrid inventory, supply chain
Procedia PDF Downloads 265
11916 Active Contours for Image Segmentation Based on Complex Domain Approach
Authors: Sajid Hussain
Abstract:
A complex domain approach for image segmentation based on active contours has been designed, in which the contour deforms step by step to partition an image into expedient regions. A novel region-based trigonometric complex pressure force function is proposed, which propagates around the region of interest using image forces. The signed trigonometric force function controls the propagation of the active contour, which stops accurately on the exact edges of the object. The proposed model makes the level set function binary and uses a Gaussian smoothing kernel to regularise it, thereby avoiding the re-initialisation procedure. The working principle of the proposed model is as follows: the real image data are transformed into complex data by multiplying them by iota (i), and the average of i times the horizontal and vertical components of the image gradient is inserted into the model to capture the complex gradient of the image data. A simple finite-difference technique has been used to implement the model. The efficiency and robustness of the proposed model have been verified and compared with other state-of-the-art models.
Keywords: image segmentation, active contour, level set, Mumford and Shah model
Procedia PDF Downloads 112
11915 High Performance Field Programmable Gate Array-Based Stochastic Low-Density Parity-Check Decoder Design for IEEE 802.3an Standard
Authors: Ghania Zerari, Abderrezak Guessoum, Rachid Beguenane
Abstract:
This paper introduces a high-performance architecture for a fully parallel stochastic Low-Density Parity-Check (LDPC) decoder on a field programmable gate array (FPGA). The new approach is designed to decrease the decoding latency and to reduce the FPGA logic utilisation. To accomplish the targeted reduction in logic utilisation, the routing of the proposed sub-variable node (VN) internal memory is designed to use one slice of distributed RAM. Furthermore, the VNs are initialised using the channel input probability, which enhances decoder convergence without extra resources and without integrating output saturated counters. The Xilinx FPGA implementation of the IEEE 802.3an standard LDPC code shows that the proposed decoding approach attains high performance along with a reduction in FPGA logic utilisation.
Keywords: low-density parity-check (LDPC) decoder, stochastic decoding, field programmable gate array (FPGA), IEEE 802.3an standard
Procedia PDF Downloads 295