Search results for: information seeking models
16155 A Comparative Analysis of Innovation Maturity Models: Towards the Development of a Technology Management Maturity Model
Authors: Nikolett Deutsch, Éva Pintér, Péter Bagó, Miklós Hetényi
Abstract:
Strategic technology management has emerged and evolved in parallel with strategic management paradigms. It focuses on the opportunity for organizations, operating mainly in technology-intensive industries, to explore and exploit the technological capabilities upon which competitive advantage can be built. As strategic technology management involves multiple functions within an organization, requires broad and diversified knowledge, and must be developed and implemented in line with business objectives to enable a firm's profitability and growth, excellence in strategic technology management provides unique opportunities for organizations to build a successful future. Accordingly, a framework supporting the evaluation of the technological readiness level of management can significantly contribute to organizational competitiveness through a better understanding of strategic-level capabilities and operational deficiencies. In the last decade, several innovation maturity assessment models have appeared and become established management tools that can serve as references for future practical approaches, expected to be used by corporate leaders, strategists, and technology managers to understand and manage technological capabilities and capacities. The aim of this paper is to provide a comprehensive review of state-of-the-art innovation maturity frameworks, to investigate the critical lessons learned from their application, to identify the similarities and differences among the models, and to identify the main aspects and elements valid for the field and the critical functions of technology management. To this end, a systematic literature review was carried out, considering the relevant papers and articles published in highly ranked international journals on the 27 most widely known innovation maturity models, drawn from four relevant digital sources.
Key findings suggest that despite the diversity of the given models, there is still room for improvement regarding the common understanding of innovation typologies, the full coverage of innovation capabilities, and the generalist approach to the validation and practical applicability of the structure and content of the models. Furthermore, the paper proposes an initial structure based on the maturity assessment of the technological capacities and capabilities covered by strategic technology management, i.e., technology identification, technology selection, technology acquisition, technology exploitation, and technology protection.
Keywords: innovation capabilities, innovation maturity models, technology audit, technology management, technology management maturity models
Procedia PDF Downloads 61
16154 Comparing Spontaneous Hydrolysis Rates of Activated Models of DNA and RNA
Authors: Mohamed S. Sasi, Adel M. Mlitan, Abdulfattah M. Alkherraz
Abstract:
This research project aims to investigate the difference in relative rates of phosphoryl transfer relevant to the biological catalysis of DNA and RNA in pH-independent reactions. Activated models of DNA and RNA, alkyl-aryl phosphate diesters (with 4-nitrophenyl as a good leaving group), have successfully been prepared to gather kinetic parameters. Eyring plots for the pH-independent hydrolysis of 1 and 2 were established at different temperatures in the range 100–160 °C. These measurements provide a better estimate of the difference in relative rates between DNA and RNA cleavage. The Eyring plots gave an extrapolated rate of kH2O = 1 × 10⁻¹⁰ s⁻¹ for both 1 (RNA model) and 2 (DNA model) at 25 °C. Comparing the reactivity of the RNA and DNA models shows that their relative reactivity in the pH-independent reactions is surprisingly similar at 25 °C. This allows us to obtain chemical insights into how biological catalysts such as enzymes may have evolved to perform their current functions.
Keywords: DNA and RNA models, relative rates, reactivity, phosphoryl transfer
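The Eyring extrapolation the abstract describes can be sketched numerically. The block below is a minimal illustration, not the study's data: the activation parameters `dH` and `dS` are invented so the extrapolated rate lands in the very slow regime reported, and the fit is an ordinary least-squares line through ln(k/T) versus 1/T over the 100–160 °C window, extrapolated down to 25 °C.

```python
import math

kB, h, R = 1.380649e-23, 6.62607015e-34, 8.314462618  # SI constants

def eyring_k(T, dH, dS):
    """Eyring rate constant; dH in J/mol, dS in J/(mol K), T in kelvin."""
    return (kB * T / h) * math.exp(dS / R - dH / (R * T))

# Hypothetical activation parameters, chosen only to land near the
# ~1e-10 s^-1 regime the abstract reports at 25 C.
dH, dS = 1.05e5, -80.0
temps = [373.15, 393.15, 413.15, 433.15]  # the 100-160 C measurement window

# Eyring plot: ln(k/T) is linear in 1/T
xs = [1.0 / T for T in temps]
ys = [math.log(eyring_k(T, dH, dS) / T) for T in temps]

# Ordinary least-squares slope and intercept
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
intercept = ybar - slope * xbar

# Extrapolate the fitted line down to 25 C (298.15 K)
T25 = 298.15
k25 = T25 * math.exp(intercept + slope / T25)
print(f"extrapolated k(25 C) = {k25:.2e} s^-1")
```

With exact (noise-free) synthetic points the fitted line reproduces the generating parameters, so the extrapolated rate matches the analytic Eyring value at 25 °C.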
Procedia PDF Downloads 423
16153 Block N LVI from the Northern Side of the Parthenon Frieze: A Case Study of Augmented Reality for Museum Application
Authors: Donato Maniello, Alessandra Cirafici, Valeria Amoretti
Abstract:
This paper aims to present a new method based on video mapping techniques, a particular form of augmented reality, which could produce new tools, different from those actually in use, for an interactive museum experience. By 'augmented reality' we mean the addition of more information than the visitor would normally perceive; this information is mediated by the use of a computer and a projector. The proposed application involves the creation of a documentary that depicts and explains the history of the artifact and illustrates its features; this is projected onto the surface of a faithful copy of the frieze (obtained at full scale with a 3D printer). This mode of operation uses different techniques that allow passing from the creation of the model to the creation of contents, through an accurate historical and artistic analysis, and finally to the warping phase, which permits overlapping the real and virtual models. The final step, which is still being studied, includes the creation of interactive contents activated by visitors through appropriate motion sensors.
Keywords: augmented reality, multimedia, parthenon frieze, video mapping
Procedia PDF Downloads 387
16152 Quantum Kernel Based Regressor for Prediction of Non-Markovianity of Open Quantum Systems
Authors: Diego Tancara, Raul Coto, Ariel Norambuena, Hoseein T. Dinani, Felipe Fanchini
Abstract:
Quantum machine learning is a growing research field that aims to perform machine learning tasks assisted by a quantum computer. Kernel-based quantum machine learning models are paradigmatic examples where the kernel involves quantum states and the Gram matrix is calculated from the overlap between these states. With the kernel at hand, a regular machine learning model is used for the learning process. In this paper, we investigate quantum support vector machine and quantum kernel ridge models to predict the degree of non-Markovianity of a quantum system. We perform digital quantum simulation of amplitude damping and phase damping channels to create our quantum dataset. We elaborate on different kernel functions to map the data and on kernel circuits to compute the overlap between quantum states. We observe good performance of the models.
Keywords: quantum, machine learning, kernel, non-markovianity
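The kernel construction described here, a Gram matrix of state overlaps fed to a classical regressor, can be sketched in plain Python. Everything below is illustrative: the single-qubit states parameterized by `theta` stand in for the outputs of the damping-channel simulations, and the parameter-to-non-Markovianity targets are made up. The sketch builds the fidelity Gram matrix and solves the kernel ridge system (K + λI)α = y directly.

```python
import math

def state(theta):
    # Single-qubit state cos(t/2)|0> + sin(t/2)|1>; a stand-in for the
    # states produced by the amplitude/phase damping simulations.
    return [math.cos(theta / 2.0), math.sin(theta / 2.0)]

def fidelity(a, b):
    # Gram-matrix entry: squared overlap |<a|b>|^2
    return abs(sum(x * y for x, y in zip(a, b))) ** 2

# Made-up training pairs: state parameter -> degree of non-Markovianity
train = [(0.3, 0.1), (1.1, 0.5), (2.0, 0.9)]
states = [state(t) for t, _ in train]
y = [v for _, v in train]
lam = 1e-3  # ridge regularization

# Kernel ridge: solve (K + lam*I) alpha = y by Gaussian elimination
n = len(states)
A = [[fidelity(states[i], states[j]) + (lam if i == j else 0.0)
      for j in range(n)] for i in range(n)]
b = list(y)
for i in range(n):                     # forward elimination
    for j in range(i + 1, n):
        f = A[j][i] / A[i][i]
        A[j] = [ajk - f * aik for ajk, aik in zip(A[j], A[i])]
        b[j] -= f * b[i]
alpha = [0.0] * n
for i in range(n - 1, -1, -1):         # back substitution
    s = sum(A[i][j] * alpha[j] for j in range(i + 1, n))
    alpha[i] = (b[i] - s) / A[i][i]

def predict(theta):
    # Regression value is a kernel expansion over the training states
    s = state(theta)
    return sum(a * fidelity(s, si) for a, si in zip(alpha, states))
```

On real hardware the `fidelity` entries would come from overlap-estimation circuits; the classical solve is unchanged.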
Procedia PDF Downloads 180
16151 Exploring Alignability Effects and the Role of Information Structure in Promoting Uptake of Energy Efficient Technologies
Authors: Rebecca Hafner, David Elmes, Daniel Read
Abstract:
The current research applies decision-making theory to the problem of increasing uptake of energy-efficient technologies in the marketplace, where uptake is currently slower than rational choice models would predict. We apply the alignable/non-alignable features effect and explore the impact of varying information structure on consumers' preference for standard versus energy-efficient technologies. In two studies, we present participants with a choice between similar (boiler vs. boiler) and dissimilar (boiler vs. heat pump) technologies, described by a list of alignable and non-alignable attributes. In Study One, there is a preference for alignability when options are similar, an effect mediated by an increased tendency to infer that missing information is the same. No effects of alignability on preference are found when options differ. One explanation for this split-shift in attentional focus is a change in construal level, potentially induced by the added consideration of environmental concern. Study Two was designed to explore the interplay between alignability and construal level in greater detail. We manipulated construal level via a thought prime task prior to the same heating-system choice task, and find a general preference for non-alignability, regardless of option type. We draw theoretical and applied implications for the type of information structure best suited to the promotion of energy-efficient technologies.
Keywords: alignability effects, decision making, energy-efficient technologies, sustainable behaviour change
Procedia PDF Downloads 313
16150 Predicting Stem Borer Density in Maize Using RapidEye Data and Generalized Linear Models
Authors: Elfatih M. Abdel-Rahman, Tobias Landmann, Richard Kyalo, George Ong’amo, Bruno Le Ru
Abstract:
Maize (Zea mays L.) is a major staple food crop in Africa, particularly in the eastern region of the continent. The maize growing area in Africa spans over 25 million ha, and 84% of rural households in Africa cultivate maize, mainly as a means to generate food and income. Average maize yields in Sub-Saharan Africa are 1.4 t/ha, compared to a global average of 2.5–3.9 t/ha, due to biotic and abiotic constraints. Amongst the biotic production constraints in Africa, stem borers are the most injurious. In East Africa, yield losses due to stem borers are currently estimated at between 12% and 40% of the total production. The objective of the present study was therefore to predict stem borer larvae density in maize fields using RapidEye reflectance data and generalized linear models (GLMs). RapidEye images were captured for a test site in Kenya (Machakos) in January and in February 2015. Stem borer larva numbers were modeled using GLMs assuming Poisson (Po) and negative binomial (NB) error distributions with a logarithmic link. Root mean square error (RMSE) and ratio of prediction to deviation (RPD) statistics were employed to assess the models' performance using a leave-one-out cross-validation approach. Results showed that NB models outperformed Po ones in all study sites. RMSE and RPD ranged between 0.95 and 2.70, and between 2.39 and 6.81, respectively. Overall, all models performed similarly when using the January and the February image data. We conclude that reflectance data from RapidEye can be used to estimate stem borer larvae density. The developed models could improve decision-making regarding the control of maize stem borers using various integrated pest management (IPM) protocols.
Keywords: maize, stem borers, density, RapidEye, GLM
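The Poisson half of this GLM setup can be sketched with a tiny Fisher-scoring fit. The reflectance covariate and counts below are simulated, not the study's RapidEye data, and the negative binomial fit (which the paper found superior) is omitted for brevity; the sketch fits a log-linear Poisson mean by Newton-Raphson and reports the training RMSE.

```python
import math

# Simulated data: larvae counts vs a single reflectance-style index
# (illustrative values, not the study's RapidEye measurements).
x = [0.2, 0.4, 0.6, 0.8, 1.0, 1.2, 1.4, 1.6, 1.8, 2.0]
y = [round(math.exp(0.3 + 1.1 * xi)) for xi in x]  # noise-free counts

# Poisson GLM with log link: log(mu) = b0 + b1*x, fitted by Fisher scoring
b0, b1 = math.log(sum(y) / len(y)), 0.0  # safe starting values
for _ in range(25):
    mu = [math.exp(b0 + b1 * xi) for xi in x]
    g0 = sum(yi - mi for yi, mi in zip(y, mu))                 # score vector
    g1 = sum((yi - mi) * xi for yi, mi, xi in zip(y, mu, x))
    h00 = sum(mu)                                              # Fisher information
    h01 = sum(mi * xi for mi, xi in zip(mu, x))
    h11 = sum(mi * xi * xi for mi, xi in zip(mu, x))
    det = h00 * h11 - h01 * h01
    b0 += (h11 * g0 - h01 * g1) / det                          # Newton step
    b1 += (h00 * g1 - h01 * g0) / det

mu = [math.exp(b0 + b1 * xi) for xi in x]
rmse = math.sqrt(sum((yi - mi) ** 2 for yi, mi in zip(y, mu)) / len(y))
```

The fitted coefficients land near the generating values (0.3, 1.1), up to the rounding of the counts; swapping in an NB variance function would follow the same scoring scheme.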
Procedia PDF Downloads 497
16149 Adaptive Online Object Tracking via Positive and Negative Models Matching
Authors: Shaomei Li, Yawen Wang, Chao Gao
Abstract:
To address the tracking drift that often occurs in adaptive tracking, an algorithm based on the fusion of tracking and detection is proposed in this paper. Firstly, object tracking is posed as a binary classification problem and is modeled by partial least squares (PLS) analysis. Secondly, the object is tracked frame by frame via particle filtering. Thirdly, tracking reliability is validated by matching against both positive and negative models. Finally, when drift occurs, the object is relocated based on SIFT feature matching and voting, and the object appearance model is updated at the same time. The algorithm can not only sense tracking drift but also relocate the object whenever needed. Experimental results demonstrate that this algorithm outperforms state-of-the-art algorithms on many challenging sequences.
Keywords: object tracking, tracking drift, partial least squares analysis, positive and negative models matching
Procedia PDF Downloads 529
16148 Dry Relaxation Shrinkage Prediction of Bordeaux Fiber Using a Feed Forward Neural Network
Authors: Baeza S. Roberto
Abstract:
Knitted fabric suffers a deformation in its dimensions, due to transverse stretching and longitudinal tension, during processing on rectilinear knitting machines, so a dry relaxation shrinkage procedure and a thermal prefixing action are performed to obtain stable conditions in the knitting. This paper presents a dry relaxation shrinkage prediction for Bordeaux fiber using feed-forward neural network and linear regression models. Six operational shrinkage alternatives were predicted. A comparison of the results found the neural network models to explain more of the variability and to predict better. The effects of different repose periods are included. The models were obtained with the neural toolbox of Matlab and with Minitab software, using real data from a knitting company in southern Guanajuato. The results allow predicting the dry relaxation shrinkage of each operational alternative.
Keywords: neural network, dry relaxation, knitting, linear regression
Procedia PDF Downloads 585
16147 The Impact of the Information Technologies on the Accounting Department of the Romanian Companies
Authors: Dumitru Valentin Florentin
Abstract:
The need to use high volumes of data and the high level of competition are only two reasons which make the use of information technologies necessary. The objective of our research is to establish the impact of information technologies on the accounting departments of Romanian companies. To achieve it, starting from the literature review, we carried out an empirical study based on a questionnaire. We investigated the types of technologies used, the reasons which led to the implementation of certain technologies, the benefits brought by the use of information technologies, the difficulties brought by the implementation, and the future effects of the applications. The conclusions show that there is an evolution in the degree of implementation of information technologies in Romanian companies, compared with the results of other studies conducted a few years before.
Keywords: information technologies, impact, company, Romania, empirical study
Procedia PDF Downloads 424
16146 Comparison of Sourcing Process in Supply Chain Operation References Model and Business Information Systems
Authors: Batuhan Kocaoglu
Abstract:
Although they use powerful systems like ERP (Enterprise Resource Planning), companies still cannot easily benchmark their processes and measure their process performance based on predefined SCOR (Supply Chain Operation References) terms. The purpose of this research is to identify common and corresponding processes in order to present a conceptual model for modeling and measuring the purchasing process of an organization. The main steps of the research study are: a literature review of the 'procure to pay' process in ERP systems; a literature review of the 'sourcing' process in the SCOR model; and the development of a conceptual model integrating the 'sourcing' process of the SCOR model and the 'procure to pay' process of the ERP model. In this study, we examined the similarities and differences between these two models. The proposed framework is based on assumptions drawn from (1) the body of literature and (2) the authors' experience of working in the field of enterprise and logistics information systems. The modeling framework provides a structured and systematic way to model and decompose the necessary information, from conceptual representation to process element specification. This conceptual model will help organizations build quality measurement instruments and tools for their purchasing systems. The adaptation issues identified for ERP systems and the SCOR model will support a more benchmarkable, worldwide-standard business process.
Keywords: SCOR, ERP, procure to pay, sourcing, reference model
Procedia PDF Downloads 362
16145 An Information Matrix Goodness-of-Fit Test of the Conditional Logistic Model for Matched Case-Control Studies
Authors: Li-Ching Chen
Abstract:
The case-control design has been widely applied in clinical and epidemiological studies to investigate the association between risk factors and a given disease. The retrospective design can be easily implemented and is more economical than prospective studies. To adjust effects for confounding factors, methods such as stratification at the design stage may be adopted. When some major confounding factors are difficult to quantify, a matching design provides an opportunity for researchers to control the confounding effects. The matching effects can be parameterized by the intercepts of logistic models, and conditional logistic regression analysis is then adopted. This study demonstrates an information-matrix-based goodness-of-fit statistic to test the validity of the logistic regression model for matched case-control data. The asymptotic null distribution of the proposed test statistic is inferred. It requires neither simulation to evaluate its critical values nor partitioning of the covariate space. The asymptotic power of the test statistic is also derived. The performance of the proposed method is assessed through simulation studies. A real data example is presented to illustrate the implementation of the proposed method.
Keywords: conditional logistic model, goodness-of-fit, information matrix, matched case-control studies
Procedia PDF Downloads 292
16144 Convective Hot Air Drying of Different Varieties of Blanched Sweet Potato Slices
Authors: M. O. Oke, T. S. Workneh
Abstract:
The drying behaviour of blanched sweet potato in a cabinet dryer was investigated using five different air temperatures (40–80 °C) and ten sweet potato varieties sliced to 5 mm thickness. Drying time decreased considerably with increasing hot air temperature. The drying data were fitted to eight models. The Modified Henderson and Pabis model gave the best fit to the experimental moisture ratio data obtained during the drying of all the varieties, while the Newton (Lewis) and Wang and Singh models gave the poorest fit. The values of Deff obtained for the Bophelo variety (1.27 × 10⁻⁹ to 1.77 × 10⁻⁹ m²/s) were the lowest, while those of S191 (1.93 × 10⁻⁹ to 2.47 × 10⁻⁹ m²/s) were the highest, which indicates that moisture diffusivity in sweet potato is affected by genetic factors. Activation energy values ranged from 0.27 to 6.54 kJ/mol. This low activation energy indicates that drying of sweet potato slices requires little energy and is hence a cost- and energy-saving method.
Keywords: sweet potato slice, drying models, moisture ratio, moisture diffusivity, activation energy
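The activation energy reported here comes from an Arrhenius-type analysis of the effective diffusivities: Ea is minus the slope of ln(Deff) against 1/T, times the gas constant. The Deff values below are invented within the abstract's reported range purely to illustrate the calculation.

```python
import math

R = 8.314  # gas constant, J/(mol K)

# Illustrative effective diffusivities (m^2/s) at the drying temperatures;
# invented within the range the abstract reports, not measured values.
temps_C = [40, 50, 60, 70, 80]
Deff = [1.30e-9, 1.38e-9, 1.45e-9, 1.53e-9, 1.60e-9]

# Arrhenius form: ln(Deff) = ln(D0) - Ea/(R*T)
xs = [1.0 / (t + 273.15) for t in temps_C]
ys = [math.log(d) for d in Deff]

# Least-squares slope of ln(Deff) vs 1/T
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
        sum((x - xbar) ** 2 for x in xs)
Ea = -slope * R / 1000.0  # activation energy in kJ/mol
```

With these illustrative diffusivities the estimate falls inside the 0.27–6.54 kJ/mol band the abstract reports.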
Procedia PDF Downloads 517
16143 Green Accounting and Firm Performance: A Bibliometric Literature Review
Authors: Francesca di Donato, Sara Trucco
Abstract:
Green accounting is a growing topic of interest. Indeed, nowadays, most firms affect the environment; therefore, companies are seeking the best way to disclose environmental information. Furthermore, companies are increasingly committed to improving the environment, and the topic is gaining importance among the public, governments, and policymakers. Green accounting is a type of accounting that considers environmental costs and their impact on the financial performance of firms. Thus, the motivation of the current research is to investigate the state of the art of the literature on the relationship between green accounting and firm performance since the birth of the topic, and to identify gaps in the literature that represent fruitful terrain for future research. In doing so, this study provides a bibliometric literature review of existing evidence on the link between green accounting and firm performance since 2000. The search, based on the most relevant databases for scientific journals (Scopus, Emerald, Web of Science, Google Scholar, and Econlit), returned 1917 scientific articles. The articles were manually reviewed in order to identify only the relevant studies in the field, excluding articles whose titles and abstracts were out of scope. The final sample was composed of 107 articles. A content analysis was carried out on the final sample, and a classification system is proposed. Findings show the most relevant environmental costs and issues considered in previous studies and how green accounting may be linked to the financial and non-financial performance of a firm. The study also offers suggestions for future research in this domain. This study has several practical implications. Indeed, the topic of green accounting may be applied to different sectors and different types of companies. Therefore, this study may help managers better understand the most relevant environmental information to disclose and how environmental issues may be managed to improve firm performance. Moreover, the bibliometric literature review may be of interest to stakeholders interested in the historical evolution of the topic.
Keywords: bibliometric literature review, firm performance, green accounting, literature review
Procedia PDF Downloads 69
16142 Visual Analytics of Higher Order Information for Trajectory Datasets
Authors: Ye Wang, Ickjai Lee
Abstract:
Due to the widespread use of mobile sensing, there is a strong need to handle trails of moving objects, i.e., trajectories. This paper proposes three visual analytic approaches for higher order information of trajectory datasets based on the higher order Voronoi diagram data structure. The proposed approaches reveal geometrical, topological, and directional information. Experimental results demonstrate the applicability and usefulness of the three proposed approaches.
Keywords: visual analytics, higher order information, trajectory datasets, spatio-temporal data
Procedia PDF Downloads 402
16141 The Changing Face of Pedagogy and Curriculum Development Sub-Components of Teacher Education in Nigeria: A Comparative Evaluation of the University of Lagos, Lagos State University, and Sokoto State University Models
Authors: Saheed A. Rufai
Abstract:
Courses in Pedagogy and Curriculum Development expectedly occupy a core place in the professional education components of teacher education at Lagos, Lagos State, and Sokoto State Universities. This is in keeping with the National Teacher Education Policy statement stipulating that, for student teachers to learn effectively, teacher education institutions must be equipped to prepare them adequately. However, there is a growing concern over the unfaithfulness of some of the dominant Nigerian models of teacher education to this policy statement on teacher educators' knowledge and skills. The purpose of this paper is to comparatively evaluate both the curricular provisions and the manpower for the pedagogy and curriculum development sub-components of the Lagos, Lagos State, and Sokoto State models of teacher preparation. The paper employs a combination of quantitative and qualitative methods. Preliminary analysis revealed a new trend in teacher educators' pedagogical knowledge and understanding with regard to the two intertwined sub-components. The significance of the study lies in its potential to determine the degree of conformity of each of the three models to the stipulated standards. The paper's contribution to scholarship lies in its correlation of deficiencies in teacher educators' professional knowledge and skills and its articulation of the implications of such deficiencies for the professional knowledge and skills of prospective teachers, with a view to providing a framework for reforms.
Keywords: curriculum development, pedagogy, teacher education, dominant Nigerian teacher preparation models
Procedia PDF Downloads 443
16140 Statistical Analysis of Natural Images after Applying ICA and ISA
Authors: Peyman Sheikholharam Mashhadi
Abstract:
Difficulties in analyzing real-world images within the classical image processing and machine vision frameworks have motivated researchers to consider biology-based vision. It is a common belief that the mammalian visual cortex has adapted to the statistics of real-world images through the evolutionary process. There are two well-known successful models of mammalian visual cortical cells: Independent Component Analysis (ICA) and Independent Subspace Analysis (ISA). In this paper, we statistically analyze the dependencies which remain in the components after applying these models to natural images. Also, we investigate the response of the feature detectors to gratings with various parameters in order to find the optimal parameters of the feature detectors. Finally, the phase selectivity of the feature detectors in both models is considered.
Keywords: statistics, independent component analysis, independent subspace analysis, phase, natural images
Procedia PDF Downloads 339
16139 Arabic Light Word Analyser: Roles with Deep Learning Approach
Authors: Mohammed Abu Shquier
Abstract:
This paper introduces a word segmentation method using the novel BP-LSTM-CRF architecture for processing semantic output training. The objective of such morphological analysis tools is to link a formal morpho-syntactic description to a lemma, along with morpho-syntactic information, a vocalized form, a vocalized analysis with morpho-syntactic information, and a list of paradigms. A key objective is to continuously enhance the proposed system through an inductive learning approach that considers semantic influences. The system is currently under construction and development based on data-driven learning. To evaluate the tool, an experiment on homograph analysis was conducted. The tool also addresses the assumption of deep binary segmentation hypotheses, the arbitrary choice of trigram or n-gram continuation probabilities, language limitations, and morphology for both Modern Standard Arabic (MSA) and Dialectal Arabic (DA), which provide justification for updating the system. Most Arabic word analysis systems are based on the phonotactic morpho-syntactic analysis of a word transmitted using lexical rules, which are mainly used in MENA language technology tools, without taking into account contextual or semantic morphological implications. Therefore, it is necessary to have an automatic analysis tool that takes into account the word sense and not only the morpho-syntactic category. Moreover, such systems are also based on statistical/stochastic models. These stochastic models, such as HMMs, have shown their effectiveness in different NLP applications: part-of-speech tagging, machine translation, speech recognition, etc.
As an extension, we focus on language modeling using Recurrent Neural Networks (RNNs); given that morphological analysis coverage is very low for dialectal Arabic, it is important to investigate how dialect data influence the accuracy of these approaches by developing dialectal morphological processing tools, showing that handling dialectal variability can improve analysis.
Keywords: NLP, DL, ML, analyser, MSA, RNN, CNN
Procedia PDF Downloads 42
16138 Value Relevance of Accounting Information: Empirical Evidence from China
Authors: Ying Guo, Miaochan Li, David Yang, Xiao-Yan Li
Abstract:
This paper examines the relevance of accounting information to stock prices at different periods, using manufacturing companies listed on China's Growth Enterprise Market (GEM). We find that both the average stock price at fiscal year-end and the average stock price one month after fiscal year-end are more relevant to the accounting information than the closing stock price four months after fiscal year-end. This implies that Chinese stock markets react before the public disclosure of accounting information, which may be due to information leakage before official announcements. Our findings confirm that accounting information is relevant to stock prices for Chinese listed manufacturing companies, which is a critical question to answer for investors interested in Chinese companies.
Keywords: accounting information, response time, value relevance, stock price
Procedia PDF Downloads 96
16137 Modeling and Shape Prediction for Elastic Kinematic Chains
Authors: Jiun Jeon, Byung-Ju Yi
Abstract:
This paper investigates the modeling and shape prediction of elastic kinematic chains such as colonoscopes. 2D and 3D models of elastic kinematic chains are suggested, and their behaviors are demonstrated through simulation. To corroborate the effectiveness of those models, experimental work is performed using a magnetic sensor system.
Keywords: elastic kinematic chain, shape prediction, colonoscopy, modeling
Procedia PDF Downloads 605
16136 Phishing Attacks Facilitated by Open Source Intelligence
Authors: Urva Maryam
Abstract:
Information has become an important asset in today's world. Globally, various tactics are observed to confine the spread of information, as it makes people vulnerable to security attacks. Open Source Intelligence (OSINT) refers to publicly available sources that disseminate information about users, websites, companies, and various organizations. This paper takes a quantitative approach to exploring various OSINT tools that reveal individuals' public information. This information could further facilitate phishing attacks, which can be launched through email addresses, open ports, and unsecured web surfing. This study analyzes information retrieved from OSINT tools, i.e., the Harvester and Maltego, that can be used to direct phishing attacks at individuals.
Keywords: OSINT, phishing, spear phishing, email spoofing, the harvester, maltego
Procedia PDF Downloads 81
16135 Managing Information Technology: An Overview of Information Technology Governance
Authors: Mehdi Asgarkhani
Abstract:
Today, investment in Information Technology (IT) solutions is the largest component of capital expenditure in most organizations. As capital investment in IT continues to grow, IT managers and strategists are expected to develop and put into practice effective decision-making models (frameworks) that improve decision-making processes for the use of IT in organizations and optimize the investment in IT solutions. To be exact, there is an expectation that organizations not only maximize the benefits of adopting IT solutions but also avoid the many pitfalls associated with the rapid introduction of technological change. Different organizations, depending on their size, the complexity of the solutions required, and the processes used for financial management and budgeting, may use different techniques for managing strategic investment in IT solutions. Decision-making processes for the strategic use of IT within organizations are often referred to as IT Governance (or Corporate IT Governance). This paper examines IT governance as a tool for best practice in decision making about IT strategies. The discussion in this paper represents phase I of a project initiated to investigate trends in strategic decision making on IT strategies. Phase I is concerned mainly with a review of the literature and a number of case studies, establishing that the practice of IT governance, depending on the complexity of IT solutions, the organization's size, and the organization's stage of maturity, varies significantly, from informal approaches to sophisticated formal frameworks.
Keywords: IT governance, corporate governance, IT governance frameworks, IT governance components, aligning IT with business strategies
Procedia PDF Downloads 406
16134 The Models of Character Development Bali Police to Improve Quality of Moral Members in Bali Police Headquarters
Authors: Agus Masrukhin
Abstract:
This research aims to find and analyze models of character building at the Bali Police Headquarters, through a case study of Muslim members, for improving the quality of the morality of its members. The formation of patterns of thinking, behavior, mentality, and noble character in police officers can later serve as a solution to reduce hedonistic tendencies amid the challenges of the era of globalization. The expected benefit of this study is a positive recommendation towards a constructive character-building model for police officers in the Republic of Indonesia, especially the Bali Police. For the long term, the character-building models discovered can be developed for the entire police force in Indonesia. In this study, the researchers mix qualitative research methods, based on narratives of the subjects' concrete experience gathered through field research, with quantitative research methods, drawing on 92 respondents from the Bali regional police. The research used descriptive analysis and SWOT analysis, and the results were then presented in a focus group discussion (FGD). The results indicate that the police leadership modeling variable and the police office culture variable have a significant influence on the implementation of spiritual development.
Keywords: positive constructive, hedonistic, character models, morality
Procedia PDF Downloads 365
16133 Comparative Mesh Sensitivity Study of Different Reynolds Averaged Navier Stokes Turbulence Models in OpenFOAM
Authors: Zhuoneng Li, Zeeshan A. Rana, Karl W. Jenkins
Abstract:
In industry, validating a case often requires a multitude of simulations, so to keep the process affordable users tend to use a coarser mesh. It is therefore imperative to establish the coarsest mesh that can be used while keeping reasonable simulation accuracy. To date, the two most reliable, affordable, and broadly used advanced simulation approaches are hybrid RANS (Reynolds Averaged Navier Stokes)/LES (Large Eddy Simulation) and wall-modelled LES. The potential of these two approaches will continue to be developed in the coming decades, mainly because of the unaffordable computational cost of DNS (Direct Numerical Simulation). In wall-modelled LES, the turbulence model is applied as a sub-grid scale model in the innermost layer near the wall. In hybrid RANS/LES (Detached Eddy Simulation) and its variants, the RANS turbulence models cover the entire boundary layer region; RANS therefore still plays a very important role in state-of-the-art simulations. This research focuses on turbulence model mesh sensitivity analysis, where various turbulence models such as S-A (Spalart-Allmaras), SSG (Speziale-Sarkar-Gatski), k-omega transitional SST (Shear Stress Transport), k-kl-omega, the γ-Reθ transitional model, and v2-f are evaluated within OpenFOAM. The simulations are conducted for fully developed turbulent flow over a flat plate, where the skin friction coefficient and velocity profiles are obtained and compared against experimental values and DNS results. A concrete conclusion is drawn to clarify the mesh sensitivity of the different turbulence models. Keywords: mesh sensitivity, turbulence models, OpenFOAM, RANS
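A common sanity check in flat-plate studies of this kind is to compare simulated skin friction against an empirical correlation. The sketch below is an illustration only, not part of the paper's OpenFOAM workflow; it evaluates the well-known Schlichting turbulent flat-plate correlation Cf = (2 log10 Re_x − 0.65)^(−2.3) over a range of Reynolds numbers:

```python
import numpy as np

def cf_turbulent_flat_plate(re_x):
    """Schlichting empirical skin-friction correlation for a smooth flat
    plate in turbulent flow, valid for roughly 5e5 < Re_x < 1e9."""
    return (2.0 * np.log10(re_x) - 0.65) ** -2.3

# Evaluate the reference curve at a few Reynolds numbers
re_x = np.logspace(6, 8, 5)
cf = cf_turbulent_flat_plate(re_x)
for r, c in zip(re_x, cf):
    print(f"Re_x = {r:.1e}  Cf = {c:.5f}")
```

A mesh-sensitivity study would overlay the Cf curves extracted from each turbulence model and mesh level on this reference.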
Procedia PDF Downloads 261
16132 The Anti-Cyber and Information Technology Crimes Law on Information Access and Dissemination by Egyptian Journalists
Authors: Miral Sabry AlAshry
Abstract:
The main objective of the study is to investigate the effect of the Anti-Cyber and Information Technology Crimes Law on Egyptian journalists, as well as its implications for journalistic practice and for press freedom in Egypt. Questionnaires were completed by 192 journalists representing four official newspapers, and in-depth interviews were held with 15 journalists. The study used authoritarian theory as its theoretical framework. The study revealed that the government placed restrictions on journalists by using the law to oppress them. Keywords: anti-cyber and information technology crimes law, media legislation, personal information, Egyptian constitution
Procedia PDF Downloads 37316131 Information Exchange Process Analysis between Authoring Design Tools and Lighting Simulation Tools
Authors: Rudan Xue, Annika Moscati, Rehel Zeleke Kebede, Peter Johansson
Abstract:
Successful building simulation and analysis inevitably requires information exchange between multiple building information modeling (BIM) software tools. BIM information exchange based on IFC is widely used. However, Industry Foundation Classes (IFC) files are not always reliable, and information can get lost when using different software for modeling and simulations. In this research, interviews with lighting simulation experts and a case study provided by a company producing lighting devices were the research methods used to identify the necessary steps and data for successful information exchange between lighting simulation tools and authoring design tools. Model creation, information exchange, and model simulation were identified as key aspects for the success of information exchange. The paper concludes with recommendations for improved information exchange and more reliable simulations that take all the needed parameters into consideration. Keywords: BIM, data exchange, interoperability issues, lighting simulations
Procedia PDF Downloads 239
16130 Spectrogram Pre-Processing to Improve Isotopic Identification to Discriminate Gamma and Neutron Sources
Authors: Mustafa Alhamdi
Abstract:
An industrial application for classifying gamma-ray and neutron events is investigated in this study using deep machine learning. Identification using a convolutional neural network and a recursive neural network has shown a significant improvement in prediction accuracy in a variety of applications. The ability to identify the isotope type and activity from spectral information depends on the feature extraction method, followed by classification. The features extracted from the spectrum profiles aim to find patterns and relationships that represent the actual spectrum energy in a low-dimensional space. Increasing the level of separation between classes in feature space improves the possibility of enhancing classification accuracy. Feature extraction by neural networks is nonlinear in nature and involves a variety of transformations and mathematical optimizations, while principal component analysis depends on linear transformations to extract features and subsequently improve classification accuracy. In this paper, the isotope spectrum information is preprocessed by finding the frequency components relative to time and using them as a training dataset. The Fourier transform implementation used to extract the frequency components is optimized with a suitable windowing function. Training and validation samples of different isotope profiles interacting with a CdTe crystal were simulated using Geant4. The readout electronic noise was simulated by optimizing the mean and variance of a normal distribution. Ensemble learning, combining the votes of many models, improved the classification accuracy of the neural networks. The single-prediction approach to discriminating gamma and neutron events has shown high accuracy using deep learning. The findings show that classification accuracy can be improved by applying the spectrogram preprocessing stage to the gamma and neutron spectra of different isotopes. Tuning the deep learning models by hyperparameter optimization enhanced the separation in the latent space and made it possible to extend the number of detected isotopes in the training database. Ensemble learning contributed significantly to the final prediction. Keywords: machine learning, nuclear physics, Monte Carlo simulation, noise estimation, feature extraction, classification
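The spectrogram preprocessing the abstract describes, a windowed Fourier transform of the time signal, can be sketched in a few lines. This is a generic illustration on a synthetic signal, not the paper's Geant4-simulated CdTe data; the Hann window, segment length, and overlap here are assumptions:

```python
import numpy as np

def spectrogram(signal, fs, nperseg=64, noverlap=32):
    """Minimal STFT spectrogram: split the signal into overlapping segments,
    apply a Hann window to each, and take the power of the real FFT."""
    win = np.hanning(nperseg)
    step = nperseg - noverlap
    n_frames = 1 + (len(signal) - nperseg) // step
    frames = np.stack([signal[i * step:i * step + nperseg] * win
                       for i in range(n_frames)])
    power = np.abs(np.fft.rfft(frames, axis=1)) ** 2
    freqs = np.fft.rfftfreq(nperseg, d=1.0 / fs)
    times = (np.arange(n_frames) * step + nperseg / 2) / fs
    return freqs, times, power.T  # shape: (n_freq_bins, n_frames)

# Synthetic "detector" signal: a 50 Hz tone plus Gaussian readout noise
fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
x = np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.default_rng(0).normal(size=t.size)
f, tt, S = spectrogram(x, fs)
print(S.shape)  # (33, 30): 33 frequency bins x 30 time frames
```

The resulting time-frequency matrix is the kind of object that would be fed to a convolutional classifier as a training sample.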
Procedia PDF Downloads 150
16129 Statistical Analysis and Impact Forecasting of Connected and Autonomous Vehicles on the Environment: Case Study in the State of Maryland
Authors: Alireza Ansariyar, Safieh Laaly
Abstract:
Over the last decades, the vehicle industry has shown increased interest in integrating autonomous, connected, and electrical technologies into vehicle design, with the primary hope of improving mobility and road safety while reducing transportation's environmental impact. Using the State of Maryland (MD) in the United States as a pilot study, this research investigates CAVs' fuel consumption and air pollutants (CO, PM, and NOx) and utilizes linear regression models to predict CAVs' environmental effects. The Maryland transportation network was simulated in VISUM software, and data on a set of variables were collected through a comprehensive survey. The amounts of pollutants and fuel consumption were obtained from the macro simulation for the interval 2010 to 2021. Four linear regression models were then proposed to predict the amount of CO, NOx, and PM pollutants and the fuel consumption in the future. The results highlighted that CAVs' pollutants and fuel consumption have a significant correlation with the income, age, and race of CAV customers. Furthermore, the reliability of the four statistical models was compared with the reliability of the macro simulation model outputs for the year 2030. The error for the three pollutants and fuel consumption obtained by the statistical models in SPSS was less than 9%. This study is expected to assist researchers and policymakers with planning decisions to reduce CAV environmental impacts in Maryland. Keywords: connected and autonomous vehicles, statistical model, environmental effects, pollutants and fuel consumption, VISUM, linear regression models
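The kind of forecast the abstract describes, a linear regression fitted on 2010-2021 values and extrapolated to 2030, can be sketched with ordinary least squares. The yearly CO totals below are invented for illustration; they are not the paper's Maryland data:

```python
import numpy as np

# Hypothetical yearly CO totals (kilotons) for 2010-2021; illustrative only
years = np.arange(2010, 2022)
co = np.array([420, 411, 405, 396, 390, 381, 374, 365, 359, 350, 343, 336],
              dtype=float)

# Fit CO = a*year + b by ordinary least squares, analogous to the SPSS models
A = np.column_stack([years, np.ones_like(years, dtype=float)])
(a, b), *_ = np.linalg.lstsq(A, co, rcond=None)

# Extrapolate the fitted trend to the comparison year 2030
co_2030 = a * 2030 + b
print(f"slope = {a:.2f} kt/yr, forecast for 2030 = {co_2030:.0f} kt")
```

The paper's comparison against the 2030 macro simulation output amounts to checking how far such an extrapolation drifts from the simulated value.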
Procedia PDF Downloads 445
16128 A Risk-Based Modeling Approach for Successful Adoption of CAATTs in Audits: An Exploratory Study Applied to Israeli Accountancy Firms
Authors: Alon Cohen, Jeffrey Kantor, Shalom Levy
Abstract:
Technology adoption models are extensively used in the literature to explore the drivers and inhibitors affecting the adoption of Computer Assisted Audit Techniques and Tools (CAATTs). Further studies from recent years have suggested additional factors that may affect technology adoption by CPA firms. However, the adoption of CAATTs by financial auditors differs from the adoption of technologies in other industries. This is a result of the unique characteristics of the auditing process, which are expressed in the audit risk elements and the risk-based auditing approach, as encoded in the auditing standards. Since these audit risk factors are not part of the existing models used to explain technology adoption, those models do not fully correspond to the specific needs and requirements of the auditing domain. The overarching objective of this qualitative research is to fill the gap in the literature that exists as a result of using generic technology adoption models. Following a pretest, and based on semi-structured in-depth interviews with 16 Israeli CPA firms of different sizes, this study aims to reveal determinants related to audit risk factors that influence the adoption of CAATTs in audits, and proposes a new modeling approach for the successful adoption of CAATTs. The findings emphasize several important aspects: (1) while large CPA firms have developed their own internal guidelines for assessing the audit risk components, other CPA firms do not follow a formal and validated methodology to evaluate these risks; (2) large firms incorporate a variety of CAATTs, including self-developed advanced tools, whereas small and mid-sized CPA firms incorporate standard CAATTs and still need to catch up to better understand what CAATTs can offer and how they can contribute to audit quality; (3) the top management of mid-sized and small CPA firms should be more proactive and better informed about CAATTs' capabilities and contributions to audits; and (4) all CPA firms consider professionalism a major challenge that must be constantly managed to ensure optimal CAATTs operation. The study extends the existing knowledge of CAATTs adoption by looking at it from a risk-based auditing approach. It suggests a new model for CAATTs adoption by incorporating the influencing audit risk factors that auditors should examine when considering CAATTs adoption. Since the model can be used in various audit scenarios and supports strategic, risk-based decisions, it helps realize the great potential of CAATTs for the quality of audits. The results and insights can be useful to CPA firms, internal auditors, CAATTs developers, and regulators. Moreover, they may motivate audit standard-setters to issue updated guidelines regarding CAATTs adoption in audits. Keywords: audit risk, CAATTs, financial auditing, information technology, technology adoption models
Procedia PDF Downloads 67
16127 Automatic Diagnosis of Electrical Equipment Using Infrared Thermography
Authors: Y. Laib Dit Leksir, S. Bouhouche
Abstract:
Analyzing and processing the databases resulting from infrared thermal measurements made on electrical installations requires the development of new tools in order to obtain correct information beyond what visual inspections provide. Consequently, methods based on the capture of infrared digital images show great potential and are employed increasingly in various fields. However, there is an enormous need for effective techniques to analyze these databases in order to extract relevant information on the state of the equipment. Our goal is to introduce recent modeling techniques, based on new methods of image and signal processing, to develop mathematical models in this field. The aim of this work is to capture the anomalies existing in electrical equipment during the inspection of some machines using a FLIR A40 camera. We then use binarisation techniques to select the region of interest, and we compare the binarised thermal images obtained by these methods to choose the best one. Keywords: infrared thermography, defect detection, troubleshooting, electrical equipment
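One standard binarisation choice for isolating a hot region of interest in a thermal image is Otsu's method. The sketch below is a generic illustration on synthetic data, not the paper's FLIR A40 imagery, implemented in plain NumPy:

```python
import numpy as np

def otsu_threshold(img):
    """Pick the threshold that maximises between-class variance of the
    grayscale histogram (Otsu's method), for images with values in [0, 255]."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    p = hist / hist.sum()
    w0 = np.cumsum(p)                      # class-0 (background) probability
    m = np.cumsum(p * np.arange(256))      # cumulative mean
    mg = m[-1]                             # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mg * w0 - m) ** 2 / (w0 * (1 - w0))
    return int(np.nanargmax(sigma_b))

# Synthetic "thermal image": cool background plus a hot spot (the anomaly)
rng = np.random.default_rng(1)
img = rng.normal(60, 5, (64, 64))
img[20:30, 20:30] = rng.normal(200, 5, (10, 10))
img = np.clip(img, 0, 255)

t = otsu_threshold(img)
mask = img > t                             # region of interest: the hot spot
print(t, mask.sum())
```

In practice the binarised mask would be compared against masks from other thresholding schemes, as the abstract describes, to choose the method that best isolates the anomaly.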
Procedia PDF Downloads 476
16126 Code Embedding for Software Vulnerability Discovery Based on Semantic Information
Authors: Joseph Gear, Yue Xu, Ernest Foo, Praveen Gauravaran, Zahra Jadidi, Leonie Simpson
Abstract:
Deep learning methods have seen increasing application to the long-standing security research goal of automatic vulnerability detection in source code. Attention, however, must still be paid to the task of producing vector representations of source code (code embeddings) as input for these deep learning models. Graphical representations of code, most predominantly Abstract Syntax Trees and Code Property Graphs, have seen some use in this task of late; however, for the very large graphs representing very large code snippets, learning becomes prohibitively computationally expensive. This expense may be reduced by intelligently pruning the input to only vulnerability-relevant information; however, little research in this area has been performed. Additionally, most existing work comprehends code based solely on the structure of the graph, at the expense of the information contained in the graph's nodes. This paper proposes Semantic-enhanced Code Embedding for Vulnerability Discovery (SCEVD), a deep learning model which uses semantic-based feature selection for its vulnerability classification model. It uses information from the nodes as well as the structure of the code graph in order to select the features most indicative of the presence or absence of vulnerabilities. The model is implemented and experimentally tested using the SARD Juliet vulnerability test suite to determine its efficacy. It improves on existing code graph feature selection methods, as demonstrated by its improved ability to discover vulnerabilities. Keywords: code representation, deep learning, source code semantics, vulnerability discovery
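The pruning idea the abstract describes, keeping only vulnerability-relevant parts of a code graph, can be illustrated with a toy example on Python's own syntax trees. This is not SCEVD's actual method; the choice of "relevant" node types below is a made-up assumption for illustration:

```python
import ast

# Node types plausibly related to vulnerabilities: calls, assignments,
# arithmetic, and indexing. This selection is hypothetical.
RELEVANT = (ast.Call, ast.Assign, ast.BinOp, ast.Subscript)

def relevant_nodes(source):
    """Toy semantic feature selection: walk the AST and keep only nodes of
    the listed types, discarding purely structural nodes (Module, Name, ...)."""
    tree = ast.parse(source)
    return [type(n).__name__ for n in ast.walk(tree)
            if isinstance(n, RELEVANT)]

src = "buf = alloc(n)\nbuf[i] = x + 1\n"
print(relevant_nodes(src))
```

A real code-graph model would embed the retained nodes (and their graph structure) rather than just listing their types, but the pruning step follows the same shape: filter the graph before it reaches the learner.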
Procedia PDF Downloads 158