Search results for: Malaysian market error
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5513

1703 Achieving Competitive Advantage Through Internal Resources and Competences

Authors: Ibrahim Alkandi

Abstract:

This study aims at understanding how banks can utilize their resources and capabilities to achieve a competitive advantage. The resource-based approach has been applied to assess the resources and capabilities, as well as how management perceives them as sources of competitive advantage. A quantitative approach was implemented using cross-sectional data. The research population consisted of top managers in financial companies in Saudi Arabia, and the sample comprised 79 managers. The resources were subdivided into tangible and intangible. The variables assessed in the research include proprietary rights, trademark (i.e., the brand), communication, and organizational culture. To achieve the objective of the research, multivariate analysis through multiple regression was used. The research tool was a questionnaire whose validity was also assessed. According to the results of the study, there is a significant relationship between banks' performance and the strategic management of proprietary rights, trademark, administrative and financial skills, and bank culture. The research therefore assessed four aspects, among the variables in the model, in relation to the strategic performance of these banks: trademark, communication, administrative and leadership style, and company culture. Hence, this paper contributes to the body of literature by providing empirical evidence of the resources influencing both banks' market and economic performance.
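As an illustrative sketch (not from the paper), the multiple-regression analysis described above can be reproduced on synthetic data; the variable names, coefficients, and sample values below are hypothetical stand-ins for the survey responses:

```python
import numpy as np

# Hypothetical data: 79 manager responses (rows) scored on four resource
# variables (columns): trademark, communication, administrative/leadership
# skills, organizational culture. Values and coefficients are invented.
rng = np.random.default_rng(0)
X = rng.uniform(1, 5, size=(79, 4))
beta_true = np.array([0.6, 0.3, 0.5, 0.4])
y = 1.0 + X @ beta_true + rng.normal(0, 0.2, size=79)  # bank performance score

# Ordinary least squares with an intercept term.
Xd = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(Xd, y, rcond=None)

# Coefficient of determination of the fit.
resid = y - Xd @ coef
r2 = 1 - resid.var() / y.var()
print(coef.round(2), round(r2, 3))
```

With data of this shape, the significance of each coefficient would then be judged from its t-statistic, which is the step the abstract summarizes as "multivariate analysis through multiple regression."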

Keywords: competitive advantage, Saudi banks, strategic management, RBV

Procedia PDF Downloads 75
1702 Application of the Global Optimization Techniques to the Optical Thin Film Design

Authors: D. Li

Abstract:

Optical thin films are used in a wide variety of optical components, and many software tools have been programmed to advance multilayer thin film design. The available software packages for designing thin film structures may not provide optimum designs. Almost all current programs obtain their final designs either by optimizing a starting guess or by techniques that may or may not involve a pseudorandom process and that give different answers every time, depending upon the initial conditions. With the increasing power of personal computers, functional methods for the optimization and synthesis of optical multilayer systems have been developed, such as DGL optimization, simulated annealing, genetic algorithms, needle optimization, inductive optimization, and flip-flop optimization. Among these, DGL optimization has proved its efficiency in optical thin film design. The application of the DGL optimization technique to the design of optical coatings is presented. The technique is described, and its main features are discussed. Guidelines on applying the DGL optimization technique to various types of design problems are given. The innovative global optimization strategies used in a software tool, OnlyFilm, to optimize multilayer thin film designs through different filter designs are outlined. OnlyFilm is a powerful, versatile, and user-friendly thin film software package on the market, which combines optimization and synthesis design capabilities with powerful analytical tools for optical thin film designers. It is also the only thin film design software that offers a true global optimization function.
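DGL optimization itself is not spelled out in the abstract, but the general shape of a stochastic global search over layer thicknesses can be sketched with simulated annealing, one of the listed alternatives. The merit function below is a toy stand-in; a real one would evaluate spectral reflectance against a target:

```python
import math
import random

# Toy merit function of two layer "thicknesses"; invented for illustration.
# Real thin-film merit functions integrate spectral error over wavelengths.
def merit(thicknesses):
    t1, t2 = thicknesses
    return (math.sin(3 * t1) + math.cos(2 * t2)) ** 2 + 0.1 * (t1 - 1) ** 2

def simulated_annealing(f, x0, steps=20000, t_start=1.0, t_end=1e-3, seed=1):
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    best, fbest = list(x), fx
    for k in range(steps):
        temp = t_start * (t_end / t_start) ** (k / steps)  # geometric cooling
        cand = [xi + rng.gauss(0, 0.1) for xi in x]        # random perturbation
        fc = f(cand)
        # Accept improvements always; accept worse moves with Boltzmann probability.
        if fc < fx or rng.random() < math.exp((fx - fc) / temp):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
    return best, fbest

best, fbest = simulated_annealing(merit, [2.0, 2.0])
print(best, fbest)
```

The point of such global strategies, as the abstract notes, is that the result does not depend on a single starting guess the way local refinement does.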

Keywords: optical coatings, optimization, design software, thin film design

Procedia PDF Downloads 316
1701 Leveraging Hyperledger Iroha for the Issuance and Verification of Higher-Education Certificates

Authors: Vasiliki Vlachou, Christos Kontzinos, Ourania Markaki, Panagiotis Kokkinakos, Vagelis Karakolis, John Psarras

Abstract:

Higher education is resisting the pull of technology, especially as concerns the issuance and verification of degrees and certificates. It is widely known that education certificates are largely produced in paper form, making them vulnerable to damage, while holders of such certificates remain dependent on the universities and other issuing organisations. QualiChain is an EU Horizon 2020 (H2020) research project aiming to transform and revolutionise the domain of public education and its ties with the job market by leveraging blockchain, analytics, and decision support to develop a platform for the verification and sharing of education certificates. Blockchain plays an integral part in the QualiChain solution by providing a trustworthy environment to store, share, and manage such accreditations. In the context of this paper, three prominent blockchain platforms (Ethereum, Hyperledger Fabric, Hyperledger Iroha) were considered as a means of experimentation for creating a system with the basic functionalities needed for trustworthy degree verification. The methodology and the respective system developed and presented in this paper used Hyperledger Iroha and proved that this specific platform can be used to easily develop decentralized applications. Future papers will experiment further with other blockchain platforms and assess which has the best potential.
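The core verification idea is platform-independent and can be sketched without any blockchain client: anchor a hash of the certificate, then recompute and compare at verification time. Here a plain dict stands in for the ledger; in the actual system, Hyperledger Iroha's storage would hold the digests:

```python
import hashlib

# A plain dict stands in for the blockchain ledger in this sketch.
ledger = {}

def issue(cert_id: str, cert_bytes: bytes) -> None:
    """The issuing university anchors the certificate's SHA-256 digest."""
    ledger[cert_id] = hashlib.sha256(cert_bytes).hexdigest()

def verify(cert_id: str, cert_bytes: bytes) -> bool:
    """A verifier recomputes the digest and compares it to the anchor."""
    return ledger.get(cert_id) == hashlib.sha256(cert_bytes).hexdigest()

issue("BSc-2020-001", b"Jane Doe, BSc Computer Science, 2020")
print(verify("BSc-2020-001", b"Jane Doe, BSc Computer Science, 2020"))  # True
print(verify("BSc-2020-001", b"Jane Doe, MSc Computer Science, 2020"))  # False
```

Any tampering with the document changes the digest, so the on-ledger anchor makes the holder independent of the issuing organisation for verification, which is the dependency problem the abstract identifies.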

Keywords: blockchain, degree verification, higher education certificates, Hyperledger Iroha

Procedia PDF Downloads 141
1700 Hedonic Price Analysis of Consumer Preference for Musa spp in Northern Nigeria

Authors: Yakubu Suleiman, S. A. Musa

Abstract:

The research was conducted to determine the physical characteristics of banana fruits that influence consumer preferences for the fruit in Northern Nigeria. Socio-economic characteristics of the respondents were also identified. Simple descriptive statistics and a hedonic price model were used to analyze the socio-economic and consumer-preference data respectively, collected with the aid of 1,000 structured questionnaires. The result revealed an R2 value of 0.633, meaning that 63.3% of the variation in banana price was explained by the explanatory variables included in the model: colour, size, degree of ripeness, softness, surface blemish, cleanliness of the fruits, weight, length, and cluster size of fruits. The remaining 36.7% could be attributed to the error term or random disturbance in the model. The calculated intercept was 1886.5 and was statistically significant (P < 0.01), meaning that about N1886.5 worth of banana fruits could be bought by consumers without considering the variables of banana included in the model. Consumers showed significant preferences for colour, size, degree of ripeness, softness, weight, length, and cluster size of banana fruits, significant at P < 0.01, P < 0.05, or P < 0.1. In contrast, consumers did not show significant preferences for surface blemish, cleanliness, and variety of the banana fruit, all of which were non-significant with negative signs. Based on the findings of the research, it is recommended that plant breeders and research institutes concentrate on the production of banana fruits that have the physical characteristics found to be statistically significant, such as cluster size, degree of ripeness, softness, length, size, and skin colour.
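A hedonic price model regresses price on product attributes, so the estimation step can be sketched on synthetic data; the attribute scores and true coefficients below are hypothetical, with only the intercept chosen to echo the value reported in the abstract:

```python
import numpy as np

# Hypothetical survey rows: price regressed on five fruit attributes
# (colour, size, ripeness, softness, cluster size), scored on [0, 1].
rng = np.random.default_rng(7)
n = 1000
attrs = rng.uniform(0, 1, size=(n, 5))
price = (1886.5                                    # baseline price (intercept)
         + attrs @ np.array([120.0, 80.0, 150.0, 60.0, 90.0])
         + rng.normal(0, 40, size=n))              # random disturbance

# Hedonic regression: OLS of price on attributes plus an intercept.
X = np.column_stack([np.ones(n), attrs])
beta, *_ = np.linalg.lstsq(X, price, rcond=None)
pred = X @ beta
r2 = 1 - ((price - pred) ** 2).sum() / ((price - price.mean()) ** 2).sum()
print(beta[0].round(1), r2.round(3))
```

Each fitted slope is the implicit price of one attribute, and the intercept is the price of a fruit with all attributes at their baseline, which is how the abstract interprets the 1886.5 figure.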

Keywords: analysis, consumers, preference, variables

Procedia PDF Downloads 343
1699 Simulation-Based Validation of Safe Human-Robot-Collaboration

Authors: Titanilla Komenda

Abstract:

Human-machine-collaboration defines a direct interaction between humans and machines to fulfil specific tasks. Those so-called collaborative machines are used without fencing and interact with humans in predefined workspaces. Even though human-machine-collaboration enables a flexible adaption to variable degrees of freedom, industrial applications are rarely found. The reason for this is not a lack of technical progress but rather limitations in planning processes for ensuring operator safety. Until now, humans and machines were mainly considered separately in the planning process, focusing on ergonomics and system performance respectively. Within human-machine-collaboration, those aspects must not be seen in isolation from each other but rather need to be analysed in interaction. Furthermore, a simulation model is needed that can validate the system performance and ensure the safety of the operator at any given time. Following on from this, a holistic simulation model is presented, enabling a simulative representation of collaborative tasks, including both humans and machines. The presented model includes not only a geometry and a motion model of interacting humans and machines but also a numerical behaviour model of humans as well as a Boolean probabilistic sensor model. With this, error scenarios can be simulated by validating system behaviour in unplanned situations. As these models can be defined on the basis of a Failure Mode and Effects Analysis as well as probabilities of errors, the implementation in a collaborative model is discussed and evaluated regarding limitations and simulation times. The functionality of the model is shown on industrial applications by comparing simulation results with video data. The analysis shows the impact of considering human factors in the planning process in contrast to only meeting system performance.
In this sense, an optimisation function is presented that meets the trade-off between human and machine factors and aids in a successful and safe realisation of collaborative scenarios.
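The idea of driving error scenarios from FMEA-style failure probabilities through a Boolean sensor model can be sketched as a small Monte Carlo simulation; all probabilities below are invented for illustration:

```python
import random

# FMEA-style inputs (illustrative values, not from the paper):
P_INTRUSION = 0.05      # human enters the robot workspace during a cycle
P_SENSOR_MISS = 0.01    # Boolean safety sensor fails to detect the intrusion

def simulate(cycles=100_000, seed=0):
    """Estimate the per-cycle probability of an undetected intrusion."""
    rng = random.Random(seed)
    unsafe = 0
    for _ in range(cycles):
        intrusion = rng.random() < P_INTRUSION
        detected = intrusion and rng.random() >= P_SENSOR_MISS
        if intrusion and not detected:
            unsafe += 1
    return unsafe / cycles

print(simulate())  # close to P_INTRUSION * P_SENSOR_MISS
```

Running many such randomized cycles is what lets a planner validate system behaviour in unplanned situations before any hardware is deployed, at the cost of the simulation times the abstract mentions.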

Keywords: human-machine-system, human-robot-collaboration, safety, simulation

Procedia PDF Downloads 361
1698 Dynamics of India's Nuclear Identity

Authors: Smita Singh

Abstract:

Through a constructivist perspective, this paper explores the transformation of India's nuclear identity from an irresponsible nuclear weapon power to a 'de-facto nuclear power' in the emerging international nuclear order. From a nuclear abstainer to a bystander and finally a 'de facto nuclear weapon state', India has put forth its case as a unique and exceptional nuclear power, as opposed to Iran, Iraq, and North Korea, states with similar nuclear ambitions that have been snubbed as 'rogue states' by the international community. This paper investigates the reasons behind the international community's gradual acceptance of India's nuclear weapons capabilities and nuclear identity after the Indo-U.S. Nuclear Deal. The central concept of analysis in this paper is the inter-subjective nature of identity in the nuclear arena. India's nuclear behaviour has been discursively constituted by India through evolving images of the 'self' and the 'other.' India's sudden heightened global status is not solely the consequence of its 1998 nuclear tests but a calibrated projection as a responsible stakeholder in other spheres such as economic potential, market prospects, and democratic credentials. By examining India's nuclear discourse, this paper contends that India has used its material and discursive power to present a striking image as a responsible nuclear weapon power (though not yet a legal nuclear weapon state as per the NPT). By historicising India's nuclear trajectory through an inter-subjective analysis of identities, this paper moves a step ahead in providing a theoretical interpretation of state actions and nuclear identity construction.

Keywords: nuclear identity, India, constructivism, international stakeholder

Procedia PDF Downloads 439
1697 Utilization of Composite Components for Land Vehicle Systems: A Review

Authors: Kivilcim Ersoy, Cansu Yazganarikan

Abstract:

In recent years, composite materials have been utilized more frequently not only in aviation but also in the automotive industry, due to their high strength-to-weight ratio, fatigue and corrosion resistance, and better performance in specific environments. The market also favors lightweight design for wheeled and tracked armored vehicles because of the increased demand for land and amphibious mobility. This study presents the current application areas and trends in the automotive, bus, and armored land vehicle industries. In addition, potential utilization areas of fiber composite and hybrid material concepts are addressed. The work starts with a survey of current applications and patent trends of composite materials in the automotive and land vehicle industries. An intensive investigation is conducted to determine the potential of these materials for application in the land vehicle industry, where small series production dominates and challenging requirements apply. Finally, potential utilization areas for combat land vehicle systems are proposed. By implementing these lightweight solutions with alternative materials and design concepts, it is possible to achieve drastic weight reduction, enabling both land and amphibious mobility without compromising stiffness and survivability.

Keywords: land vehicle, composite, light-weight design, armored vehicle

Procedia PDF Downloads 464
1696 Cognitive Development Theories as Determinant of Children's Brand Recall and Ad Recognition: An Indian Perspective

Authors: Ruchika Sharma

Abstract:

In the past decade, there has been an explosion of research examining children's understanding of TV advertisements and their persuasive intent, the socialization of the child consumer, and child psychology. However, it is evident from the literature review that no studies in this area have covered advertising messages and their impact on children's brand recall and ad recognition. Copywriters use various creative devices to lure consumers, and highly impressionable consumers such as children face far more drastic effects of these creative means of persuasion. On the basis of Piaget's theory of cognitive development as a theoretical basis for predicting and understanding children's responses, a quasi-experiment was carried out that manipulated measurement timing and advertising messages (familiar vs. unfamiliar), keeping gender and age group as two prominent factors. The study also examines children's understanding of advertisements and their elements, predominantly language, in view of Fishbein's model. The study revealed significant associations between the above-mentioned factors and children's brand recall and ad identification. Further, to test the reliability of the findings on a larger sample, a bootstrap simulation technique was used. The simulation results are in accordance with the findings of the experiment, suggesting that the conclusions obtained from the study can be generalized to the entire children's (as consumers) market in India.
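The bootstrap step can be sketched generically: resample the observed responses with replacement many times and read confidence intervals off the resampled statistics. The recall rates below are hypothetical, not the study's data:

```python
import random

# Hypothetical outcomes: 1 = correct brand recall, 0 = miss, for children
# exposed to familiar vs. unfamiliar advertising language (invented rates).
random.seed(42)
familiar   = [1] * 78 + [0] * 22   # 78% recall in this mock sample
unfamiliar = [1] * 55 + [0] * 45   # 55% recall

def boot_ci(data, reps=5000, alpha=0.05):
    """Percentile bootstrap confidence interval for the mean recall rate."""
    means = sorted(
        sum(random.choices(data, k=len(data))) / len(data)
        for _ in range(reps)
    )
    lo = means[int(reps * alpha / 2)]
    hi = means[int(reps * (1 - alpha / 2))]
    return lo, hi

lo_f, hi_f = boot_ci(familiar)
lo_u, hi_u = boot_ci(unfamiliar)
print((lo_f, hi_f), (lo_u, hi_u))
```

If the intervals for the two conditions do not overlap, the difference observed in the sample is unlikely to be a resampling artifact, which is the sense in which bootstrapping supports generalization to a larger population.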

Keywords: advertising, brand recall, cognitive development, preferences

Procedia PDF Downloads 290
1695 Implications of Optimisation Algorithm on the Forecast Performance of Artificial Neural Network for Streamflow Modelling

Authors: Martins Y. Otache, John J. Musa, Abayomi I. Kuti, Mustapha Mohammed

Abstract:

The performance of an artificial neural network (ANN) is contingent on a host of factors, for instance, the network optimisation scheme. In view of this, the study examined the general implications of the ANN training optimisation algorithm for its forecast performance. To this end, the Bayesian regularisation (Br), Levenberg-Marquardt (LM), and adaptive-learning gradient descent with momentum (GDM) algorithms were employed under different ANN structural configurations: (1) single-hidden-layer and (2) double-hidden-layer feedforward backpropagation networks. Results revealed that the GDM optimisation algorithm, with its adaptive learning capability, generally used a relatively shorter time in both training and validation phases than the LM and Br algorithms, though learning may not be fully consummated; this held in all instances, including the prediction of extreme flow conditions for 1-day-ahead and 5-day-ahead horizons. In specific statistical terms, average model performance efficiency using the coefficient of efficiency (CE) statistic was Br: 98%, 94%; LM: 98%, 95%; and GDM: 96%, 96%, for the training and validation phases respectively. However, on the basis of relative error distribution statistics (MAE, MAPE, and MSRE), GDM performed better than the others overall. Based on the findings, the adoption of ANN for real-time forecasting should employ training algorithms that avoid the computational overhead of LM, which requires computation of the Hessian matrix, protracted time, and is sensitive to initial conditions; to this end, Br and other forms of gradient descent with momentum should be adopted, considering overall time expenditure and forecast quality as well as mitigation of network overfitting.
On the whole, it is recommended that evaluation should consider implications of (i) data quality and quantity and (ii) transfer functions on the overall network forecast performance.
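Of the three algorithms compared, gradient descent with momentum is the simplest to sketch from scratch. The following is a minimal, illustrative single-hidden-layer network trained by GDM on a synthetic streamflow-like series (the data, sizes, and hyperparameters are invented, not those of the study):

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic series standing in for streamflow: predict x[t] from x[t-1].
x = np.sin(np.linspace(0, 12, 400)) + 0.1 * rng.normal(size=400)
X, y = x[:-1, None], x[1:, None]

# Single-hidden-layer feedforward network.
W1, b1 = rng.normal(0, 0.5, (1, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros(1)
vel = [np.zeros_like(p) for p in (W1, b1, W2, b2)]
lr, mom = 0.05, 0.9

for _ in range(3000):
    h = np.tanh(X @ W1 + b1)          # forward pass
    err = (h @ W2 + b2) - y
    gW2, gb2 = h.T @ err / len(X), err.mean(0)          # backpropagation
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1, gb1 = X.T @ dh / len(X), dh.mean(0)
    for p, g, v in zip((W1, b1, W2, b2), (gW1, gb1, gW2, gb2), vel):
        v *= mom                      # momentum update: v = mom*v - lr*g
        v -= lr * g
        p += v                        # in-place parameter update

mse = float(((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2).mean())
ce = 1 - mse / float(y.var())   # coefficient of efficiency (Nash-Sutcliffe)
print(round(ce, 3))
```

Unlike LM, no Hessian (or Jacobian product) is formed anywhere: each step costs only one forward and one backward pass, which is the computational-overhead contrast the abstract draws.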

Keywords: streamflow, neural network, optimisation, algorithm

Procedia PDF Downloads 152
1694 Correlation between Cephalometric Measurements and Visual Perception of Facial Profile in Skeletal Type II Patients

Authors: Choki, Supatchai Boonpratham, Suwannee Luppanapornlarp

Abstract:

The objective of this study was to find a correlation between cephalometric measurements and visual perception of the facial profile in skeletal type II patients. In this study, 250 lateral cephalograms of female patients aged 20 to 22 years were analyzed. The profile outlines of all samples were hand traced and transformed into silhouettes by the principal investigator. Profile ratings were done by 9 orthodontists on a visual analogue scale from one to ten (increasing level of convexity). 37 hard tissue and soft tissue cephalometric measurements were analyzed by the principal investigator. All measurements were repeated after a 2-week interval for error assessment. Finally, the rankings of visual perception were correlated with the cephalometric measurements using the Spearman correlation coefficient (P < 0.05). The results show that increased facial convexity was correlated with higher values of ANB (A point, nasion and B point), AF-BF (distance from A point to B point in mm), L1-NB (distance from lower incisor to NB line in mm), anterior maxillary alveolar height, posterior maxillary alveolar height, overjet, hard tissue H angle, soft tissue H angle, and lower lip to E plane (absolute correlation values from 0.277 to 0.711). In contrast, increased facial convexity was correlated with lower values of Pg to N perpendicular and Pg to NB in mm (correlation values -0.302 and -0.294, respectively). Among the soft tissue measurements, the H angles had a higher correlation with visual perception than the facial contour angle, nasolabial angle, and lower lip to E plane. In conclusion, the findings of this study indicate that the correlation of cephalometric measurements with visual perception was less than expected: only 29% of cephalometric measurements had a significant correlation with visual perception. Therefore, diagnosis based solely on cephalometric analysis can fail to meet the patient's esthetic expectations.
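The Spearman coefficient used here is simply the Pearson correlation of the ranks, which makes it robust to the ordinal nature of VAS ratings. A minimal sketch on invented values (not the study's data):

```python
import numpy as np

def spearman(a, b):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    ra = np.argsort(np.argsort(a)).astype(float)  # ranks of a (no ties)
    rb = np.argsort(np.argsort(b)).astype(float)  # ranks of b (no ties)
    return float(np.corrcoef(ra, rb)[0, 1])

# Hypothetical values: VAS convexity ratings vs. ANB angle for 8 profiles.
vas = np.array([2.0, 3.5, 5.0, 6.5, 4.0, 8.0, 7.0, 9.0])
anb = np.array([1.5, 2.0, 4.5, 5.0, 3.0, 7.5, 6.0, 8.5])
print(round(spearman(vas, anb), 3))  # monotone relation, so close to 1
```

Because only rank order matters, a measurement like ANB can correlate strongly with perceived convexity even if the relationship is nonlinear, which suits the VAS ranking design described above. (With tied values, the double-argsort ranking would need averaged ranks.)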

Keywords: cephalometric measurements, facial profile, skeletal type II, visual perception

Procedia PDF Downloads 138
1693 Concept for Determining the Focus of Technology Monitoring Activities

Authors: Guenther Schuh, Christina Koenig, Nico Schoen, Markus Wellensiek

Abstract:

Identification and selection of appropriate product and manufacturing technologies are key factors for the competitiveness and market success of technology-based companies. Therefore, many companies perform technology intelligence (TI) activities to ensure the identification of evolving technologies at the right time. Technology monitoring is one of the three base activities of TI, besides scanning and scouting. As technological progress accelerates, more and more technologies are being developed. Against the background of limited resources, it is therefore necessary to focus TI activities. In this paper, we propose a concept for defining appropriate search fields for technology monitoring. This limitation of the search space leads to more concentrated monitoring activities. The concept is introduced and demonstrated through an anonymized case study conducted within an industry project at the Fraunhofer Institute for Production Technology. The described concept provides a customized monitoring approach suitable for technology-oriented companies, especially those that have not yet defined an explicit technology strategy. It is shown in this paper that the definition of search fields and search tasks is a suitable method for defining topics of interest and thus directing monitoring activities. Current as well as planned product, production, and material technologies, together with existing skills, capabilities, and resources, form the basis of the described derivation of relevant search areas. To further improve the concept of technology monitoring, the proposed concept should be extended in future research, e.g., by the definition of relevant monitoring parameters.

Keywords: monitoring radar, search field, technology intelligence, technology monitoring

Procedia PDF Downloads 474
1692 The Role of Health Tourism in Enhancing the Quality of life and Cultural Transmission in Developing Countries

Authors: Fatemeh Noughani, Seyd Mehdi Sadat

Abstract:

Medical tourism, or travel therapy, is travelling from one country to another to undergo medical treatment, utilizing health factors of the natural sector such as mineral water springs. From the 1990s, medical tourism developed and grew around the world because of factors such as globalization and free trade in health services and changes in exchange rates in the world economy (which increased the desirability of Asian countries as medical tourist destinations), to the point that there is currently close competition in this field among countries famous for medical services, each seeking a desirable place in the world's medical tourism market, a complicated industry that has grown in a short time. Tourism is an attractive industry and a good support for the economy of Iran; merging oil earnings and the tourism industry would be better and more constructive than setting them against each other. Moving from an oil economy toward a tourism economy, especially medical tourism, should be one of the prospects of Iran's government, with tourism providing a few percent of the country's yearly earnings alongside the oil industry. Among the achievements of medical tourism are the prevention of brain drain to other countries, an increase in the employment rate for healthcare staff, and an increase in the country's foreign exchange earnings from tourists' stays, followed by an improved quality of life and cultural transmission as well as the empowerment of medical human resources.

Keywords: developing countries, health tourism, quality of life, cultural transmission

Procedia PDF Downloads 436
1691 A Trend Based Forecasting Framework of the ATA Method and Its Performance on the M3-Competition Data

Authors: H. Taylan Selamlar, I. Yavuz, G. Yapar

Abstract:

It is difficult to make predictions, especially about the future, and making accurate predictions is not always easy. However, better predictions remain the foundation of all science; therefore, the development of accurate, robust, and reliable forecasting methods is very important. Numerous forecasting methods have been proposed and studied in the literature. Two major forecasting approaches still dominate: Box-Jenkins ARIMA and exponential smoothing (ES), and new methods continue to be derived from or inspired by them. After more than 50 years of widespread use, exponential smoothing is still one of the most practically relevant forecasting methods available due to its simplicity, robustness, and accuracy as an automatic forecasting procedure, notably in the famous M-Competitions. Despite its success and widespread use in many areas, ES models have some shortcomings that negatively affect forecast accuracy. Therefore, this study proposes a new forecasting method, called the ATA method, to cope with these shortcomings. This new method is obtained from traditional ES models by modifying the smoothing parameters; both methods therefore have similar structural forms, and ATA can easily be adapted to each of the individual ES models, yet ATA has many advantages due to its innovative new weighting scheme. In this paper, the focus is on modeling the trend component and handling seasonality patterns by utilizing classical decomposition. The ATA method is therefore expanded to higher-order ES methods for additive, multiplicative, additive damped, and multiplicative damped trend components. The proposed models are called ATA trended models, and their predictive performance is compared to that of their counterpart ES models on the M3-competition data set, since it is still the most recent and comprehensive time-series data collection available.
It is shown that the models outperform their counterparts on almost all settings, and when a model selection is carried out amongst these trended models, ATA outperforms all of the competitors in the M3-competition for both short-term and long-term forecasting horizons when the models' forecasting accuracies are compared on popular error metrics.
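The modified weighting scheme is not detailed in the abstract, but the contrast with classical ES can be sketched for the level-only case, as we understand the idea: simple exponential smoothing weights each new observation by a fixed alpha, whereas an ATA-style weight p/t shrinks as more observations arrive. The data and the p=1 choice below are illustrative only:

```python
# Level-only contrast between simple exponential smoothing (SES) and an
# ATA-style time-varying weight (a sketch of the idea, not the full method).
def ses(series, alpha=0.3):
    s = series[0]
    for x in series[1:]:
        s = alpha * x + (1 - alpha) * s   # fixed smoothing weight
    return s

def ata_level(series, p=1):
    s = series[0]
    for t, x in enumerate(series[1:], start=2):
        w = p / t if t > p else 1.0       # weight decays with sample size t
        s = w * x + (1 - w) * s
    return s

data = [10.0, 12.0, 11.0, 13.0, 12.5, 14.0]
print(ses(data), ata_level(data))
```

With p = 1 the recursion reduces exactly to the running mean, which illustrates how the time-varying weights interpolate between naive averaging and aggressive smoothing; the trended and damped variants compared in the paper build additional components on top of such a level recursion.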

Keywords: accuracy, exponential smoothing, forecasting, initial value

Procedia PDF Downloads 177
1690 Surveillance of Adverse Events Following Immunization during New Vaccines Introduction in Cameroon: A Cross-Sectional Study on the Role of Mobile Technology

Authors: Andreas Ateke Njoh, Shalom Tchokfe Ndoula, Amani Adidja, Germain Nguessan Menan, Annie Mengue, Eric Mboke, Hassan Ben Bachir, Sangwe Clovis Nchinjoh, Yauba Saidu, Laurent Cleenewerck De Kiev

Abstract:

Vaccines play a great role in protecting populations globally. Vaccine products are subject to rigorous quality control and approval before use to ensure safety. Even when all actors take the required precautions, some people may still experience adverse events following immunization (AEFI), caused either by the vaccine composition or by an error in its administration. AEFI underreporting is pronounced in low-income settings like Cameroon. The country introduced electronic platforms to strengthen surveillance. With the introduction of many novel vaccines, like the COVID-19 vaccines and the novel oral polio vaccine 2 (nOPV2), there was a need to monitor AEFI in the country. A cross-sectional study was conducted from July to December 2022. Data on AEFI per region of Cameroon were reviewed for the past five years. Data were analyzed with MS Excel, and the results were presented as proportions. AEFI reporting was uncommon in Cameroon. With the introduction of novel vaccines in 2021, the health authorities deployed new tools and training to capture cases. The number of AEFI detected almost doubled using the Open Data Kit (ODK) compared to previous platforms, especially following the introduction of the nOPV2 and COVID-19 vaccines. The AEFI rate was 1.9 and 160 per 100,000 administered doses of the nOPV2 and COVID-19 vaccines, respectively. This mobile tool captured individual information for people with AEFI from all regions, and the platform helped to identify common AEFI following the use of these new vaccines. ODK mobile technology was vital in improving AEFI reporting and providing data for monitoring the use of new vaccines in Cameroon.
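The reported rates are cases per 100,000 administered doses, a computation worth making explicit; the dose and case counts below are hypothetical, chosen only to reproduce the reported rates:

```python
# AEFI surveillance rate: cases per 100,000 administered doses.
def aefi_rate(cases: int, doses: int) -> float:
    return cases / doses * 100_000

# Hypothetical counts that would yield the reported rates of 1.9 (nOPV2)
# and 160 (COVID-19) per 100,000 doses.
print(aefi_rate(19, 1_000_000))    # close to 1.9
print(aefi_rate(1600, 1_000_000))  # close to 160
```

Normalizing by doses rather than reporting raw counts is what makes the two vaccines, given very different rollout volumes, comparable in the abstract.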

Keywords: adverse events following immunization, Cameroon, COVID-19 vaccines, nOPV, ODK

Procedia PDF Downloads 88
1689 Creating Database and Building 3D Geological Models: A Case Study on Bac Ai Pumped Storage Hydropower Project

Authors: Nguyen Chi Quang, Nguyen Duong Tri Nguyen

Abstract:

This article is a first step toward researching and outlining the structure of a geotechnical database for the geological survey of a power project; in the context of this report, the database has been created for the Bac Ai pumped storage hydropower project. To provide a method of organizing and storing geological and topographic survey data and experimental results in a spatial database, the RockWorks software is used, bringing optimal efficiency to the process of exploiting, using, and analyzing data in service of the design work in power engineering consulting. Three-dimensional (3D) geotechnical models are created from the survey data, covering stratigraphy, lithology, porosity, etc. The resulting 3D geotechnical model for the Bac Ai pumped storage hydropower project comprises six closely stacked stratigraphic formations built with the Horizons method, whereas the modeling of engineering geological parameters is performed by geostatistical methods. Accuracy and reliability are assessed through error statistics, empirical evaluation, and expert methods. The three-dimensional model analysis allows better visualization of volumetric calculations, excavation and backfilling of the lake area, tunneling of power pipelines, and calculation of on-site construction material reserves. In general, the application of engineering geological modeling makes the design work more intuitive and comprehensive, helping construction designers better identify and offer the most optimal design solutions for the project. The database always ensures updating and synchronization and enables 3D modeling of geological and topographic data to be integrated with the design data according to building information modeling (BIM). This is also the base platform for BIM & GIS integration.

Keywords: database, engineering geology, 3D Model, RockWorks, Bac Ai pumped storage hydropower project

Procedia PDF Downloads 168
1688 Audit Quality and Audit Regulation in European Union: A Perspective, Considering Actual and Perception Based Measures

Authors: Daniela Monteiro

Abstract:

Considering the entry into force of the new EU audit reform regarding statutory auditors, in effect in all member states since 2016, this research aims to identify which audit regulation rules are associated with high audit quality on both of its dimensions, i.e., actual quality and perceived quality, in relation to public interest entities within the European Union, and whether those rules have the same impact on both dimensions. Audit quality was measured through the following proxies: the quality of financial information, via earnings management, and the impact of qualified opinions on financial costs. The research considered regulation subjects such as auditor rotation, the provision of non-audit services (NAS), and the level of market concentration; the criterion for including these issues was their coverage by the new rules. We studied the period before the audit reform (2009-2015), when regulation measures were less uniform. Given the consideration of both dimensions of audit quality and several regulation measures, together with the involvement of the first 15 member states of the European Union, we believe our conclusions constitute an important contribution to this research field. The results consolidate the assumption that the balance between competence and independence is not the only challenge related to the regulation of the audit profession; the evidence demonstrates that the balance between actual and perceived quality is also a relevant matter. The major conclusion is that the challenge is to keep both actual and perceived audit quality balanced whilst ensuring the independence and competence of auditors.

Keywords:

Procedia PDF Downloads 181
1687 Advanced Driver Assistance System: Veibra

Authors: C. Fernanda da S. Sampaio, M. Gabriela Sadith Perez Paredes, V. Antonio de O. Martins

Abstract:

Today the transport sector is undergoing a revolution; with the rise of Advanced Driver Assistance Systems (ADAS), industry and society itself will undergo a major transformation. However, the technological development of these applications is a challenge that requires new techniques and substantial machine learning and artificial intelligence capabilities. This study proposes to develop a vehicular perception system called Veibra, which consists of two front cameras for day/night viewing and an embedded device capable of running the YOLOv2 image processing algorithm at low computational cost. The strategic version for the market assists the driver on the road with day/night detection of objects such as road signs, pedestrians, and animals, which are viewed on the screen of a phone or tablet through an application. The system can perform real-time driver detection and recognition, identifying muscle movements and the pupils to determine whether the driver is tired or inattentive, analyzing characteristic changes in the driver, following the subtle movements of the whole face, and issuing alerts based on beta waves to ensure the driver's concentration and attention. The system will also be able to perform tracking and monitoring through GSM (Global System for Mobile Communications) technology and the cameras installed in the vehicle.
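The abstract does not give implementation details of the YOLOv2 pipeline; as a hedged illustration, the sketch below shows the intersection-over-union and non-maximum-suppression post-processing step common to YOLO-style detectors. The box coordinates and scores are invented, not outputs of the Veibra system:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, thresh=0.5):
    """Keep the highest-scoring boxes, dropping overlapping duplicates."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        i = order.pop(0)
        keep.append(i)
        order = [j for j in order if iou(boxes[i], boxes[j]) < thresh]
    return keep

# Two nearly identical detections of the same pedestrian plus one distinct sign
boxes = [(10, 10, 50, 50), (12, 12, 52, 52), (100, 100, 140, 140)]
scores = [0.9, 0.8, 0.7]
print(nms(boxes, scores))  # the overlapping second box is suppressed
```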

Keywords: advanced driver assistance systems, tracking, traffic signal detection, vehicle perception system

Procedia PDF Downloads 155
1686 Evaluating Environmental Impact of End-of-Life Cycle Cases for Brick Walls and Aerated Autoclave Concrete Walls

Authors: Ann Mariya Jose, Ashfina T.

Abstract:

Construction and demolition waste is a rising global concern due to the amount of waste generated annually, the area taken up by landfills, and the adverse environmental impacts that follow. One of the primary causes of this rise is a lack of facilities and knowledge for incorporating recycled materials into new construction. Bricks are a conventional material that has been used for construction for centuries, while Autoclaved Aerated Concrete (AAC) blocks are a newly emergent material in the market. This study evaluates the environmental impact of brick walls and AAC block walls using the tool One Click LCA, considering three End of Life (EoL) scenarios: the materials are landfilled, recycled, or reused in a new building. The final objective is to evaluate the impact of these two wall types on environmental factors such as Global Warming Potential (GWP), Acidification Potential (AP), Eutrophication Potential (EP), Ozone Depletion Potential (ODP), and Photochemical Ozone Creation Potential (POCP). The findings revealed that the GWP caused by landfilling is 16 times higher for bricks and 22 times higher for AAC blocks when compared to reuse of the materials. The study recommends the effective use of AAC blocks in construction and their subsequent reuse to reduce overall emissions to the environment.

Keywords: construction and demolition waste, environmental impact, life cycle impact assessment, material recycling

Procedia PDF Downloads 105
1685 Loss Function Optimization for CNN-Based Fingerprint Anti-Spoofing

Authors: Yehjune Heo

Abstract:

As biometric systems become widely deployed, identification systems can easily be attacked with various spoof materials. This paper contributes to finding a reliable and practical anti-spoofing method using Convolutional Neural Networks (CNNs), based on the choice of loss functions and optimizers. The CNNs used in this paper include AlexNet, VGGNet, and ResNet. By using various loss functions, including Cross-Entropy, Center Loss, Cosine Proximity, and Hinge Loss, and various optimizers, including Adam, SGD, RMSProp, Adadelta, Adagrad, and Nadam, we obtained significant performance changes. We find that choosing the correct loss function for each model is crucial, since different loss functions lead to different errors on the same evaluation. Using a subset of the LivDet 2017 database, we validate our approach by comparing generalization power. It is important to note that we use a subset of LivDet and that the database is the same across all training and testing for each model; this way, we can compare performance, in terms of generalization, on unseen data across all models. The best CNN (AlexNet) with the appropriate loss function and optimizer results in more than 3% performance gain over the other CNN models with the default loss function and optimizer. In addition to the highest generalization performance, this paper also reports parameter counts and mean average error rates for the high-accuracy models, in order to find the model that consumes the least memory and computation time for training and testing. Although AlexNet has lower complexity than the other CNN models, it proved to be very efficient. For practical anti-spoofing systems, the deployed version should use a small amount of memory and run very fast with high anti-spoofing performance. For our deployed version on smartphones, additional processing steps, such as quantization and pruning algorithms, have been applied to our final model.
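To illustrate why the choice of loss function matters, the sketch below scores the same prediction with two different losses; the class probabilities are invented, and the hinge variant is a simplified toy formulation, not the exact training losses used in the paper:

```python
import math

# One-hot target and softmax-style probabilities from a hypothetical 3-class CNN
y = [0.0, 1.0, 0.0]
p = [0.2, 0.7, 0.1]

def cross_entropy(y, p):
    """Categorical cross-entropy: -sum over classes of t * log(q)."""
    return -sum(t * math.log(q) for t, q in zip(y, p) if t > 0)

def hinge(y, p):
    """Toy multi-class hinge: map targets/probabilities to [-1, 1] scores first."""
    return sum(max(0.0, 1.0 - (2 * t - 1) * (2 * q - 1)) for t, q in zip(y, p))

print(round(cross_entropy(y, p), 4), round(hinge(y, p), 4))
```

The two losses assign different penalties to the identical prediction, which is the mechanism behind the paper's observation that the same evaluation yields different errors under different loss functions.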

Keywords: anti-spoofing, CNN, fingerprint recognition, loss function, optimizer

Procedia PDF Downloads 136
1684 Quantitative Evaluation of Supported Catalysts Key Properties from Electron Tomography Studies: Assessing Accuracy Using Material-Realistic 3D-Models

Authors: Ainouna Bouziane

Abstract:

The ability of electron tomography to recover the 3D structure of catalysts, with spatial resolution at the subnanometer scale, has been widely explored and reviewed in recent decades. A variety of experimental techniques, based either on Transmission Electron Microscopy (TEM) or Scanning Transmission Electron Microscopy (STEM), have been used to reveal different features of nanostructured catalysts in 3D, but High Angle Annular Dark Field imaging in STEM mode (HAADF-STEM) stands out as the most frequently used, given its chemical sensitivity and avoidance of imaging artifacts related to diffraction phenomena when dealing with crystalline materials. In this regard, our group has developed a methodology that combines image denoising by undecimated wavelet transforms (UWT) with automated, advanced segmentation procedures and parameter selection methods using CS-TVM (compressed sensing total-variation minimization) algorithms, to extract more reliable quantitative information from 3D characterization studies. However, evaluating the accuracy of the magnitudes estimated from the segmented volumes is an important issue that has not yet been properly addressed, because a perfectly known reference is needed; the problem becomes particularly complicated for multicomponent material systems. To tackle this key question, we have developed a methodology that incorporates volume reconstruction and segmentation methods. In particular, we have established an approach to evaluate, in quantitative terms, the accuracy of TVM reconstructions, considering the influence of relevant experimental parameters such as the range of tilt angles, image noise level, and object orientation. The approach is based on the analysis of material-realistic 3D phantoms, which include the most relevant features of the system under analysis.
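The CS-TVM algorithms themselves are beyond the scope of an abstract, but the total-variation regularization idea can be sketched on a 1D signal. This toy gradient-descent version (the regularization weight, step size, and test signal are assumptions, not the authors' implementation) flattens noise while preserving the edge:

```python
def tv_denoise(signal, lam=0.1, step=0.05, iters=300):
    """Minimize 0.5*||x - y||^2 + lam * TV(x) by (sub)gradient descent."""
    x = list(signal)
    eps = 1e-8
    for _ in range(iters):
        g = [xi - yi for xi, yi in zip(x, signal)]  # data-fidelity gradient
        for i in range(len(x) - 1):
            d = x[i + 1] - x[i]
            t = d / (abs(d) + eps)                  # smooth surrogate for sign(d)
            g[i] -= lam * t
            g[i + 1] += lam * t
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# A noisy step edge: TV regularization suppresses the noise, keeps the jump
noisy = [0.0, 0.1, -0.1, 0.05, 1.0, 0.9, 1.1, 0.95]
den = tv_denoise(noisy)
```

The same principle, applied slice by slice to tilt-series projections in a compressed-sensing framework, underlies TVM-based tomographic reconstruction.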

Keywords: electron tomography, supported catalysts, nanometrology, error assessment

Procedia PDF Downloads 88
1683 In vivo Mechanical Characterization of Facial Skin Combining Digital Image Correlation and Finite Element

Authors: Huixin Wei, Shibin Wang, Linan Li, Lei Zhou, Xinhao Tu

Abstract:

Facial skin is a biomedical material with the complex mechanical properties of anisotropy, viscoelasticity, and hyperelasticity. The mechanical properties of facial skin are crucial for a number of applications, including facial plastic surgery, animation, dermatology, the cosmetic industry, and impact biomechanics. Skin is a complex multi-layered material which can be broadly divided into three main layers: the epidermis, the dermis, and the hypodermis. Collagen fibers account for 75% of the dry weight of dermal tissue, and it is these fibers that are responsible for the mechanical properties of skin. Much research on the anisotropic mechanical properties has concentrated on in vitro testing, but the mechanical properties of skin differ greatly between in vivo and in vitro conditions. In this study, we present a method to measure the mechanical properties of facial skin in vivo. Digital image correlation (DIC) and indentation tests were used to obtain the experimental data, including the deformation of the facial surface and the indentation force-displacement curve. The experiment was then simulated using a finite element (FE) model. Computed Tomography (CT) and reconstruction techniques were applied to obtain the real tissue geometry, yielding a three-dimensional FE model of facial skin as a bi-layer system. As the epidermis is relatively thin, the epidermis and dermis were regarded as one layer in this study, with the hypodermis below it. The upper layer was modeled with a Gasser-Ogden-Holzapfel (GOH) model to describe the hyperelastic and anisotropic behavior of the dermis; the lower layer was modeled as linear elastic. Finally, the material properties of the two layers were determined by minimizing the error between the FE results and the experimental data.
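The final inverse step, fitting material parameters by minimizing the FE-versus-experiment error, can be sketched with a toy linear model standing in for the GOH law; the model form, the geometry factor k, and the indentation data below are all hypothetical:

```python
def model_force(E, depths, k=2.0):
    """Toy stand-in for the FE model: indentation force grows linearly with
    depth and stiffness E; k lumps the geometric factors."""
    return [E * k * d for d in depths]

def sse(E, depths, measured):
    """Sum of squared errors between model prediction and experiment."""
    return sum((f - m) ** 2 for f, m in zip(model_force(E, depths), measured))

depths = [0.1, 0.2, 0.3, 0.4]            # indentation depths (mm), hypothetical
measured = [0.011, 0.019, 0.032, 0.041]  # measured forces (N), hypothetical

# Grid search over candidate stiffness values, keeping the one with minimal error
candidates = [0.01 + 0.005 * i for i in range(20)]
best = min(candidates, key=lambda E: sse(E, depths, measured))
```

In the actual study, each error evaluation would require a full FE solve with the GOH material law, and a proper optimizer would replace the grid search, but the minimize-the-misfit structure is the same.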

Keywords: facial skin, indentation test, finite element, digital image correlation, computed tomography

Procedia PDF Downloads 112
1682 The Effect of Low Power Laser on CK and Some of Markers Delayed Onset Muscle Soreness (DOMS)

Authors: Bahareh Yazdanparast Chaharmahali

Abstract:

The study examined the effect of low power laser therapy on knee range of motion (flexion and extension), resting angle of the knee joint, knee circumference, and ratings of delayed onset muscle soreness (DOMS) induced pain, 24 and 48 hours after eccentric training of the knee flexor muscles (hamstrings). Twenty volunteers among the female students of the college voluntarily participated in this research. On day 1, in order to induce delayed onset muscle soreness, subjects eccentrically trained their knee flexor muscles. On day 2, subjects were randomly divided into two groups: control and low power laser therapy. At 24 and 48 hours after eccentric training, the variables (knee flexion and extension range of motion, resting knee joint angle, and knee circumference) were measured and analyzed. Data are reported as means ± standard error (SE), and repeated measures analysis was used to assess differences within groups. The treatment (low power laser therapy) had significant effects on the markers of delayed onset muscle soreness: at 24 and 48 hours after training, a significant difference was observed between the mean pain ratings of the two groups, with the Bonferroni post hoc test significant between the laser therapy and control groups. Low power laser therapy as used in this study significantly diminished the effects of delayed onset muscle soreness on swelling and on relaxed knee extension and flexion angles.

Keywords: creatine kinase, DOMS, eccentric training, low power laser

Procedia PDF Downloads 246
1681 Steps towards the Development of National Health Data Standards in Developing Countries

Authors: Abdullah I. Alkraiji, Thomas W. Jackson, Ian Murray

Abstract:

Health data standards today are proliferating in a somewhat overlapping and conflicting manner, resulting in market confusion and increasing proprietary interests. The government's role and support in standardization of health data are thought to be crucial in order to establish credible standards for the next decade, to maximize interoperability across the health sector, and to decrease the risks associated with the implementation of non-standard systems. The normative literature has not explored the different steps the government must undertake toward the development of national health data standards. Based on lessons learned from a qualitative study investigating issues in the adoption of health data standards in the major tertiary hospitals in Saudi Arabia, together with opinions and feedback from experts in data exchange, standards, and medical informatics in Saudi Arabia and the UK, a list of steps required for the development of national health data standards was constructed. The main steps are the existence of a national formal reference for health data standards, an agreed national strategic direction for medical data exchange, a national medical information management plan, and a national accreditation body; more important still is change management at the national and organizational levels. The outcome of this study can be used by academics and practitioners to plan the development of health data standards, in particular in developing countries.

Keywords: interoperability, medical data exchange, health data standards, case study, Saudi Arabia

Procedia PDF Downloads 338
1680 Hydrothermal Energy Application Technology Using Dam Deep Water

Authors: Yooseo Pang, Jongwoong Choi, Yong Cho, Yongchae Jeong

Abstract:

The climate crisis and related environmental problems of energy supply are emerging issues, so the use of renewable energy is essential to solve them; these problems are mainly governed by the Paris Agreement, the international treaty on climate change. The government of the Republic of Korea announced that its key long-term goal for a low-carbon strategy is carbon neutrality by 2050. Attention is focused on the role of internet data centers (IDCs), which manage large amounts of data, such as the artificial intelligence (AI) and big data workloads arising from the 4th industrial revolution. The cooling system market for IDCs was about 9 billion US dollars in 2020, and 15.6% annual growth is expected in Korea. It is important to control the temperature in an IDC with an efficient air conditioning system, and hydrothermal energy is one of the best options for saving energy in the cooling system. In order to save energy and optimize operating conditions, the application of a dam deep water air conditioning system has been considered: deep water drawn from a specific level of the dam can supply a constant water temperature year-round. The energy savings will be tested and analyzed with a pilot plant that has 100 RT of cooling capacity. The target of this project is a PUE (Power Usage Effectiveness) of 1.2, the key parameter for checking the efficiency of the cooling system.
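The PUE target mentioned above is a simple ratio of total facility power to IT equipment power; a short sketch makes it explicit (the load figures are hypothetical, only the 1.2 target comes from the abstract):

```python
def pue(total_facility_kw, it_load_kw):
    """Power Usage Effectiveness: total facility power over IT equipment power.
    A value of 1.0 would mean all power goes to IT; cooling adds overhead."""
    return total_facility_kw / it_load_kw

# Hypothetical loads: 1000 kW of IT gear plus 200 kW of cooling/ancillary overhead
print(pue(1200.0, 1000.0))  # meets the project's 1.2 target
```

Free cooling with constant-temperature dam water reduces the numerator (cooling overhead), which is how the system drives PUE toward the target.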

Keywords: hydrothermal energy, HVAC, internet data center, free-cooling

Procedia PDF Downloads 81
1679 Evaluation of Elements Impurities in Drugs According to Pharmacopoeia by use FESEM-EDS Technique

Authors: Rafid Doulab

Abstract:

Control of elemental impurities in the pharmaceutical industry is indispensable to ensure pharmaceutical safety with respect to 24 elements. Although atomic absorption and inductively coupled plasma techniques are used in the U.S. Pharmacopeia and the European Pharmacopoeia, FESEM with energy dispersive spectrometers (EDS) can be applied as an alternative analysis method, giving quantitative and qualitative results for a variety of elements without chemical pretreatment, unlike other techniques. This technique is characterized by short analysis time, less contamination, no reagent consumption, generation of minimal residue or waste, limited sample preparation time, and minimal analysis error. Using simple dilution for powders or direct analysis for liquids, we evaluated the usefulness of the EDS method with field emission scanning electron microscopy (FESEM, SUPRA 55, Carl Zeiss, Germany) equipped with an X-ray energy dispersive spectrometer (XFlash6l10, Bruker, Germany). The samples were analyzed directly, without coating, by applying 5 µL of a diluted sample of known concentration onto a carbon stub, with the accelerating voltage set according to sample thickness. The result for each spot is given in atomic percent and is converted, via an Avogadro-based factor, to a final result in micrograms. Conclusion and recommendation: FESEM-EDS can be applied under the U.S. Pharmacopeia and the ICH Q3D guideline as a high-precision, accurate method for analyzing elemental impurities in drugs or bulk materials and determining the permitted daily exposure (PDE) in liquid or solid specimens. It can yield better results than other techniques because it does not require complex methods or chemicals for digestion, which can interfere with the final results, and the sample can be kept for reanalysis at any time. We recommend adopting this technique in the pharmacopeias as a standard method, alongside the inductively coupled plasma techniques ICP-OES (ICP-AES) and ICP-MS.
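The abstract does not fully specify its atomic-percent-to-microgram conversion; as a hedged illustration of the standard intermediate step, the sketch below converts EDS atomic fractions to mass (weight) fractions using molar masses. The elements and percentages are invented for illustration:

```python
def atomic_to_mass_fraction(atomic_pct, molar_mass):
    """Convert atomic-percent results to mass fractions:
    w_i = x_i * M_i / sum_j(x_j * M_j)."""
    total = sum(x * m for x, m in zip(atomic_pct, molar_mass))
    return [x * m / total for x, m in zip(atomic_pct, molar_mass)]

# Hypothetical two-element spot result: 90 at% carbon matrix, 10 at% lead impurity
at_pct = [90.0, 10.0]
molar = [12.011, 207.2]  # molar masses (g/mol) for C and Pb
w = atomic_to_mass_fraction(at_pct, molar)
```

Note how a heavy impurity that is minor in atomic terms dominates in mass terms, which is why the conversion matters when comparing results against mass-based PDE limits.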

Keywords: pharmacopoeia, FESEM-EDS, element impurities, atomic concentration

Procedia PDF Downloads 116
1678 Starch Valorization: Biorefinery Concept for the Circular Bioeconomy

Authors: Maider Gómez Palmero, Ana Carrasco Pérez, Paula de la Sen de la Cruz, Francisco Javier Royo Herrer, Sonia Ascaso Malo

Abstract:

The production of bio-based products for different purposes is one of the strategies that has grown the most at the European and even global level, seeking to mitigate the impacts associated with climate change and to achieve the ambitious objectives set in this regard. However, substituting fossil-based products with bio-based products requires a challenging and deep transformation of the secondary and primary sectors and, within the latter, of the agro-industries in particular. The first step in developing a bio-based value chain is the availability of a resource with the right characteristics for the intended substitution. This, in turn, requires a significant reshaping of the forestry and agricultural sectors, but also of the agro-industry, which has relevant potential as a supplier: it can develop a robust logistical supply chain and market bio-based raw materials at competitive prices. However, this transformation may involve a profound restructuring of its traditional business model to incorporate biorefinery concepts. In this sense, agro-industries whose processes generate by-products that are currently not valorized, such as potato processing rejects or the starch found in washing water, hold a potential raw material that can be used for different bio-applications. This article explores this potential in order to evaluate the most suitable bio-applications to target and to identify opportunities and challenges.

Keywords: starch valorisation, biorefinery, bio-based raw materials, bio-applications

Procedia PDF Downloads 51
1677 Human Resource Development Strategy in Automotive Industry (Eco-Car) for ASEAN Hub

Authors: Phichak Phutrakhul

Abstract:

The purposes of this research were to study concepts and strategies of human resource development among automotive manufacturers and to articulate proposals to the government regarding human resource development for the automotive industry. A qualitative approach was adopted, using in-depth interviews in which data were collected from executives, or heads of the human resource division, of five automotive companies: Toyota Motor (Thailand) Co., Ltd., Nissan Motor (Thailand) Co., Ltd., Mitsubishi Motors (Thailand) Co., Ltd., Honda Automobile (Thailand) Co., Ltd., and Suzuki Motor (Thailand) Co., Ltd. Qualitative data analysis was performed using the inter-coder agreement technique. The research findings were as follows. The external factors included the current conditions of the automotive industry, government policy related to the automotive industry, technology, the labor market, and the country's human resource development systems. The internal factors included management, production management, organizational strategies, leadership, organizational culture, and the philosophy of human resource development. These factors gave rise to different concepts of human resource development: traditional human resource development and strategic human resource development. The organization focuses on human resources as intellectual capital and uses human resource development strategies in all development processes. These strategies will enhance the ability of human resources in both the organization and the country.

Keywords: human resource development strategy, automotive industry, eco-cars, ASEAN

Procedia PDF Downloads 471
1676 Performance Comparison and Visualization of COMSOL Multiphysics, Matlab, and Fortran for Predicting the Reservoir Pressure on Oil Production in a Multiple Leases Reservoir with Boundary Element Method

Authors: N. Alias, W. Z. W. Muhammad, M. N. M. Ibrahim, M. Mohamed, H. F. S. Saipol, U. N. Z. Ariffin, N. A. Zakaria, M. S. Z. Suardi

Abstract:

This paper presents a performance comparison of computation software for solving problems with the boundary element method (BEM). The BEM formulation is a numerical technique with high potential for solving advanced mathematical models that predict oil well production in arbitrarily shaped, multiple-lease reservoirs. The limited data validation available for ensuring that a program meets the accuracy of the mathematical model is the research motivation of this paper. Based on this limitation, three steps are involved in validating the accuracy of the oil production simulation process. In the first step, the mathematical model, a partial differential equation (PDE) of Poisson elliptic type, is identified for the BEM discretization. In the second step, the 2D BEM discretization is implemented in the COMSOL Multiphysics and MATLAB programming environments. In the last step, the numerical performance indicators for both environments are analyzed against a validated Fortran implementation. The numerical performance comparisons are investigated in terms of percentage error, comparison graphs, and 2D visualization of pressure on the oil production of a multiple-lease reservoir. According to the performance comparison, structured programming in Fortran is the preferred alternative for implementing an accurate numerical simulation of the BEM. In conclusion, both as a language for numerical computation and by the numerical performance evaluation, Fortran proves well suited for capturing the visualization of oil well production in arbitrarily shaped reservoirs.
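The percentage-error indicator used for the solver comparison can be sketched as follows; the nodal pressures are invented for illustration, not results from the study:

```python
def percentage_error(reference, computed):
    """Point-wise percentage error of one solver's output against a reference."""
    return [abs(c - r) / abs(r) * 100.0 for r, c in zip(reference, computed)]

# Hypothetical reservoir pressures (psi) at four BEM nodes:
# a validated Fortran run as reference vs. a MATLAB implementation
fortran = [3200.0, 3150.0, 3100.0, 3050.0]
matlab = [3208.0, 3141.0, 3103.0, 3046.0]
errors = percentage_error(fortran, matlab)
print([round(e, 3) for e in errors])
```

Small, uniformly distributed percentage errors across the nodes indicate the two implementations solve the same discretized PDE consistently.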

Keywords: performance comparison, 2D visualization, COMSOL multiphysic, MATLAB, Fortran, modelling and simulation, boundary element method, reservoir pressure

Procedia PDF Downloads 491
1675 An Evaluation of the Effects of Special Safeguards in Meat upon International Trade and the Brazilian Economy

Authors: Cinthia C. Costa, Heloisa L. Burnquist, Joaquim J. M. Guilhoto

Abstract:

This study identified the impact of special agricultural safeguards (SSGs) on the global meat market and the Brazilian economy. The tariff lines subject to SSGs were selected, and the period of analysis was 1995 (when the rules for SSGs were established) to 2015 (the most recent period for which there are notifications). The additional tariff value was calculated for each of the most important tariff lines. Import volumes and import price elasticities were used to estimate the impact of each estimated additional tariff on imports. Finally, the value of Brazilian meat exports forgone due to SSG taxes was calculated, as well as its impact on the country's economy, using an input-output matrix. The most important markets applying SSGs were the U.S. for beef and the European Union for poultry. However, the additional tariffs could be estimated in only two of the sixteen years in which the U.S. applied SSGs on beef imports, suggesting that their use has been enforced when the average annual price has been higher than the trigger price level. The results indicated that the value of the bovine and poultry meat that could not be exported by Brazil due to SSGs in both markets (the EU and the U.S.) was equivalent to BRL 804 million. The impact of this trade loss was about BRL 3.7 billion of the economy's production value (at 2015 prices) and almost BRL 2 billion of the Brazilian Gross Domestic Product (GDP).
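The input-output step rests on the standard Leontief relation x = (I - A)^-1 d, which maps a change in final demand to total production across sectors. The 2-sector technical coefficients below are hypothetical, so the resulting figures are illustrative only; the BRL 804 million export figure is the one from the study:

```python
def leontief_output(A, demand):
    """Solve x = (I - A)^-1 * d for a 2-sector input-output table
    via the explicit 2x2 inverse (Cramer's rule)."""
    a, b = 1 - A[0][0], -A[0][1]
    c, d = -A[1][0], 1 - A[1][1]
    det = a * d - b * c
    x0 = (d * demand[0] - b * demand[1]) / det
    x1 = (a * demand[1] - c * demand[0]) / det
    return [x0, x1]

# Hypothetical technical coefficients: sector 0 = meat, sector 1 = rest of economy
A = [[0.1, 0.2],
     [0.3, 0.4]]
lost_exports = [804.0, 0.0]  # BRL million of meat exports lost to SSGs
impact = leontief_output(A, lost_exports)
```

The total production impact exceeds the direct export loss because each unit of meat output also requires intermediate inputs from the rest of the economy, which is the multiplier effect the study captures.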

Keywords: beef, poultry meat, SSG tariff, input-output matrix, Brazil

Procedia PDF Downloads 121
1674 Data Centers’ Temperature Profile Simulation Optimized by Finite Elements and Discretization Methods

Authors: José Alberto García Fernández, Zhimin Du, Xinqiao Jin

Abstract:

Nowadays, the data center industry faces strong challenges in increasing speed and data processing capacity while keeping devices at a suitable working temperature without penalizing that capacity. Consequently, the cooling systems of these facilities use a large amount of energy to dissipate the heat generated inside the servers, and developing new cooling techniques, or perfecting existing ones, would be a great advance for this industry. Installing a matrix of temperature sensors distributed through the structure of each server would provide the data required to obtain an instantaneous temperature profile inside it. However, the number of temperature probes required to obtain temperature profiles with sufficient accuracy is very high, and the approach is expensive. Therefore, less intrusive techniques are employed, in which each point characterizing the server temperature profile is obtained by solving differential equations through simulation, simplifying data collection but increasing the time needed to obtain results. In order to reduce these calculation times, complicated and slow computational fluid dynamics simulations are replaced by simpler and faster finite element method simulations, which solve the Burgers' equations by backward, forward, and central discretization techniques after simplifying the energy and enthalpy conservation differential equations. The discretization methods employed for the first- and second-order derivatives of the resulting Burgers' equation are the key to obtaining results with greater or lesser accuracy, regardless of the characteristic truncation error.
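The combination of discretizations described above (forward in time, backward/upwind and central in space) can be sketched for the 1D viscous Burgers' equation u_t + u*u_x = nu*u_xx; the grid, time step, viscosity, and initial profile below are assumptions for illustration, not parameters from the study:

```python
def burgers_step(u, dt, dx, nu):
    """One explicit step of the 1D viscous Burgers' equation:
    forward difference in time, backward (upwind) for u*u_x,
    central for the second-derivative diffusion term."""
    n = len(u)
    new = u[:]  # boundary values stay fixed
    for i in range(1, n - 1):
        conv = u[i] * (u[i] - u[i - 1]) / dx                      # backward difference
        diff = nu * (u[i + 1] - 2 * u[i] + u[i - 1]) / dx ** 2    # central difference
        new[i] = u[i] + dt * (diff - conv)                        # forward Euler in time
    return new

# Hypothetical setup: a triangular temperature-like profile on [0, 1]
nx, dx, dt, nu = 21, 0.05, 0.001, 0.05
u = [max(0.0, 1.0 - abs(i * dx - 0.5) * 4) for i in range(nx)]
for _ in range(100):
    u = burgers_step(u, dt, dx, nu)
```

The choice of backward versus forward versus central differences for each derivative controls stability and the truncation error of the scheme, which is exactly the trade-off the abstract highlights.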

Keywords: Burgers' equations, CFD simulation, data center, discretization methods, FEM simulation, temperature profile

Procedia PDF Downloads 169