Search results for: features engineering methods for forecasting
19692 Stating Best Commercialization Method: An Unanswered Question from Scholars and Practitioners
Authors: Saheed A. Gbadegeshin
Abstract:
A commercialization method is a means of making inventions available on the market for final consumption. It is described as an important tool for keeping business enterprises sustainable and improving national economic growth. Thus, there are several scholarly publications on it, either presenting or testing different methods for commercialization. However, young entrepreneurs, technologists and scientists would like to know the best method to commercialize their innovations. Then, this question arises: What is the best commercialization method? To answer the question, a systematic literature review was conducted, and practitioners were interviewed. The literature results revealed that there are many methods, but new methods are needed to improve commercialization, especially during these times of economic crisis and political uncertainty. Similarly, the empirical results showed that there are several methods, but the best method is the one that reduces costs, reduces the risks associated with uncertainty, and improves customer participation and acceptability. Therefore, it was concluded that a new commercialization method is essential for today's high technologies, and such a method was presented.
Keywords: commercialization method, technology, knowledge, intellectual property, innovation, invention
Procedia PDF Downloads 342
19691 Assessment of the Implementation of Recommended Teaching and Evaluation Methods of NCE Arabic Language Curriculum in Colleges of Education in North Western Nigeria
Authors: Hamzat Shittu Atunnise
Abstract:
This study on the Assessment of the Implementation of Recommended Teaching and Evaluation Methods of the Nigeria Certificate in Education (NCE) Arabic Language Curriculum in Colleges of Education in North Western Nigeria was conducted with four objectives, four research questions and four null hypotheses. A descriptive survey design was used and a multistage sampling procedure adopted. Frequency counts and percentages were used to answer the research questions, and chi-square was used to test all the null hypotheses at an alpha level of significance of 0.05. Two hundred and ninety-one subjects were drawn as the sample. Questionnaires were used for data collection. The Context, Input, Process and Product (CIPP) model of evaluation was employed. The study findings indicated that there were no significant differences in the perceptions of lecturers and students from Federal and State Colleges of Education on the following: the extent to which lecturers employ appropriate methods in teaching the language and the extent to which recommended evaluation methods are utilized for the implementation of the Arabic curriculum. Based on these findings, it was recommended, among other things, that lecturers should adopt teaching methodologies that promote interactive learning; governments should ensure that information and communication technology facilities are made available and usable in all Colleges of Education; lecturers should vary their evaluation methods because other methods of evaluation can meet and surpass the level of learning and understanding which essay-type questions are believed to create; and language labs should be used in teaching Arabic in Colleges of Education because comprehensive language learning is possible through both classroom and language lab teaching.
Keywords: assessment, Arabic language, curriculum, methods of teaching, evaluation methods, NCE
Procedia PDF Downloads 60
19690 Critical Factors for Successful Adoption of Land Value Capture Mechanisms – An Exploratory Study Applied to Indian Metro Rail Context
Authors: Anjula Negi, Sanjay Gupta
Abstract:
The paradigms studied point to inadequacies of financial resources, whether to finance metro rails for construction, to meet operational revenues, or to derive profits in the long term. Funding sustainability remains elusive for much-needed public transport modes, like urban rail or metro rails, to be successfully operated. India is embarking on a sustainable transport journey and has proposed metro rail systems countrywide. As an emerging economic leader, its fiscal constraints are paramount, and the land value capture (LVC) mechanism provides necessary support and innovation toward development. India's metro rail policy promotes multiple methods of financing, including private-sector investments and public-private partnerships. The critical question that remains to be addressed is what factors can make such mechanisms work. Globally, urban rail is a revolution noted by many researchers as future mobility. Researchers in this study deep-dive, by way of literature review and empirical assessments, into factors that can lead to the adoption of LVC mechanisms. It is understood that the adoption of LVC methods is at a nascent stage in India. Research posits numerous challenges faced by metro rail agencies in raising funding and capturing incremental value. Issues pertaining to land-based financing include, inter alia, long-term financing, inter-institutional coordination, economic/market suitability, dedicated metro funds, land ownership issues, a piecemeal approach to real estate development, and property development legal frameworks. The question under probe is what parameters can lead to success in the adoption of land value capture (LVC) as a financing mechanism. This research provides insights into key parameters crucial to the adoption of LVC in the context of Indian metro rails. The researchers have studied current forms of LVC mechanisms at various metro rails of the country.
This study is significant because little research is available on the adoption of LVC applicable to the Indian context. Transit agencies, state governments, urban local bodies, policy makers and think tanks, academia, developers, funders, researchers and multilateral agencies may benefit from this research in taking LVC mechanisms forward in practice. The study deems it imperative to explore and understand the key parameters that impact the adoption of LVC. An extensive literature review and ratification by experts working in the metro rail arena were undertaken to arrive at the parameters for the study. Stakeholder consultations in an exploratory factor analysis (EFA) process were undertaken for principal component extraction. Forty-three seasoned and specialized experts, representing various types of stakeholders, rated each parameter through a semi-structured questionnaire scaled by maximum likelihood. Empirical data was collected on the eighteen chosen parameters, and significant correlations were extracted for output descriptives and inferential statistics. The study findings reveal the principal components as institutional governance framework, spatial planning features, legal frameworks, funding sustainability features and fiscal policy measures. In particular, funding sustainability features highlight the sub-variables of beneficiaries paying and the use of multiple revenue options as drivers of success in LVC adoption. The researchers recommend incorporating these variables at an early stage in design and project structuring for success in the adoption of LVC, in turn improving the revenue sustainability of a public transport asset and helping in making informed transport policy decisions.
Keywords: exploratory factor analysis, land value capture mechanism, financing metro rails, revenue sustainability, transport policy
Procedia PDF Downloads 81
19689 The Triple Interpretation of German Historicism and its Theoretical Contribution to Historical Materialism
Authors: Dandan Zhang
Abstract:
Elucidating the original relationship between historical materialism and German historicism from the internal dimension of intellectual history has important theoretical significance for a deep understanding and interpretation of the essential characteristics of historical materialism. German historicism contains a triple deduction: scientific historicism, historical relativism, and vitalism. Scientific historicism argues for history's status as a science in the name of objective, systematic, and typical research methods and procedural principles. Historical relativism places history within its specific historical background to study epistemological and methodological issues about the nature of human beings and the value of history. German historicism advances toward natural and cultural relativism on the basis of romanticism. Vitalism emphasizes the intuition, will, and experience of life of individuals and places history on the ontology of organic life and vitality. Historical materialism and German historicism are genetically related in theory: the former criticizes and surpasses the latter. Meanwhile, in the evolution of German historicism, its differences from historical materialism are essential features of historical science.
Keywords: German historicism, scientific historicism, historical relativism, vitalism, historical materialism
Procedia PDF Downloads 44
19688 Human Identification Using Local Roughness Patterns in Heartbeat Signal
Authors: Md. Khayrul Bashar, Md. Saiful Islam, Kimiko Yamashita, Yano Midori
Abstract:
Despite some progress in human authentication, conventional biometrics (e.g., facial features, fingerprints, retinal scans, gait, voice patterns) are not robust against falsification because they are neither confidential nor secret to an individual. As a non-invasive tool, the electrocardiogram (ECG) has recently shown great potential in human recognition due to its unique rhythms characterizing the variability of human heart structures (chest geometry, sizes, and positions). Moreover, ECG has a real-time vitality characteristic that signifies live signs, ensuring that a legitimate individual is identified. However, the detection accuracy of current ECG-based methods is not sufficient due to the high variability of an individual's heartbeats at different instances of time. These variations may occur due to muscle flexure, changes of mental or emotional state, and changes of sensor position or long-term baseline shift during the recording of the ECG signal. In this study, a new method is proposed for human identification, based on the extraction of the local roughness of ECG heartbeat signals. First, the ECG signal is preprocessed using a second-order band-pass Butterworth filter with cut-off frequencies of 0.00025 and 0.04. A number of local binary patterns are then extracted by applying a moving neighborhood window along the ECG signal. At each instant of the ECG signal, the pattern is formed by comparing the ECG intensities at neighboring time points with the central intensity in the moving window. Then, binary weights are multiplied with the pattern to arrive at the local roughness description of the signal. Finally, histograms are constructed that describe the heartbeat signals of the individual subjects in the database. One advantage of the proposed feature is that it does not depend on the accuracy of detecting the QRS complex, unlike conventional methods.
Supervised recognition methods are then designed using minimum-distance-to-mean and Bayesian classifiers to identify authentic human subjects. An experiment with sixty (60) ECG signals from sixty adult subjects from the PTB database of the National Metrology Institute of Germany showed that the proposed new method is promising compared to a conventional interval- and amplitude-feature-based method.
Keywords: human identification, ECG biometrics, local roughness patterns, supervised classification
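As an illustration of the pattern extraction the abstract describes, the following minimal Python sketch forms a 1-D local binary pattern at each sample (neighbors compared with the window center, weighted by powers of two) and builds the per-subject histogram. The toy signal and window radius are our assumptions, not the authors' parameters.

```python
def local_roughness_patterns(signal, radius=1):
    """1-D local binary patterns along a heartbeat signal.

    At each time point, neighbours inside a moving window are compared
    with the central sample; the resulting bits are multiplied by
    binary weights (powers of two) to form a local roughness code.
    """
    codes = []
    for i in range(radius, len(signal) - radius):
        center = signal[i]
        code, weight = 0, 1
        for offset in range(-radius, radius + 1):
            if offset == 0:
                continue                      # skip the central sample
            if signal[i + offset] >= center:  # binary comparison
                code += weight
            weight *= 2
        codes.append(code)
    return codes

def pattern_histogram(codes, n_bins):
    """Histogram of pattern codes describing one subject's heartbeats."""
    hist = [0] * n_bins
    for c in codes:
        hist[c] += 1
    return hist

# toy heartbeat-like segment (hypothetical values)
beat = [0.0, 0.2, 1.0, 0.3, -0.1, 0.05, 0.0]
codes = local_roughness_patterns(beat, radius=1)
hist = pattern_histogram(codes, n_bins=4)  # 2 neighbours -> 4 possible codes
```

Because the codes depend only on local intensity ordering, no QRS detection is needed, which is the advantage the abstract highlights.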
Procedia PDF Downloads 404
19687 Specified Human Motion Recognition and Unknown Hand-Held Object Tracking
Authors: Jinsiang Shaw, Pik-Hoe Chen
Abstract:
This paper aims to integrate human recognition, motion recognition, and object tracking technologies without requiring a pre-training database model for motion recognition or for the unknown object itself. Furthermore, it can simultaneously track multiple users and multiple objects. Unlike other existing human motion recognition methods, our approach employs a rule-based condition method to determine whether a user's hand is approaching or departing from an object. It uses a background subtraction method to separate the human and the object from the background, and employs behavior features to effectively interpret human object-grabbing actions. With an object's histogram characteristics, we are able to isolate and track it using back projection. Hence, a moving object's trajectory can be recorded and the object itself can be located. This technique can be used in a camera surveillance system in a shopping area to perform real-time intelligent surveillance, thus preventing theft. Experimental results verify the validity of the developed surveillance algorithm with an accuracy of 83% for shoplifting detection.
Keywords: automatic tracking, back projection, motion recognition, shoplifting
Procedia PDF Downloads 333
19686 Comparing the Experimental Thermal Conductivity Results Using Transient Methods
Authors: Sofia Mylona, Dale Hume
Abstract:
The main scope of this work is to compare the experimental thermal conductivity results of fluids between devices using transient techniques. A range of different liquids spanning a range of viscosities was measured with two or more devices, and the results were compared between the different methods and against reference equations wherever available. The liquids selected are those most commonly used in academic or industrial laboratories to calibrate thermal conductivity instruments, having a variety of thermal conductivities, viscosities, and densities. Three transient methods (transient hot wire, transient plane source, and transient line source) were compared on the thermal conductivity measurements taken with them. These methods were chosen as the most accurate and because they all follow the same idea: the thermal conductivity is calculated from the slope of a plot of sensor temperature rise as a function of the logarithm of time. For all measurements, the selected temperature range was from 10 to 40 °C at atmospheric pressure. Our results agree with the objections of several scientists over the reliability of the results of a few popular devices. Surprisingly, a device used in many laboratories for fast measurements of liquid thermal conductivity displayed deviations of up to 500 percent that were very poorly reproducible.
Keywords: accurate data, liquids, thermal conductivity, transient methods
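The shared idea behind the three transient methods can be shown as a short sketch of the ideal line-source model: the temperature rise is linear in ln(t), so the conductivity follows from the slope of a least-squares fit. The synthetic water-like record below is our assumption for demonstration, not the study's data.

```python
import math

def conductivity_from_transient(times_s, delta_T, q_per_length):
    """Thermal conductivity from a transient hot-wire record.

    In the ideal line-source model, dT = (q / (4*pi*lambda)) * ln(t) + C,
    so lambda follows from the slope of a least-squares fit of the
    temperature rise dT against ln(t).
    """
    x = [math.log(t) for t in times_s]
    n = len(x)
    mx = sum(x) / n
    my = sum(delta_T) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, delta_T)) \
            / sum((xi - mx) ** 2 for xi in x)
    return q_per_length / (4 * math.pi * slope)

# synthetic record for a water-like liquid (~0.6 W/m.K), q = 1 W per metre
lam_true, q = 0.6, 1.0
times = [0.1, 0.2, 0.5, 1.0, 2.0]                       # seconds
dT = [q / (4 * math.pi * lam_true) * math.log(t) + 5.0 for t in times]
lam = conductivity_from_transient(times, dT, q)         # recovers ~0.6
```

Real devices differ in sensor geometry and corrections, which is exactly where the deviations reported above arise.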
Procedia PDF Downloads 160
19685 Selection the Most Suitable Method for DNA Extraction from Muscle of Iran's Canned Tuna by Comparison of Different DNA Extraction Methods
Authors: Marjan Heidarzadeh
Abstract:
High quality and purity of DNA isolated from canned tuna are essential for species identification. In this study, the efficiency of five different methods for DNA extraction was compared: the method of the national standard of Iran, the CTAB precipitation method, the Wizard DNA Clean-Up system, NucleoSpin, and GenomicPrep. DNA was extracted from two different canned tuna products, in brine and in oil, of the same tuna species. Three samples of each type of product were analyzed with the different methods. The quantity and quality of the extracted DNA were evaluated using the absorbance at 260 nm and the A260/A280 ratio, measured with a Picodrop spectrophotometer. Results showed that DNA extraction from canned tuna preserved in different liquid media can be optimized by employing a specific DNA extraction method in each case. The best results were obtained with the CTAB method for canned tuna in oil and with the Wizard method for canned tuna in brine.
Keywords: canned tuna, PCR, DNA, DNA extraction methods, species identification
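The absorbance-based quantity and purity check above follows a standard calculation, sketched below: one absorbance unit at 260 nm corresponds to roughly 50 ng/µL of double-stranded DNA, and an A260/A280 ratio near 1.8 is conventionally read as pure DNA. The example absorbance values are hypothetical.

```python
def dna_quantity_and_purity(a260, a280, dilution=1.0):
    """Estimate dsDNA concentration and purity from UV absorbance.

    Uses the standard conversion of 1 absorbance unit at 260 nm
    ~ 50 ng/uL double-stranded DNA; an A260/A280 ratio near 1.8 is
    generally taken as 'pure' DNA (lower values suggest protein
    contamination).
    """
    concentration_ng_ul = a260 * 50.0 * dilution
    purity = a260 / a280
    return concentration_ng_ul, purity

# hypothetical readings from one extract
conc, ratio = dna_quantity_and_purity(a260=0.5, a280=0.27)
```

Comparing these two numbers across extraction methods is exactly how the study ranks CTAB, Wizard, and the other kits.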
Procedia PDF Downloads 657
19684 Predictive Modeling of Student Behavior in Virtual Reality: A Machine Learning Approach
Authors: Gayathri Sadanala, Shibam Pokhrel, Owen Murphy
Abstract:
In the ever-evolving landscape of education, Virtual Reality (VR) environments offer a promising avenue for enhancing student engagement and learning experiences. However, understanding and predicting student behavior within these immersive settings remain challenging tasks. This paper presents a comprehensive study on the predictive modeling of student behavior in VR using machine learning techniques. We introduce a rich data set capturing student interactions, movements, and progress within a VR orientation program. The dataset is divided into training and testing sets, allowing us to develop and evaluate predictive models for various aspects of student behavior, including engagement levels, task completion, and performance. Our machine learning approach leverages a combination of feature engineering and model selection to reveal hidden patterns in the data. We employ regression and classification models to predict student outcomes, and the results showcase promising accuracy in forecasting behavior within VR environments. Furthermore, we demonstrate the practical implications of our predictive models for personalized VR-based learning experiences and early intervention strategies. By uncovering the intricate relationship between student behavior and VR interactions, we provide valuable insights for educators, designers, and developers seeking to optimize virtual learning environments.
Keywords: interaction, machine learning, predictive modeling, virtual reality
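The classification side of the pipeline above can be illustrated with a deliberately tiny stand-in: a one-nearest-neighbour classifier over engineered features. The feature names, training rows, and labels are all hypothetical; the paper's actual models and data set are not specified here.

```python
import math

def predict_1nn(train_X, train_y, x):
    """Classify a student's engagement level by the nearest training sample."""
    def dist(a, b):
        return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))
    # pick the label of the closest feature vector in the training split
    _, label = min((dist(row, x), y) for row, y in zip(train_X, train_y))
    return label

# hypothetical engineered features: [minutes in VR, tasks completed, idle fraction]
train_X = [[30, 8, 0.2], [5, 1, 0.9], [28, 7, 0.3]]
train_y = ["engaged", "disengaged", "engaged"]
label = predict_1nn(train_X, train_y, [25, 6, 0.25])
```

Any of the classification models the abstract mentions would slot into the same train/test structure; the point is the feature-vector-to-label mapping.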
Procedia PDF Downloads 143
19683 Media Literacy: Information and Communication Technology Impact on Teaching and Learning Methods in Albanian Education System
Authors: Loreta Axhami
Abstract:
Media literacy in the digital age emerges not only as a set of skills for generating true knowledge and information but also as a pedagogical methodology, a kind of educational philosophy. In addition to innovations such as the integration of information and communication technologies, media infrastructures, and web usage in the educational system, media literacy enables change in learning methods, pedagogy, teaching programs, and the school curriculum itself. In this framework, this study focuses on ICT's impact on teaching and learning methods and the degree to which they are reflected in the Albanian education system. The study is based on a combination of quantitative and qualitative methods of scientific research. The study findings indicate that students' limited access to the internet at school, the focus on hardcopy textbooks, and the role of the teacher as the only or main source of knowledge and information are some of the main factors contributing to the persistence of authoritarian pedagogical methods in the Albanian education system. In these circumstances, the implementation of media literacy is recommended as an apt educational process for the 21st century, one that requires a reconceptualization of textbooks as well as the application of modern teaching and learning methods integrating information and communication technologies.
Keywords: authoritarian pedagogic model, education system, ICT, media literacy
Procedia PDF Downloads 140
19682 Socio-Technical Systems: Transforming Theory into Practice
Authors: L. Ngowi, N. H. Mvungi
Abstract:
This paper critically examines the evolution of socio-technical systems theory, its practices, and its challenges in system design and development. It examines concepts put forward by researchers focusing on the application of the theory in software engineering. Various methods have been developed that use socio-technical concepts based on systems engineering, without remarkable success. The main constraints are the large amount of data and the inefficient techniques used in applying the concepts in systems engineering to develop time-bound systems within a limited/controlled budget. This paper critically examines each of these methods, highlights bottlenecks, and suggests a way forward. Since socio-technical systems theory only explains what to do, not how to do it, engineers are not using the concept to save time and costs and to reduce the risks associated with new frameworks. Hence, a new framework, which can be considered a practical approach, is proposed that borrows concepts from the soft systems method, agile systems development, and object-oriented analysis and design to bridge the gap between theory and practice. The approach will enable the development of systems using socio-technical systems theory and encourage system engineers and software developers to use the theory in building worthwhile information systems, avoiding fragilities and hostilities in the work environment.
Keywords: socio-technical systems, human centered design, software engineering, cognitive engineering, soft systems, systems engineering
Procedia PDF Downloads 286
19681 The Incidence of Postoperative Atrial Fibrillation after Coronary Artery Bypass Grafting in Patients with Local and Diffuse Coronary Artery Disease
Authors: Kamil Ganaev, Elina Vlasova, Andrei Shiryaev, Renat Akchurin
Abstract:
De novo atrial fibrillation (AF) after coronary artery bypass grafting (CABG) is a common complication. To date, there are no data on the possible effect of diffuse coronary artery lesions on the incidence of postoperative AF. Methods. Patients operated on-pump under hypothermic conditions during the calendar year (2020) were studied. Inclusion criteria were isolated CABG and achievement of complete myocardial revascularization. Patients with a history of AF, moderate or severe valve dysfunction, hormonal thyroid pathology, or initial CHF (congestive heart failure), as well as patients with perioperative complications (MI, acute heart failure, massive blood loss) and deceased patients, were excluded. Thus, 227 patients were included; mean age 65±9 years; 69% were men. 89% of patients had 3-vessel coronary artery disease; the remainder had 2-vessel disease. Mean LV size: 3.9±0.3 cm; indexed LV volume: 29.4±5.3 mL/m2. Two groups were considered: group D (n=98), patients with diffuse coronary artery disease, and group L (n=129), patients with local coronary artery disease. Clinical and demographic characteristics of the groups were comparable. Rhythm assessment: continuous bedside ECG monitoring for up to 5 days; ECG CT at 5-7 days after CABG; daily routine ECG registration. Follow-up period: the postoperative hospital stay. Results. The median follow-up period was 9 (7;11) days. Postoperative atrial fibrillation (POAF) was detected in 61/227 (27%) patients: 34/98 (35%) in group D versus 27/129 (21%) in group L; p<0.05. Moreover, the revascularization indices in groups D and L (3.9±0.7 and 3.8±0.5, respectively) were equal, while the mean cardiopulmonary bypass (CPB) time (107±27 vs 80±13 min) and the mean ischemic time (67±17 vs 55±11 min) were significantly longer in group D (p<0.05).
However, a separate analysis of these parameters in patients with and without AF did not reveal any significant differences, either in group D (CPB time 99±21.2 min, ischemic time 63±12.2 min) or in group L (CPB time 88±13.1 min, ischemic time 58.7±13.2 min). Conclusion. With diffuse coronary lesions, the incidence of AF during the hospital period after isolated CABG definitely increases. To better understand the role of severe coronary atherosclerosis in the development of POAF, it is necessary to distinguish the influence of organic features of the atrial and ventricular myocardium (as a consequence of chronic coronary disease) from the features of surgical correction in diffuse coronary lesions.
Keywords: atrial fibrillation, diffuse coronary artery disease, coronary artery bypass grafting, local coronary artery disease
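The reported incidence difference (34/98 vs 27/129, p<0.05) can be checked with a standard two-proportion z-test. This is our illustration of the arithmetic behind such a comparison; the abstract does not state which test the authors actually used.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test for a difference between two incidence proportions."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)                       # pooled proportion
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2)) # pooled standard error
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# POAF incidence: 34/98 in the diffuse group vs 27/129 in the local group
z, p = two_proportion_z(34, 98, 27, 129)
```

The resulting z exceeds the 1.96 two-sided critical value, consistent with the p<0.05 stated in the abstract.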
Procedia PDF Downloads 212
19680 Integrating Wearable-Textiles Sensors and IoT for Continuous Electromyography Monitoring
Authors: Bulcha Belay Etana, Benny Malengier, Debelo Oljira, Janarthanan Krishnamoorthy, Lieva Vanlangenhove
Abstract:
Electromyography (EMG) is a technique used to measure the electrical activity of muscles. EMG can be used to assess muscle function in a variety of settings, including clinical, research, and sports medicine. The aim of this study was to develop a wearable textile sensor for EMG monitoring. The sensor was designed to be soft, stretchable, and washable, making it suitable for long-term use. The sensor was fabricated using a conductive thread material that was embroidered onto a fabric substrate. The sensor was then connected to a microcontroller unit (MCU) and a Wi-Fi-enabled module. The MCU was programmed to acquire the EMG signal and transmit it wirelessly to the Wi-Fi-enabled module. The Wi-Fi-enabled module then sent the signal to a server, where it could be accessed by a computer or smartphone. The sensor was able to successfully acquire and transmit EMG signals from a variety of muscles. The signal quality was comparable to that of commercial EMG sensors. The development of this sensor has the potential to improve the way EMG is used in a variety of settings. The sensor is soft, stretchable, and washable, making it suitable for long-term use. This makes it ideal for use in clinical settings, where patients may need to wear the sensor for extended periods of time. The sensor is also small and lightweight, making it ideal for use in sports medicine and research settings. The data for this study was collected from a group of healthy volunteers. The volunteers were asked to perform a series of muscle contractions while the EMG signal was recorded. The data was then analyzed to assess the performance of the sensor. The EMG signals were analyzed using a variety of methods, including time-domain analysis and frequency-domain analysis. The time-domain analysis was used to extract features such as the root mean square (RMS) and average rectified value (ARV). The frequency-domain analysis was used to extract features such as the power spectrum. 
The question addressed by this study was whether a wearable textile sensor could be developed that is soft, stretchable, and washable and that can successfully acquire and transmit EMG signals. The results of this study demonstrate that a wearable textile sensor can be developed that meets the requirements of being soft, stretchable, washable, and capable of acquiring and transmitting EMG signals. This sensor has the potential to improve the way EMG is used in a variety of settings.
Keywords: EMG, electrode position, smart wearable, textile sensor, IoT, IoT-integrated textile sensor
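The two time-domain features named in the analysis above, root mean square (RMS) and average rectified value (ARV), have standard definitions that can be sketched directly; the sample window below is a toy signal, not study data.

```python
import math

def emg_features(samples):
    """Time-domain EMG features used in the study: RMS and ARV."""
    n = len(samples)
    rms = math.sqrt(sum(s * s for s in samples) / n)  # root mean square
    arv = sum(abs(s) for s in samples) / n            # average rectified value
    return rms, arv

# toy EMG window (hypothetical millivolt samples)
window = [0.1, -0.4, 0.3, -0.2]
rms, arv = emg_features(window)
```

In practice these features are computed over sliding windows of the streamed signal and sent alongside, or instead of, the raw samples to reduce the IoT bandwidth needed.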
Procedia PDF Downloads 75
19679 Forecasting Direct Normal Irradiation at Djibouti Using Artificial Neural Network
Authors: Ahmed Kayad Abdourazak, Abderafi Souad, Zejli Driss, Idriss Abdoulkader Ibrahim
Abstract:
In this paper, an artificial neural network (ANN) is used to predict the solar irradiation in Djibouti for the first time, which is useful for the integration of concentrating solar power (CSP) and for site selection for new or future solar plants as part of solar energy development. An ANN algorithm was developed to establish a forward/reverse correspondence between the latitude, longitude, altitude and monthly solar irradiation. For this purpose, German Aerospace Centre (DLR) data for eight Djibouti sites were used for training and testing in a standard three-layer network with the Levenberg-Marquardt back-propagation algorithm. Results have shown very good agreement for the solar irradiation prediction in Djibouti and prove that the proposed approach can be used as an efficient tool for the prediction of solar irradiation, providing helpful information for site selection, design and planning of solar plants.
Keywords: artificial neural network, solar irradiation, concentrated solar power, Levenberg-Marquardt
Procedia PDF Downloads 354
19678 An ANN-Based Predictive Model for Diagnosis and Forecasting of Hypertension
Authors: Obe Olumide Olayinka, Victor Balanica, Eugen Neagoe
Abstract:
The effects of hypertension are often lethal; thus, its early detection and prevention are very important for everybody. In this paper, a neural network (NN) model was developed and trained on a dataset of hypertension causative parameters in order to forecast the likelihood of occurrence of hypertension in patients. Our research goal was to analyze the potential of the presented NN to predict, for a period of time, the risk of hypertension or the risk of developing this disease for patients who are or are not currently hypertensive. The results of the analysis for a given patient can support doctors in taking proactive measures to avert the occurrence of hypertension, such as recommendations regarding the patient's behavior in order to lower his hypertension risk. Moreover, the paper envisages a set of three example scenarios: determining the age at which the patient becomes hypertensive, i.e., the threshold hypertensive age; analyzing what happens if the threshold hypertensive age is set to a certain age while the weight of the patient is varied; and setting the ideal weight for the patient and analyzing what happens to the threshold hypertensive age.
Keywords: neural network, hypertension, data set, training set, supervised learning
Procedia PDF Downloads 392
19677 A Deep Learning Based Integrated Model For Spatial Flood Prediction
Authors: Vinayaka Gude Divya Sampath
Abstract:
The research introduces an integrated prediction model to assess the susceptibility of roads in a future flooding event. The model consists of a deep learning algorithm for forecasting gauge height data and the Flood Inundation Mapper (FIM) for spatial flooding. An optimal architecture for a long short-term memory (LSTM) network was identified for the gauge located on the Tangipahoa River at Robert, LA. Dropout was applied to the model to evaluate the uncertainty associated with the predictions. The estimates are then used along with FIM to identify the spatial flooding. Further geoprocessing in ArcGIS provides the susceptibility values for different roads. The model was validated on the devastating flood of August 2016. The paper discusses the challenges of generalizing the methodology to other locations and to various types of flooding. The developed model can be used by transportation departments and other emergency response organizations for effective disaster management.
Keywords: deep learning, disaster management, flood prediction, urban flooding
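The use of dropout to quantify prediction uncertainty can be illustrated with a toy Monte Carlo dropout sketch: run many stochastic forward passes with weights randomly zeroed, and read the spread of the predictions as an uncertainty estimate. The linear predictor, feature values, and dropout rate below are our assumptions; the paper's actual model is an LSTM.

```python
import random
import statistics

def mc_dropout_predict(features, weights, drop_rate, n_passes, seed=0):
    """Monte Carlo dropout on a toy linear predictor.

    Each stochastic forward pass randomly drops weights with probability
    drop_rate (scaling survivors by 1/(1-drop_rate), i.e. inverted
    dropout), yielding a distribution of gauge-height predictions whose
    spread reflects the model uncertainty.
    """
    rng = random.Random(seed)
    keep = 1.0 - drop_rate
    preds = []
    for _ in range(n_passes):
        total = 0.0
        for x, w in zip(features, weights):
            if rng.random() < keep:
                total += x * (w / keep)   # survivor, rescaled
            # else: this weight is dropped for this pass
        preds.append(total)
    return statistics.mean(preds), statistics.stdev(preds)

mean, spread = mc_dropout_predict([1.0, 0.5, 2.0], [0.4, 0.1, 0.3],
                                  drop_rate=0.2, n_passes=200)
```

The mean recovers the deterministic prediction while the standard deviation serves as the uncertainty band attached to each forecast.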
Procedia PDF Downloads 146
19676 Design and Evaluation of a Prototype for Non-Invasive Screening of Diabetes – Skin Impedance Technique
Authors: Pavana Basavakumar, Devadas Bhat
Abstract:
Diabetes is a disease which often goes undiagnosed until its secondary effects are noticed. Early detection of the disease is necessary to avoid serious consequences which could lead to the death of the patient. Conventional invasive tests for screening of diabetes are mostly painful, time-consuming and expensive. There is also a risk of infection involved; therefore, it is essential to develop non-invasive methods to screen for and estimate the level of blood glucose. Extensive research is going on with this perspective, involving various techniques that explore optical, electrical, chemical and thermal properties of the human body that directly or indirectly depend on the blood glucose concentration. Thus, non-invasive blood glucose monitoring has grown into a vast field of research. In this project, an attempt was made to devise a prototype for screening of diabetes by measuring the electrical impedance of the skin and building a model to predict a patient's condition based on the measured impedance. The prototype developed passes a negligible constant current (0.5 mA) across a subject's index finger through tetrapolar silver electrodes and measures the output voltage across a wide range of frequencies (10 kHz to 4 MHz). The measured voltage is proportional to the impedance of the skin. The impedance was acquired in real time for further analysis. The study was conducted on over 75 subjects with permission from the institutional ethics committee; along with impedance, the subjects' blood glucose values were also noted, using the conventional method. Nonlinear regression analysis was performed on the features extracted from the impedance data to obtain a model that predicts blood glucose values for a given set of features. When the predicted data was depicted on Clarke's error grid, only 58% of the predicted values were clinically acceptable.
Since the objective of the project was to screen for diabetes, not to estimate blood glucose exactly, the data was classified into three classes, ‘NORMAL FASTING’, ‘NORMAL POSTPRANDIAL’ and ‘HIGH’, using a linear Support Vector Machine (SVM). The classification accuracy obtained was 91.4%. The developed prototype is economical, fast and pain-free, and can thus be used for mass screening of diabetes.
Keywords: Clarke’s error grid, electrical impedance of skin, linear SVM, nonlinear regression, non-invasive blood glucose monitoring, screening device for diabetes
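As a rough illustration of the three-class linear SVM step described in the abstract, the sketch below trains a linear SVM on synthetic data standing in for the impedance-derived features. The feature values, class separations and sample sizes are invented for the example; the paper's actual dataset and feature definitions are not available here.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for impedance-derived features: one row per subject,
# columns could be impedance magnitudes at selected frequencies.
n_per_class = 50
classes = ["NORMAL FASTING", "NORMAL POSTPRANDIAL", "HIGH"]
X = np.vstack([rng.normal(loc=mu, scale=1.0, size=(n_per_class, 4))
               for mu in (0.0, 3.0, 6.0)])
y = np.repeat(classes, n_per_class)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# Standardize the features, then fit a linear SVM as in the abstract.
scaler = StandardScaler().fit(X_train)
clf = LinearSVC(C=1.0).fit(scaler.transform(X_train), y_train)
accuracy = clf.score(scaler.transform(X_test), y_test)
```

On well-separated synthetic classes like these the held-out accuracy is high; the paper's reported 91.4% comes from its real impedance data, not from this toy setup.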
Procedia PDF Downloads 325
19675 The Impact of Shifting Trading Pattern from Long-Haul to Short-Sea to the Car Carriers’ Freight Revenues
Authors: Tianyu Wang, Nikita Karandikar
Abstract:
The uncertainty around the cost, safety, and feasibility of decarbonized shipping fuels has made it increasingly complex for shipping companies to set pricing strategies and forecast their freight revenues going forward. The increase in green fuel surcharges will ultimately influence automobile consumer prices. Auto shipping demand (ton-miles) has been gradually shifting from long-haul to short-sea trade over the past years, following the relocation of original equipment manufacturer (OEM) manufacturing to regions such as South America and Southeast Asia. The objective of this paper is twofold: 1) to investigate the car carriers’ freight revenue development over the years in which the trade pattern has gradually shifted towards short-sea exports; 2) to empirically identify the quantitative impact of such trade pattern shifting, mainly on the freight rate, but also on vessel size, fleet size, and Green House Gas (GHG) emissions in Roll on-Roll Off (Ro-Ro) shipping. In this paper, a model for analyzing and forecasting ton-miles and freight revenues on the trade routes AS-NA (Asia to North America), EU-NA (Europe to North America), and SA-NA (South America to North America) is established by deploying Automatic Identification System (AIS) data and the financial results of a selected car carrier company. More specifically, Wallenius Wilhelmsen Logistics (WALWIL), the Norwegian Ro-Ro carrier listed on the Oslo Stock Exchange, is selected as the case study company in this paper. AIS-based ton-mile datasets of WALWIL vessels sailing into the North America region from three different origins (Asia, Europe, and South America), together with WALWIL’s quarterly freight revenues as reported in trade segments, are investigated and compared for the past five years (2018-2022). Furthermore, ordinary least squares (OLS) regression is utilized to construct the ton-mile demand and freight revenue forecasts.
The determinants of trade pattern shifting, such as import tariffs following the China-US trade war and fuel prices following the 0.1% Emission Control Area (ECA) zone requirement after IMO2020, are set as key variable inputs to the model. The model is tested on another newly listed Norwegian car carrier, Hoegh Autoliner, to forecast its 2022 financial results and to validate the accuracy of the model against its actual results. GHG emissions on the three routes are compared and discussed based on a constant emission-per-mile assumption and voyage distances. Our findings provide important insights into 1) the trade-off between revenue reduction and energy saving under the new ton-mile pattern and 2) how the trade flow shifting would influence the future need for vessel and fleet size.
Keywords: AIS, automobile exports, maritime big data, trade flows
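An OLS regression of the kind the abstract describes can be sketched with plain least squares. The regressors below (ton-miles, a fuel-price series, a post-tariff indicator) are illustrative stand-ins for the paper's AIS-derived variables, and the data are generated from known coefficients so the fit can be checked.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative stand-ins for the regression inputs: quarterly ton-miles, a
# fuel-price series, and a post-tariff indicator. The paper's actual
# AIS-derived data and variable definitions are not reproduced here.
n = 20  # quarters, roughly 2018-2022
ton_miles = rng.uniform(50, 100, n)
fuel_price = rng.uniform(300, 700, n)
tariff = (np.arange(n) >= 6).astype(float)

# Freight revenue generated from known coefficients plus noise, so the
# OLS estimates can be compared against them.
beta_true = np.array([10.0, 2.0, 0.05, -30.0])  # intercept, ton-miles, fuel, tariff
X = np.column_stack([np.ones(n), ton_miles, fuel_price, tariff])
revenue = X @ beta_true + rng.normal(0, 1.0, n)

# Ordinary least squares: beta_hat = argmin_b ||X b - y||^2
beta_hat, *_ = np.linalg.lstsq(X, revenue, rcond=None)

# One-step-ahead forecast for a hypothetical next quarter.
x_next = np.array([1.0, 80.0, 500.0, 1.0])
forecast = x_next @ beta_hat
```

The same pattern extends to the route-level models in the paper: one design matrix per trade route, with the tariff and fuel-price determinants entering as columns.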
Procedia PDF Downloads 121
19674 The Role of Metaheuristic Approaches in Engineering Problems
Authors: Ferzat Anka
Abstract:
Many types of problems can be solved using traditional analytical methods. However, these methods take a long time and use resources inefficiently. In particular, different approaches may be required for solving the complex, global engineering problems that we frequently encounter in real life. The bigger and more complex a problem, the harder it is to solve. Such problems are called Nondeterministic Polynomial time (NP-hard) problems in the literature. The main reasons for recommending metaheuristic algorithms for such problems are their use of simple concepts, simple mathematical equations and structures, and non-derivative mechanisms, their avoidance of local optima, and their fast convergence. They are also flexible, as they can be applied to different problems without very specific modifications. Thanks to these features, they can easily be embedded even in many hardware devices. Accordingly, this approach can also be used in trend application areas such as IoT, big data, and parallel structures. Indeed, metaheuristic approaches are algorithms that return near-optimal results for large-scale optimization problems. This study is focused on a new metaheuristic method that has been merged with a chaotic approach. It is based on chaos theory and helps the underlying algorithm improve population diversity and convergence speed. The approach builds on the Chimp Optimization Algorithm (ChOA), a recently introduced nature-inspired metaheuristic. ChOA identifies four types of chimpanzee group members: attacker, barrier, chaser, and driver, and proposes a suitable mathematical model for them based on the varied intelligence and sexual motivations of chimpanzees. However, the algorithm is less successful in convergence rate and in escaping local optimum traps when solving high-dimensional problems.
Although ChOA and some of its variants use strategies to overcome these problems, these strategies have been observed to be insufficient. Therefore, in this study, a newly expanded variant is described. In this algorithm, called Ex-ChOA, hybrid models are proposed for the position updates of search agents, and a dynamic switching mechanism is provided for the transitions between phases. This flexible structure addresses the slow convergence of ChOA and improves its accuracy on multidimensional problems, aiming for success on global, complex, and constrained problems. The main contributions of this study are: 1) it improves the accuracy and solves the slow convergence problem of ChOA; 2) it proposes new hybrid movement strategy models for the position updates of search agents; 3) it achieves success in solving global, complex, and constrained problems; 4) it provides a dynamic switching mechanism between phases. The performance of the Ex-ChOA algorithm is analyzed on a total of 8 benchmark functions, as well as 2 classical and constrained engineering problems. The proposed algorithm is compared with ChOA and several well-known variants (Weighted-ChOA, Enhanced-ChOA). In addition, the Improved Grey Wolf Optimizer (I-GWO) is chosen for comparison, since its working model is similar. The obtained results show that the proposed algorithm performs better than, or equivalently to, the compared algorithms.
Keywords: optimization, metaheuristic, chimp optimization algorithm, engineering constrained problems
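The abstract does not give the Ex-ChOA update equations, so the sketch below only illustrates the generic chaos-based diversity idea it mentions: using a logistic chaotic map, instead of uniform random noise, to spread the initial population of search agents. The function names, map parameter and population sizes are all assumptions made for the example.

```python
import numpy as np

def logistic_map_sequence(x0, n, r=4.0):
    """Generate n values of the logistic map x <- r*x*(1-x), chaotic at r = 4."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        xs[i] = x
    return xs

def chaotic_population(n_agents, dim, lower, upper, x0=0.7):
    """Initialize search agents from a chaotic sequence in [0, 1], scaled to
    the search bounds. This is the generic chaos-for-diversity idea from the
    abstract, not the paper's actual Ex-ChOA position-update model."""
    seq = logistic_map_sequence(x0, n_agents * dim).reshape(n_agents, dim)
    return lower + seq * (upper - lower)

pop = chaotic_population(n_agents=30, dim=8, lower=-10.0, upper=10.0)
```

The chaotic sequence is deterministic but non-repeating, which tends to cover the search space more evenly than independent uniform draws; the same trick is often applied to perturb agent positions during the search, not only at initialization.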
Procedia PDF Downloads 77
19673 Drawing, Design and Building Information Modelling (BIM): Embedding Advanced Digital Tools in the Academy Programs for Building Engineers and Architects
Authors: Vittorio Caffi, Maria Pignataro, Antonio Cosimo Devito, Marco Pesenti
Abstract:
This paper deals with the integration of advanced digital design and modelling tools and methodologies, known as Building Information Modelling (BIM), into traditional academic educational programs for building engineers and architects. Nowadays, the challenge the academy has to face is to present the new tools and their features to students, making sure they acquire the proper skills to leverage the potential these tools offer, also in the other courses embedded in the educational curriculum. The syllabus presented here refers to the “Drawing for building engineering”, “2D and 3D laboratory” and “3D modelling” curricula of the MSc in Building Engineering of the Politecnico di Milano. Such topics, included in the MSc program since the first year, are fundamental for giving students the instruments to master the complexity of an architectural or building engineering project with digital tools, so as to represent it in its various forms.
Keywords: BIM, BIM curricula, computational design, digital modelling
Procedia PDF Downloads 669
19672 Analysis and Identification of Different Factors Affecting Students’ Performance Using a Correlation-Based Network Approach
Authors: Jeff Chak-Fu Wong, Tony Chun Yin Yip
Abstract:
The transition from secondary school to university seems exciting for many first-year students but can be more challenging than expected. Enabling instructors to know students’ learning habits and styles enhances their understanding of the students’ learning backgrounds, allows teachers to provide better support for their students, and therefore has high potential to improve teaching quality and learning, especially in mathematics-related courses. The aim of this research is to collect students’ data using online surveys, to analyze student factors using learning analytics and educational data mining, and to discover the characteristics of the students at risk of falling behind in their studies based on their previous academic backgrounds and the collected data. In this paper, we use correlation-based distance methods and mutual information for measuring relationships between student factors. We then build a factor network using the Minimum Spanning Tree method and consider, as further study, analyzing the topological properties of these networks using social network analysis tools. Under the framework of mutual information, two graph-based feature filtering methods, i.e., unsupervised and supervised infinite feature selection algorithms, are used to rank and select appropriate subsets of features and yield effective results in identifying the factors affecting students at risk of failing. This discovered knowledge may help students as well as instructors enhance educational quality by identifying possible under-performers at the beginning of the first semester and paying them special attention in order to support their learning process and improve their learning outcomes.
Keywords: students' academic performance, correlation-based distance method, social network analysis, feature selection, graph-based feature filtering method
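The correlation-distance plus Minimum Spanning Tree step can be sketched as follows. The survey data here are synthetic stand-ins (the paper's actual factors and responses are not reproduced), and the distance transform d = sqrt(2(1 - rho)) is the standard metric version of the correlation coefficient.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(2)

# Synthetic stand-in for survey responses: rows are students, columns are
# factors (e.g. study hours, attendance). Two factors are made correlated
# so the MST has an obviously strong edge to pick up.
n_students, n_factors = 200, 6
data = rng.normal(size=(n_students, n_factors))
data[:, 1] += 0.8 * data[:, 0]

# Correlation-based distance: d_ij = sqrt(2 * (1 - rho_ij)).
corr = np.corrcoef(data, rowvar=False)
dist = np.sqrt(2.0 * (1.0 - corr))

# Minimum Spanning Tree over the factor distance matrix: keeps the
# n_factors - 1 strongest relationships as a backbone factor network.
mst = minimum_spanning_tree(dist).toarray()
n_edges = np.count_nonzero(mst)
```

The resulting sparse backbone is what the paper then inspects with social network analysis tools (degree, centrality, and so on).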
Procedia PDF Downloads 129
19671 Juxtaposition of the Past and the Present: A Pragmatic Stylistic Analysis of the Short Story “Too Much Happiness” by Alice Munro
Authors: Inas Hussein
Abstract:
Alice Munro is a Canadian short-story writer who has been regarded as one of the greatest writers of fiction. Owing to her great contribution to fiction, she was the first Canadian woman, and the only short-story writer, ever to be awarded the Nobel Prize in Literature, in 2013. Her literary works include collections of short stories and one book published as a novel. Her stories concentrate on the human condition and human relationships as seen through the lens of daily life. The setting in most of her stories is her native Canada: small towns much like the one where she grew up. Her writing style is not only realistic but is also characterized by autobiographical, historical and regional features. The aim of this research is to analyze one of the key stylistic devices often adopted by Munro in her fiction, the juxtaposition of the past and the present, with reference to the title story of Munro's short story collection Too Much Happiness. The story under exploration is a brief biography of the Russian mathematician and novelist Sophia Kovalevsky (1850 – 1891), the first woman to be appointed professor of Mathematics at a European university, in Stockholm. Thus, the story has a historical protagonist and is set on the European continent. Munro dramatizes the severe historical and cultural constraints that hindered the career of the protagonist. A pragmatic stylistic framework is adopted, and the qualitative analysis is supported by textual reference. The stylistic analysis reveals that the juxtaposition of the past and the present is one of the distinctive features that characterize the author; in a typical Munrovian manner, the protagonist often moves between units of time: the past, the present and, sometimes, the future. Munro's style is simple and direct but cleverly constructed and densely complicated by the presence of deeper layers and stories within the story.
The findings of the research reveal that the story under investigation merits reading and analysis. It is recommended that this story and other stories by Munro be analyzed to further explore the features of her art and style.
Keywords: Alice Munro, Too Much Happiness, style, stylistic analysis
Procedia PDF Downloads 145
19670 Local Spectrum Feature Extraction for Face Recognition
Authors: Muhammad Imran Ahmad, Ruzelita Ngadiran, Mohd Nazrin Md Isa, Nor Ashidi Mat Isa, Mohd ZaizuIlyas, Raja Abdullah Raja Ahmad, Said Amirul Anwar Ab Hamid, Muzammil Jusoh
Abstract:
This paper presents two techniques, local feature extraction using the image spectrum and low-frequency spectrum modelling using a GMM, to capture the underlying statistical information and improve the performance of a face recognition system. Local spectrum features are extracted using overlapping sub-block windows mapped onto the face image. For each block, the spatial domain is transformed to the frequency domain using the DFT. Low-frequency coefficients are preserved, and high-frequency coefficients discarded, by applying a rectangular mask to the spectrum of the facial image. The low-frequency information is non-Gaussian in the feature space, and by using a combination of several Gaussian functions with different statistical properties, the best feature representation can be modelled as a probability density function. The recognition process is performed using the maximum likelihood value computed from pre-calculated GMM components. The method is tested using the FERET data sets and achieves 92% recognition rates.
Keywords: local features modelling, face recognition system, Gaussian mixture models, FERET
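The block-wise DFT and rectangular low-pass mask can be sketched as below. The block size, overlap step and mask size are illustrative choices, not the paper's actual parameters, and the input is a synthetic image standing in for a FERET face.

```python
import numpy as np

def local_spectrum_features(img, block=16, step=8, keep=4):
    """Extract low-frequency DFT magnitudes from overlapping sub-blocks.

    Slides an overlapping window over the image, takes the 2-D DFT of each
    block, and keeps only a small rectangular patch of low-frequency
    magnitudes around the spectrum centre (the rectangular mask).
    """
    feats = []
    h, w = img.shape
    for r in range(0, h - block + 1, step):
        for c in range(0, w - block + 1, step):
            spec = np.fft.fft2(img[r:r + block, c:c + block])
            spec = np.fft.fftshift(spec)          # move low frequencies to centre
            mid, half = block // 2, keep // 2
            low = spec[mid - half:mid + half, mid - half:mid + half]
            feats.append(np.abs(low).ravel())     # one feature vector per block
    return np.array(feats)

# Synthetic 64x64 stand-in for a face image.
rng = np.random.default_rng(3)
img = rng.random((64, 64))
features = local_spectrum_features(img)
```

Each row of `features` would then be pooled across training images and fed to a Gaussian mixture model (e.g. scikit-learn's `GaussianMixture`), with recognition by maximum likelihood over the per-identity GMMs, as the abstract describes.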
Procedia PDF Downloads 667
19669 The Review for Repair of Masonry Structures Using the Crack Stitching Technique
Authors: Sandile Daniel Ngidi
Abstract:
Masonry structures often crack due to different factors, which include differential movement of structures, thermal expansion, and seismic waves. Retrofitting is introduced to ensure that these cracks do not expand to the point of making the wall fail. Crack stitching is one of many repair methods used to fix cracked masonry walls. It is done by stitching helical stainless steel reinforcement bars into the wall to reconnect and stabilize it. The basic element of this reinforcing system is the mechanical interlock between the helical stainless-steel bar and the grout, which makes it such a flexible and well-known masonry repair system. The objective of this review was to use previous experimental work by different authors to check the efficiency and effectiveness of the crack stitching technique for repairing and stabilizing masonry walls. The technique was found to be effective in restoring the strength of a masonry structure to beyond its initial strength. Different factors were investigated, including economic features, sustainability, buildability, and the suitability of this technique for application in developing communities.
Keywords: brickforce, crack-stitching, masonry concrete, reinforcement, wall panels
Procedia PDF Downloads 177
19668 On the Solution of Fractional-Order Dynamical Systems Endowed with Block Hybrid Methods
Authors: Kizito Ugochukwu Nwajeri
Abstract:
This paper presents a distinct approach to solving fractional dynamical systems using hybrid block methods (HBMs). Fractional calculus extends the concept of derivatives and integrals to non-integer orders and finds increasing application in fields such as physics, engineering, and finance. However, traditional numerical techniques often struggle to accurately capture the complex behaviors exhibited by these systems. To address this challenge, we develop HBMs that integrate single-step and multi-step methods, enabling the simultaneous computation of multiple solution points while maintaining high accuracy. Our approach employs polynomial interpolation and collocation techniques to derive a system of equations that effectively models the dynamics of fractional systems. We also incorporate boundary and initial conditions directly into the formulation, enhancing the stability and convergence properties of the numerical solution. An adaptive step-size mechanism is introduced to optimize performance based on the local behavior of the solution. Extensive numerical simulations are conducted to evaluate the proposed methods, demonstrating significant improvements in accuracy and efficiency compared to traditional numerical approaches. The results indicate that our hybrid block methods are robust and versatile, making them suitable for a wide range of applications involving fractional dynamical systems. This work contributes to the existing literature by providing an effective numerical framework for analyzing complex behaviors in fractional systems, thereby opening new avenues for research and practical implementation across various disciplines.
Keywords: fractional calculus, numerical simulation, stability and convergence, adaptive step-size mechanism, collocation methods
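The abstract does not give the hybrid block formulas themselves, so as a minimal illustration of the fractional-order numerics involved, the sketch below implements the classical Grünwald-Letnikov approximation of a fractional derivative; this is a basic baseline scheme, not the paper's HBM.

```python
import numpy as np

def gl_fractional_derivative(f_vals, alpha, h):
    """Grunwald-Letnikov approximation of the order-alpha derivative.

    D^alpha f(t_n) ~ h^(-alpha) * sum_{k=0..n} c_k f(t_{n-k}), where the
    signed binomial weights follow the recurrence
        c_0 = 1,  c_k = c_{k-1} * (1 - (alpha + 1) / k).
    For alpha = 1 this collapses to the backward difference.
    """
    n = len(f_vals)
    c = np.empty(n)
    c[0] = 1.0
    for k in range(1, n):
        c[k] = c[k - 1] * (1.0 - (alpha + 1.0) / k)
    out = np.empty(n)
    for i in range(n):
        out[i] = (c[:i + 1] @ f_vals[i::-1]) / h ** alpha
    return out

h = 0.001
t = np.arange(0.0, 1.0 + h, h)
d = gl_fractional_derivative(t**2, alpha=1.0, h=h)  # should approximate 2t
```

The growing memory term (every past value enters each step) is exactly what makes fractional systems expensive and motivates block methods that compute several solution points at once.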
Procedia PDF Downloads 43
19667 Examining the Role of Willingness to Communicate in Cross-Cultural Adaptation in East-Asia
Authors: Baohua Yu
Abstract:
Despite widely reported 'Mainland-Hong Kong conflicts', recent years have witnessed progressive growth in the number of Mainland Chinese students in Hong Kong’s universities. This research investigated Mainland Chinese students’ intercultural communication in relation to cross-cultural adaptation at a major university in Hong Kong. The features of intercultural communication examined in this study were competence in second language (L2) communication and L2 Willingness to Communicate (WTC), while the features of cross-cultural adaptation examined were socio-cultural, psychological and academic adaptation. Based on a questionnaire, structural equation modelling was conducted on a sample of 196 Mainland Chinese students. Results showed that competence in L2 communication played a significant role in L2 WTC, which had an influential effect on academic adaptation, itself identified as a mediator between psychological adaptation and socio-cultural adaptation. Implications for the curriculum design of courses and for instructional practice with international students are discussed.
Keywords: L2 willingness to communicate, competence in L2 communication, psychological adaptation, socio-cultural adaptation, academic adaptation, structural equation modelling
Procedia PDF Downloads 355
19666 Multiresolution Mesh Blending for Surface Detail Reconstruction
Authors: Honorio Salmeron Valdivieso, Andy Keane, David Toal
Abstract:
In the area of mechanical reverse engineering, processes often encounter difficulties capturing small, highly localized surface information. This could be the case if a physical turbine were 3D scanned for lifecycle management or robust design purposes, with interest in eroded areas or scratched coating. The limitation is partly due to insufficient automated frameworks for handling localized surface information during the reverse engineering pipeline. We have developed a tool for blending surface patches with arbitrary irregularities into a base body (e.g. a CAD solid). The approach aims to transfer small surface features while preserving their shape and relative placement by using a multi-resolution scheme and rigid deformations. Automating this process enables the inclusion of outsourced surface information in CAD models, including samples prepared in mesh handling software, or raw scan information discarded in the early stages of reverse engineering reconstruction.
Keywords: application lifecycle management, multiresolution deformation, reverse engineering, robust design, surface blending
Procedia PDF Downloads 139
19665 On Block Vandermonde Matrix Constructed from Matrix Polynomial Solvents
Authors: Malika Yaici, Kamel Hariche
Abstract:
In control engineering, systems described by matrix fractions are studied through the properties of block roots, also called solvents. These solvents are usually dealt with in block Vandermonde matrix form. Inverses and determinants of Vandermonde matrices and block Vandermonde matrices are used in solving numerical analysis problems in many domains but require costly computations. Even though Vandermonde matrices are well known, and methods to compute their inverses and determinants are numerous and generally based on interpolation techniques, methods to compute the inverse and determinant of a block Vandermonde matrix have not been well studied. In this paper, some properties of these matrices and iterative algorithms to compute the determinant and the inverse of a block Vandermonde matrix are given. These methods are derived from partitioned-matrix inversion and determinant computation methods. Due to their great size, parallelization may reduce the computation cost, so a parallelization of these algorithms is proposed and validated by a comparison based on algorithmic complexity.
Keywords: block Vandermonde matrix, solvents, matrix polynomial, matrix inverse, matrix determinant, parallelization
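Constructing the block Vandermonde matrix itself is straightforward: block row i holds the i-th powers of each solvent. The sketch below builds it for two illustrative 2x2 solvents (stand-ins, since the abstract gives no numerical examples); the inversion and determinant algorithms the paper studies are not reproduced here.

```python
import numpy as np

def block_vandermonde(solvents):
    """Build the block Vandermonde matrix V whose (i, j) block is R_j^i.

    For l solvents of size m x m the result is an (l*m) x (l*m) matrix; for
    1x1 "solvents" (scalars) this reduces to the ordinary Vandermonde matrix.
    """
    l = len(solvents)
    rows = []
    for i in range(l):
        rows.append([np.linalg.matrix_power(R, i) for R in solvents])
    return np.block(rows)

# Two illustrative 2x2 solvents with disjoint eigenvalues ({1, 2} and
# {3, 4}), which keeps the block Vandermonde matrix nonsingular.
R1 = np.array([[1.0, 1.0], [0.0, 2.0]])
R2 = np.array([[3.0, 0.0], [1.0, 4.0]])
V = block_vandermonde([R1, R2])
```

It is the inverse and determinant of `V`, not its construction, that are expensive; the paper's iterative partitioned-matrix algorithms target exactly that cost.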
Procedia PDF Downloads 240
19664 Impact of Civil Engineering and Economic Growth in the Sustainability of the Environment: Case of Albania
Authors: Rigers Dodaj
Abstract:
Nowadays, the environment is a critical concern for civil engineers, human activity, construction projects, economic growth, and national development as a whole. As Albania's economy develops, people's living standards are rising, and the requirements for the living environment are also increasing. Under these circumstances, environmental protection and sustainability are the critical issues. Rising industrialization, urbanization, and energy demand affect the environment through the emission of carbon dioxide gas (CO2), a significant parameter known to directly impact air pollution. Consequently, many governments and international organizations have adopted policies and regulations to address environmental degradation in the pursuit of economic development; for instance, in Albania the CO2 emission, calculated in metric tons per capita, has increased by 23% in the last 20 years. This paper analyzes the importance of civil engineering and economic growth for the sustainability of the environment, focusing on CO2 emission. The analyzed data are a time series for 2001 - 2020 (with annual frequency), based on official publications of the World Bank. A statistical approach with a vector error correction model and a time-series forecasting model is used to estimate the parameters and the long-run equilibrium. The research in this paper adds a new perspective to the evaluation of a sustainable environment in the context of carbon emission reduction. It also provides reference and technical support for the government toward green and sustainable environmental policies. In the context of low-carbon development, effectively improving carbon emission efficiency is an inevitable requirement for achieving sustainable economic growth and environmental protection.
The study also reveals that civil engineering development projects greatly impact the environment in the long run, especially in the areas of flooding, noise pollution, water pollution, erosion, ecological disorder, and natural hazards. The potential for reducing industrial carbon emissions in recent years indicates that reduction is becoming more difficult; it requires a different economic growth policy and further civil engineering development, improving the level of industrialization and promoting technological innovation in industrial low-carbonization.
Keywords: CO₂ emission, civil engineering, economic growth, environmental sustainability
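The error-correction idea behind the paper's VECM can be illustrated with the simpler two-step Engle-Granger procedure in plain least squares: estimate the long-run relation, then regress the differenced series on the lagged equilibrium error. The two series below are synthetic stand-ins for Albania's GDP and CO2 per capita (the actual World Bank data are not reproduced), and this is a one-equation sketch, not the full multivariate VECM the paper estimates.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic stand-ins for two cointegrated annual series (e.g. log GDP
# and log CO2 per capita).
n = 200
gdp = np.cumsum(rng.normal(size=n))            # integrated (random-walk) driver
co2 = 2.0 * gdp + rng.normal(scale=0.5, size=n)

# Step 1: estimate the long-run (cointegrating) relation
# co2_t = a + b * gdp_t + u_t by OLS and keep the residuals.
X = np.column_stack([np.ones(n), gdp])
coef, *_ = np.linalg.lstsq(X, co2, rcond=None)
resid = co2 - X @ coef

# Step 2: error-correction regression on differences. A negative
# coefficient on the lagged residual means deviations from the long-run
# equilibrium are corrected over time.
d_co2 = np.diff(co2)
d_gdp = np.diff(gdp)
Z = np.column_stack([np.ones(n - 1), resid[:-1], d_gdp])
ec_coef, *_ = np.linalg.lstsq(Z, d_co2, rcond=None)
adjustment_speed = ec_coef[1]   # expected to be negative
```

In practice a full VECM (e.g. statsmodels' `VECM` class) estimates the cointegrating rank and all adjustment coefficients jointly, which is closer to what the paper does with the 2001-2020 series.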
Procedia PDF Downloads 85
19663 Subjective Evaluation of Mathematical Morphology Edge Detection on Computed Tomography (CT) Images
Authors: Emhimed Saffor
Abstract:
In this paper, the problem of edge detection in digital images is considered. Three methods of edge detection based on a mathematical morphology algorithm were applied to two sets of CT images (brain and chest): a 3x3 filter for the first method, a 5x5 filter for the second method, and a 7x7 filter for the third method, under the MATLAB programming environment. The results of the above-mentioned methods were subjectively evaluated. The results show these methods are efficient and suitable for medical images, and they can be used for various other applications.
Keywords: CT images, MATLAB, medical images, edge detection
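A common morphology-based edge detector of the kind the abstract compares is the morphological gradient (dilation minus erosion). The sketch below applies it with the three filter sizes from the paper to a synthetic stand-in for a CT slice; the abstract does not specify which morphological operator the authors used, so the gradient here is an assumption.

```python
import numpy as np
from scipy.ndimage import grey_dilation, grey_erosion

def morphological_edges(img, size):
    """Morphological gradient: dilation minus erosion with a flat
    size x size structuring element. Larger elements (3x3, 5x5, 7x7)
    give thicker edge responses."""
    return (grey_dilation(img, size=(size, size))
            - grey_erosion(img, size=(size, size)))

# Synthetic stand-in for a CT slice: a bright square on a dark background,
# so the true edge locations are known.
img = np.zeros((64, 64))
img[16:48, 16:48] = 1.0

edges_3 = morphological_edges(img, 3)
edges_5 = morphological_edges(img, 5)
edges_7 = morphological_edges(img, 7)
```

On this test image, the response is zero in flat regions and nonzero only along the square's boundary, with the band widening as the structuring element grows, which is the trade-off the paper evaluates subjectively on real brain and chest CT images.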
Procedia PDF Downloads 338