Search results for: Scale-Invariant Feature Transform (SIFT)
2680 High Sensitivity Crack Detection and Locating with Optimized Spatial Wavelet Analysis
Authors: A. Ghanbari Mardasi, N. Wu, C. Wu
Abstract:
In this study, a spatial wavelet-based crack localization technique for a thick beam is presented. The wavelet scale in the spatial wavelet transformation is optimized to enhance crack detection sensitivity. A windowing function is also employed to erase the edge effect of the wavelet transformation, which enables the method to detect and localize cracks near the beam/measurement boundaries. A theoretical model and vibration analysis considering the crack effect are first proposed and performed in MATLAB based on the Timoshenko beam model. The Gabor wavelet family is applied to the beam vibration mode shapes derived from the theoretical beam model to magnify the crack effect so as to locate the crack. A relative wavelet coefficient is obtained for sensitivity analysis by comparing the coefficient values at different positions of the beam with the lowest value in the intact area of the beam. Afterward, the optimal wavelet scale corresponding to the highest relative wavelet coefficient at the crack position is obtained for each vibration mode through numerical simulations. The same procedure is performed for cracks with different sizes and positions in order to find the optimal scale range for the Gabor wavelet family. Finally, a Hanning window is applied to the different vibration mode shapes in order to overcome the edge-effect problem of the wavelet transformation and its effect on the localization of cracks close to the measurement boundaries. Comparison of the wavelet coefficient distributions of the windowed and initial mode shapes demonstrates that the window function eases the identification of cracks close to the boundaries. Keywords: edge effect, scale optimization, small crack locating, spatial wavelet
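The scale-optimization step described above can be prototyped in a few lines. The sketch below is illustrative only: it assumes PyWavelets' complex Morlet wavelet ('cmor1.5-1.0') as a stand-in for the Gabor family, a synthetic mode shape, and an arbitrarily placed crack; the paper's Timoshenko-model mode shapes and actual crack positions are not reproduced here.

```python
import numpy as np
import pywt

# Illustrative mode shape: a smooth beam deflection with a small local
# slope discontinuity (the "crack") superimposed at x = 0.6 L.
x = np.linspace(0.0, 1.0, 500)
mode_shape = np.sin(np.pi * x) + 1e-3 * np.where(x > 0.6, x - 0.6, 0.0)

# Hanning window to suppress the edge effect of the wavelet transform.
windowed = mode_shape * np.hanning(len(mode_shape))

# Continuous wavelet transform over a range of candidate scales
# ('cmor1.5-1.0' is a complex Morlet, used here as a Gabor-type wavelet).
scales = np.arange(1, 64)
coeffs, _ = pywt.cwt(windowed, scales, 'cmor1.5-1.0')
magnitude = np.abs(coeffs)                      # shape: (n_scales, n_positions)

# Relative wavelet coefficient: ratio against the lowest value in an
# assumed intact region of the beam (here 0.1 L to 0.4 L).
intact = (x > 0.1) & (x < 0.4)
relative = magnitude / magnitude[:, intact].min(axis=1, keepdims=True)

# Optimal scale = the one giving the highest relative coefficient at the crack.
crack_idx = np.argmin(np.abs(x - 0.6))
best_scale = scales[np.argmax(relative[:, crack_idx])]
print("optimal scale:", best_scale)
```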
Procedia PDF Downloads 357
2679 Russian Spatial Impersonal Sentence Models in Translation Perspective
Authors: Marina Fomina
Abstract:
The paper focuses on the category of semantic subject within the framework of a functional approach to linguistics. The semantic subject is related to similar notions such as the grammatical subject and the bearer of predicative feature. It is the multifaceted nature of the category of subject that 1) triggers a number of issues that, syntax-wise, remain to be dealt with (cf. semantic vs. syntactic functions / sentence parts vs. parts of speech issues, etc.); 2) results in a variety of approaches to the category of subject, such as formal grammatical, semantic/syntactic (functional), communicative approaches, etc. Many linguists consider the prototypical approach to the category of subject to be the most instrumental as it reveals the integrity of denotative and linguistic components of the conceptual category. This approach relates to subject as a source of non-passive predicative feature, an element of subject-predicate-object situation that can take on a variety of semantic roles, cf.: 1) an agent (He carefully surveyed the valley stretching before him), 2) an experiencer (I feel very bitter about this), 3) a recipient (I received this book as a gift), 4) a causee (The plane broke into three pieces), 5) a patient (This stove cleans easily), etc. It is believed that the variety of roles stems from the radial (prototypical) structure of the category with some members more central than others. Translation-wise, the most “treacherous” subject types are the peripheral ones. The paper 1) features a peripheral status of spatial impersonal sentence models such as U menia v ukhe zvenit (lit. I-Gen. in ear buzzes) within the category of semantic subject, 2) makes a structural and semantic analysis of the models, 3) focuses on their Russian-English translation patterns, 4) reveals non-prototypical features of subjects in the English equivalents.Keywords: bearer of predicative feature, grammatical subject, impersonal sentence model, semantic subject
Procedia PDF Downloads 370
2678 Using Machine Learning Techniques for Autism Spectrum Disorder Analysis and Detection in Children
Authors: Norah Mohammed Alshahrani, Abdulaziz Almaleh
Abstract:
Autism Spectrum Disorder (ASD) is a condition related to brain development that affects how a person perceives and communicates with others, resulting in difficulties with social interaction and communication, and its prevalence is constantly growing. Early recognition of ASD allows children to lead safe and healthy lives and helps doctors make accurate diagnoses and manage the condition. Therefore, it is crucial to develop a method that measures ASD in children with good results and high accuracy. In this paper, ASD datasets of toddlers and children are analyzed. We employed the following machine learning techniques to explore ASD: Random Forest (RF), Decision Tree (DT), Naïve Bayes (NB) and Support Vector Machine (SVM). Feature selection was then used to reduce the number of attributes from the ASD datasets while preserving model performance. As a result, we found that the best performance was provided by the Support Vector Machine (SVM), achieving an accuracy of 0.98 on the toddler dataset and 0.99 on the children dataset. Keywords: autism spectrum disorder, machine learning, feature selection, support vector machine
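As a rough illustration of the comparison described above, the following scikit-learn sketch evaluates RF, DT, NB and SVM with univariate feature selection; the file name, label column, and number of retained features are placeholders rather than the authors' actual setup.

```python
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC

# Placeholder: an ASD screening dataset with a "Class" label column
# (the actual toddler/children datasets are not bundled with the abstract).
df = pd.read_csv("asd_toddlers.csv")
X, y = df.drop(columns=["Class"]), df["Class"]

models = {
    "RF": RandomForestClassifier(n_estimators=200, random_state=0),
    "DT": DecisionTreeClassifier(random_state=0),
    "NB": GaussianNB(),
    "SVM": SVC(kernel="rbf", C=1.0),
}

for name, clf in models.items():
    # SelectKBest keeps a reduced attribute subset while preserving performance.
    pipe = make_pipeline(StandardScaler(), SelectKBest(f_classif, k=10), clf)
    scores = cross_val_score(pipe, X, y, cv=5, scoring="accuracy")
    print(f"{name}: mean accuracy = {scores.mean():.3f}")
```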
Procedia PDF Downloads 151
2677 Alternator Fault Detection Using Wigner-Ville Distribution
Authors: Amin Ranjbar, Amir Arsalan Jalili Zolfaghari, Amir Abolfazl Suratgar, Mehrdad Khajavi
Abstract:
This paper describes a two-stage, learning-based fault detection procedure for alternators. The procedure distinguishes three machine conditions, namely a shortened brush, a high-impedance relay, and a healthy alternator. The fault detection algorithm uses the Wigner-Ville distribution as a feature extractor together with an appropriate feature classifier. In this work, an ANN (Artificial Neural Network) and an SVM (Support Vector Machine) were compared to determine which performs better, evaluated by the mean squared error criterion. The modules work together to detect possible faulty conditions of operating machines. To test the performance of the method, a signal database was prepared by imposing the different conditions on a laboratory setup. The results indicate that satisfactory detection is achieved by implementing this method. Keywords: alternator, artificial neural network, support vector machine, time-frequency analysis, Wigner-Ville distribution
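A minimal sketch of a discrete Wigner-Ville distribution used as a feature extractor is given below; the alternator waveform is synthetic, and the band-energy features are only one plausible way to feed an ANN or SVM, not the authors' exact pipeline.

```python
import numpy as np
from scipy.signal import hilbert

def wigner_ville(signal):
    """Discrete (pseudo) Wigner-Ville distribution of a real signal."""
    x = hilbert(signal)                          # analytic signal
    n_samples = len(x)
    wvd = np.zeros((n_samples, n_samples))
    for n in range(n_samples):
        max_lag = min(n, n_samples - 1 - n)      # keep n+m and n-m in range
        acf = np.zeros(n_samples, dtype=complex)
        for m in range(-max_lag, max_lag + 1):
            acf[m % n_samples] = x[n + m] * np.conj(x[n - m])
        wvd[:, n] = np.fft.fft(acf).real         # frequency x time
    return wvd

# Placeholder waveform standing in for a measured alternator signal.
t = np.linspace(0, 1, 256, endpoint=False)
waveform = np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.randn(t.size)

tfr = wigner_ville(waveform)
# Simple features: energy in a few frequency bands, later fed to an ANN or SVM.
features = [band.sum() for band in np.array_split(np.abs(tfr), 8, axis=0)]
```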
Procedia PDF Downloads 374
2676 Sustainable Lessons Learnt from the Attitudes of Language Instructors towards Computer Assisted Language Teaching (CALT)
Authors: Theophilus Adedokun, Sylvia Zulu, Felix Awung, Sam Usadolo
Abstract:
The proliferation of technology in the teaching process has brought transformation to the field of education. Language teaching has not been left out of this tremendous transformation, which has drastically altered how language is taught. It is, however, concerning that some language instructors seem to hold negative attitudes toward the use of technology in language teaching, referred to in this study as Computer Assisted Language Teaching (CALT). The purpose of this study, therefore, is to explore sustainable lessons that can be learnt from the attitudes of language instructors towards language teaching in some public universities. The knowledge gained from this study could inform and advance the use of Computer Assisted Language Teaching. This study considers the historical progression of CALT and recommends that a fundamental approach is required for institutions to develop and advance the use of CALT for teaching. A review of sustainable lessons learnt from the attitudes of language instructors towards CALT is provided, and the CALT experience of three institutions is described. Drawing from this succinct description, the study makes recommendations on how effective CALT could be implemented on a personal and institutional basis. Keywords: attitudes, language instructors, sustainable lessons, computer assisted language teaching
Procedia PDF Downloads 86
2675 Reducing the Imbalance Penalty Through Artificial Intelligence Methods Geothermal Production Forecasting: A Case Study for Turkey
Authors: Hayriye Anıl, Görkem Kar
Abstract:
In addition to being rich in renewable energy resources, Turkey is one of the countries showing promise in geothermal energy production, with high installed capacity, low cost, and sustainability. Increasing imbalance penalties become an economic burden for organizations, since geothermal generation plants cannot maintain the balance of supply and demand when the production forecasts submitted to the day-ahead market are inadequate. A better production forecast reduces the imbalance penalties of market participants and provides a better balance in the day-ahead market. In this study, using machine learning, deep learning, and time series methods, the total generation of the power plants belonging to Zorlu Natural Electricity Generation, which has a high installed geothermal capacity, was estimated for the first one and two weeks of March; the imbalance penalties were then calculated from these estimates and compared with the real values. These modeling operations were carried out on two datasets: the basic dataset and a dataset created by extracting new features from it through feature engineering. According to the results, Support Vector Regression, among the traditional machine learning models, outperformed the other models and exhibited the best performance. In addition, the estimation results on the feature-engineered dataset showed lower error rates than those on the basic dataset. It is concluded that the imbalance penalty calculated from the forecasts for the selected organization is lower than the actual imbalance penalty, making the accounts closer to optimal and more profitable. Keywords: machine learning, deep learning, time series models, feature engineering, geothermal energy production forecasting
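A hedged sketch of the forecasting step with Support Vector Regression and simple lag/calendar feature engineering follows; the file name, column names, horizon split and hyper-parameters are assumptions, not the study's configuration.

```python
import pandas as pd
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_absolute_error

# Placeholder: hourly total generation of the geothermal plants (MWh).
series = pd.read_csv("geothermal_generation.csv", parse_dates=["time"],
                     index_col="time")["generation"]

# Feature engineering: lagged generation plus a calendar feature.
frame = pd.DataFrame({"y": series})
for lag in (1, 2, 24, 168):                      # previous hour(s), day, week
    frame[f"lag_{lag}"] = series.shift(lag)
frame["hour"] = frame.index.hour
frame = frame.dropna()

train = frame.loc[:"2019-02-28"]
test = frame.loc["2019-03-01":"2019-03-14"]      # the two forecast weeks
X_cols = [c for c in frame.columns if c != "y"]

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(train[X_cols], train["y"])
pred = model.predict(test[X_cols])
print("MAE over the forecast horizon:", mean_absolute_error(test["y"], pred))
```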
Procedia PDF Downloads 110
2674 A Dynamic Software Product Line Approach to Self-Adaptive Genetic Algorithms
Authors: Abdelghani Alidra, Mohamed Tahar Kimour
Abstract:
Genetic algorithms must adapt themselves at design time to cope with the specific requirements of the search problem and at runtime to balance exploration and convergence objectives. In a previous article, we showed that modeling and implementing Genetic Algorithms (GAs) using the software product line (SPL) paradigm is worthwhile because they constitute a product family sharing a common code base. In the present article, we propose to extend the use of the feature model of the genetic algorithm family to model the potential states of the GA in what is called a Dynamic Software Product Line. The objective of this paper is the systematic generation of a reconfigurable architecture that supports the dynamics of the GA and is easily deduced from the feature model. The resulting GA is able to perform dynamic reconfiguration autonomously to speed up the convergence process while producing better solutions. Another important advantage of our approach is the exploitation of recent advances in the domain of dynamic SPLs to enhance the performance of GAs. Keywords: self-adaptive genetic algorithms, software engineering, dynamic software product lines, reconfigurable architecture
Procedia PDF Downloads 285
2673 The Impacts of New Digital Technology Transformation on Singapore Healthcare Sector: Case Study of a Public Hospital in Singapore from a Management Accounting Perspective
Authors: Junqi Zou
Abstract:
As one of the world's most tech-ready countries, Singapore has initiated the Smart Nation plan to harness the full power and potential of digital technologies to transform the way people live and work, through more efficient government and business processes, and to make the economy more productive. The key evolutions of digital technology transformation in healthcare and the increasing deployment of the Internet of Things (IoT), Big Data, AI/cognitive technologies, Robotic Process Automation (RPA), Electronic Health Record Systems (EHR), Electronic Medical Record Systems (EMR), and Warehouse Management Systems (WMS) in the most recent decade have significantly stepped up the move towards an information-driven healthcare ecosystem. The advances in information technology not only bring benefits to patients but also act as a key force in changing management accounting in the healthcare sector. The aim of this study is to investigate the impacts of digital technology transformation on Singapore's healthcare sector from a management accounting perspective. Adopting a Balanced Scorecard (BSC) analysis approach, this paper conducted an exploratory case study of a newly launched Singapore public hospital, which has been recognized as among the most digitally advanced healthcare facilities in the Asia-Pacific region. Specifically, this study gains insights into how the new technology is changing healthcare organizations' management accounting from four perspectives under the Balanced Scorecard approach: 1) Financial Perspective, 2) Customer (Patient) Perspective, 3) Internal Processes Perspective, and 4) Learning and Growth Perspective. Based on a thorough review of archival records from the government and the public, and interview reports with the hospital's CIO, this study finds improvements across all four perspectives of the Balanced Scorecard framework as follows: 1) Learning and Growth Perspective: The Government (Ministry of Health) works with the hospital to open up multiple training pathways for health professionals that upgrade and develop new IT skills among the healthcare workforce to support the transformation of healthcare services. 2) Internal Process Perspective: The hospital achieved digital transformation through Project OneCare to integrate clinical, operational, and administrative information systems (e.g., EHR, EMR, WMS, EPIB, RTLS) that enable the seamless flow of data, and through the implementation of a JIT system to help the hospital operate more effectively and efficiently. 3) Customer Perspective: The fully integrated EMR suite enhances the patient experience by achieving the 5 Rights (Right Patient, Right Data, Right Device, Right Entry and Right Time). 4) Financial Perspective: Cost savings are achieved from improved inventory management and effective supply chain management. The use of process automation also results in a reduction of manpower and logistics costs. To summarize, these improvements identified under the Balanced Scorecard framework confirm the success of integrating advanced ICT to enhance the healthcare organization's customer service, productivity and efficiency, and cost savings. Moreover, the Big Data generated from this integrated EMR system can be particularly useful in aiding the management control system to optimize decision making and strategic planning.
To conclude, the new digital technology transformation has extended the usefulness of management accounting to both financial and non-financial dimensions, raising it to new heights in the area of healthcare management. Keywords: balanced scorecard, digital technology transformation, healthcare ecosystem, integrated information system
Procedia PDF Downloads 161
2672 Curriculum Transformation: Multidisciplinary Perspectives on ‘Decolonisation’ and ‘Africanisation’ of the Curriculum in South Africa’s Higher Education
Authors: Andre Bechuke
Abstract:
The years of 2015-2017 witnessed a huge campaign, and in some instances, violent protests in South Africa by students and some groups of academics advocating the decolonisation of the curriculum of universities. These protests have forced through high expectations for universities to teach a curriculum relevant to the country, and the continent as well as enabled South Africa to participate in the globalised world. To realise this purpose, most universities are currently undertaking steps to transform and decolonise their curriculum. However, the transformation process is challenged and delayed by lack of a collective understanding of the concepts ‘decolonisation’ and ‘africanisation’ that should guide its application. Even more challenging is lack of a contextual understanding of these concepts across different university disciplines. Against this background, and underpinned in a qualitative research paradigm, the perspectives of these concepts as applied by different university disciplines were examined in order to understand and establish their implementation in the curriculum transformation agenda. Data were collected by reviewing the teaching and learning plans of 8 faculties of an institution of higher learning in South Africa and analysed through content and textual analysis. The findings revealed varied understanding and use of these concepts in the transformation of the curriculum across faculties. Decolonisation, according to the faculties of Law and Humanities, is perceived as the eradication of the Eurocentric positioning in curriculum content and the constitutive rules and norms that control thinking. This is not done by ignoring other knowledge traditions but does call for an affirmation and validation of African views of the world and systems of thought, mixing it with current knowledge. For the Faculty of Natural and Agricultural Sciences, decolonisation is seen as making the content of the curriculum relevant to students, fulfilling the needs of industry and equipping students for job opportunities. This means the use of teaching strategies and methods that are inclusive of students from diverse cultures, and to structure the learning experience in ways that are not alien to the cultures of the students. For the Health Sciences, decolonisation of the curriculum refers to the need for a shift in Western thinking towards being more sensitive to all cultural beliefs and thoughts. Collectively, decolonisation of education thus entails that a nation must become independent with regard to the acquisition of knowledge, skills, values, beliefs, and habits. Based on the findings, for universities to successfully transform their curriculum and integrate the concepts of decolonisation and Africanisation, there is a need to contextually determine the meaning of the concepts generally and narrow them down to what they should mean to specific disciplines. Universities should refrain from considering an umbrella approach to these concepts. Decolonisation should be seen as a means and not an end. A decolonised curriculum should equally be developed based on the finest knowledge skills, values, beliefs and habits around the world and not limited to one country or continent.Keywords: Africanisation, curriculum, transformation, decolonisation, multidisciplinary perspectives, South Africa’s higher education
Procedia PDF Downloads 160
2671 The Effects of Urbanization on Peri-Urban Livelihood in Ghana: A Case of Kumasi Peri-Urban Communities
Authors: Charles Kwaku Oppong
Abstract:
The research linked urban expansion resulting from urbanization with the changing morphological processes happening in peri-urban communities. Two villages in Kumasi City's peri-urban area were used as a case study. An appropriate analytical framework and methodology (literature review and empirical evidence) were employed to ensure that all pertinent issues of the peri-urban interface were brought to light. The study found that, since peri-urban livelihoods are linked to an asset base, the stock of assets as well as transformation processes were major factors in the shaping of livelihood strategies. For that reason, the success or failure of household livelihoods was seen to relate to the kind of livelihood strategy employed. In an effort to mitigate livelihood failure due to peri-urban development, households have recourse to remittances, land disposal, and other means as alternative livelihood approaches. The study calls for local government policy interventions in regulating the peri-urban transformation process and providing safety nets for the vulnerable. Keywords: urban expansion, peri-urban interface, livelihoods, asset
Procedia PDF Downloads 265
2670 Proactive Change or Adaptive Response: A Study on the Impact of Digital Transformation Strategy Modes on Enterprise Profitability From a Configuration Perspective
Authors: Jing-Ma
Abstract:
Digital transformation (DT) is an important way for manufacturing enterprises to shape new competitive advantages, and how to choose an effective DT strategy is crucial for enterprise growth and sustainable development. Rooted in strategic change theory, this paper incorporates the dimensions of managers' digital cognition, organizational conditions, and external environment into the same strategic analysis framework and integrates the dynamic QCA method and PSM method to study the antecedent grouping of the DT strategy mode of manufacturing enterprises and its impact on corporate profitability based on the data of listed manufacturing companies in China from 2015 to 2019. We find that the synergistic linkage of different dimensional elements can form six equivalent paths of high-level DT, which can be summarized as the proactive change mode of resource-capability dominated as well as adaptive response mode such as industry-guided resource replenishment. Capacity building under complex environments, market-industry synergy-driven, forced adaptation under peer pressure, and the managers' digital cognition play a non-essential but crucial role in this process. Except for individual differences in the market industry collaborative driving mode, other modes are more stable in terms of individual and temporal changes. However, it is worth noting that not all paths that result in high levels of DT can contribute to enterprise profitability, but only high levels of DT that result from matching the optimization of internal conditions with the external environment, such as industry technology and macro policies, can have a significant positive impact on corporate profitability.Keywords: digital transformation, strategy mode, enterprise profitability, dynamic QCA, PSM approach
Procedia PDF Downloads 24
2669 Effect of Co-doping on Polycrystalline Ni-Mn-Ga
Authors: Mahsa Namvari, Kari Ullakko
Abstract:
It is well known that Co-doping of ferromagnetic shape memory alloys (FSMAs) is a crucial tool for controlling their multifunctional properties. The present work investigates the use of small quantities of Co to fine-tune the transformation, structure, microstructure, and mechanical and magnetic properties of the polycrystalline Ni₄₉.₈Mn₂₈.₅Ga₂₁.₇ (at.%) alloy. At Co concentrations of 1-1.5 at.%, a microstructure with an average grain size of about 2.00 mm was formed with a twin structure, enabling the experimental observation of magnetic-field-induced twin variant rearrangement. At higher levels of Co-doping, the grain size was substantially reduced, and the crystal structure of the martensitic phase became 2M martensite. The decreasing grain size and changing crystal structure are attributed to the progressive formation of γ-phase precipitates. Alongside the academic aspect, the results of the present work point to the commercial advantage of fabricating 10M Co-doped Ni-Mn-Ga actuating elements from large-grained polycrystalline ingots obtained in a standard melting facility instead of from grown single crystals. Keywords: Ni-Mn-Ga, ferromagnetic shape memory, martensitic phase transformation, grain growth
Procedia PDF Downloads 95
2668 A Framework for Auditing Multilevel Models Using Explainability Methods
Authors: Debarati Bhaumik, Diptish Dey
Abstract:
Multilevel models, increasingly deployed in industries such as insurance, food production, and entertainment within functions such as marketing and supply chain management, need to be transparent and ethical. Applications usually result in binary classification within groups or hierarchies based on a set of input features. Using open-source datasets, we demonstrate that popular explainability methods, such as SHAP and LIME, consistently underperform in accuracy when interpreting these models. They fail to predict the order of feature importance, the magnitudes, and occasionally even the nature of the feature contribution (negative versus positive contribution to the outcome). Besides accuracy, the computational intractability of SHAP for binomial classification is a cause of concern. For transparent and ethical applications of these hierarchical statistical models, sound audit frameworks need to be developed. In this paper, we propose an audit framework for technical assessment of multilevel regression models focusing on three aspects: (i) model assumptions & statistical properties, (ii) model transparency using different explainability methods, and (iii) discrimination assessment. To this end, we undertake a quantitative approach and compare intrinsic model methods with SHAP and LIME. The framework comprises a shortlist of KPIs, such as PoCE (Percentage of Correct Explanations) and MDG (Mean Discriminatory Gap) per feature, for each of these three aspects. A traffic light risk assessment method is furthermore coupled to these KPIs. The audit framework will assist regulatory bodies in performing conformity assessments of AI systems using multilevel binomial classification models at businesses. It will also benefit businesses deploying multilevel models to be future-proof and aligned with the European Commission’s proposed Regulation on Artificial Intelligence. Keywords: audit, multilevel model, model transparency, model explainability, discrimination, ethics
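One way to probe the transparency aspect described above is to compare a model's intrinsic feature importances with SHAP attributions. The sketch below uses a plain logistic regression on an open-source dataset instead of a true multilevel model, and a Spearman rank correlation as a rough stand-in for a PoCE-style KPI; it is not the authors' framework.

```python
import numpy as np
import shap
from scipy.stats import spearmanr
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

# Open-source stand-in data; a true multilevel (hierarchical) model would add
# group-level random effects, which this simple sketch omits.
X, y = load_breast_cancer(return_X_y=True)
X = StandardScaler().fit_transform(X)
model = LogisticRegression(max_iter=5000).fit(X, y)

# Intrinsic importance: magnitude of the standardized coefficients.
intrinsic = np.abs(model.coef_[0])

# Post-hoc importance from SHAP (mean absolute attribution per feature).
explainer = shap.Explainer(model, X)
shap_importance = np.abs(explainer(X).values).mean(axis=0)

# Agreement between the two importance orderings; a proxy for a
# "percentage of correct explanations"-style transparency KPI.
rho, _ = spearmanr(intrinsic, shap_importance)
print("rank agreement (Spearman rho):", round(rho, 3))
```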
Procedia PDF Downloads 94
2667 Machine Vision System for Measuring the Quality of Bulk Sun-dried Organic Raisins
Authors: Navab Karimi, Tohid Alizadeh
Abstract:
An intelligent vision-based system was designed to measure the quality and purity of raisins. A machine vision setup was utilized to capture images of bulk raisins in mixtures ranging from 5-50% mixed pure-impure berries. The textural features of the bulk raisins were extracted using grey-level histograms, the co-occurrence matrix, and the Local Binary Pattern (a total of 108 features). A Genetic Algorithm and neural network regression were used for selecting and ranking the best features (21 features). As a result, the GLCM feature set was found to have the highest accuracy (92.4%) among the sets. Subsequently, multiple feature combinations from the previous stage were fed into a second regression (linear regression) to increase accuracy, wherein a combination of 16 features was found to be optimal. Finally, a Support Vector Machine (SVM) classifier was used to differentiate the mixtures, producing the best efficiency and accuracy of 96.2% and 97.35%, respectively. Keywords: sun-dried organic raisin, genetic algorithm, feature extraction, ANN regression, linear regression, support vector machine, South Azerbaijan
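A compact sketch of the GLCM-plus-SVM portion of this pipeline is shown below, using scikit-image and scikit-learn; the images and labels are random placeholders, and the genetic-algorithm feature selection and regression stages are omitted.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def glcm_features(gray_image):
    """Texture descriptors from a grey-level co-occurrence matrix."""
    glcm = graycomatrix(gray_image, distances=[1, 3], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    props = ("contrast", "homogeneity", "energy", "correlation", "dissimilarity")
    return np.hstack([graycoprops(glcm, p).ravel() for p in props])

# Placeholder data: 8-bit grey images of bulk raisins and their mixture labels.
images = [np.random.randint(0, 256, (128, 128), dtype=np.uint8) for _ in range(40)]
labels = np.random.randint(0, 2, 40)            # e.g. pure vs. impure mixture

X = np.array([glcm_features(img) for img in images])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X, labels)
```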
Procedia PDF Downloads 73
2666 Integrated Intensity and Spatial Enhancement Technique for Color Images
Authors: Evan W. Krieger, Vijayan K. Asari, Saibabu Arigela
Abstract:
Video imagery for real-time security and surveillance applications is typically captured in complex lighting conditions. These less-than-ideal conditions can result in imagery with underexposed or overexposed regions. It is also typical that the video is too low in resolution for certain applications. The purpose of security and surveillance video is to allow accurate conclusions to be drawn from the images seen in the video. Therefore, if poor lighting and low resolution occur in the captured video, the ability to make accurate conclusions based on the received information will be reduced. We propose a solution to this problem by using image preprocessing to improve these images before use in a particular application. The proposed algorithm integrates an intensity enhancement algorithm with a super resolution technique. The intensity enhancement portion consists of a nonlinear inverse sign transformation and an adaptive contrast enhancement. The super resolution portion is a single-image super resolution technique: a Fourier phase feature based method that uses a machine learning approach with kernel regression. The proposed technique intelligently integrates these algorithms to produce a high quality output while also being more efficient than the sequential use of the algorithms. This integration is accomplished by performing the proposed algorithm on the intensity image produced from the original color image. After enhancement and super resolution, a color restoration technique is employed to obtain an improved-visibility color image. Keywords: dynamic range compression, multi-level Fourier features, nonlinear enhancement, super resolution
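The following OpenCV sketch illustrates the general shape of such a pipeline: enhance the intensity channel in HSV space, restore colour, then upscale. The transfer curve and the bicubic upscaling are simple stand-ins, since the paper's specific nonlinear transformation and kernel-regression super-resolution are not specified in the abstract.

```python
import cv2
import numpy as np

def enhance_intensity(bgr_image, strength=0.6):
    """Illustrative luminance enhancement with colour restoration via HSV."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV).astype(np.float32)
    h, s, v = cv2.split(hsv)
    v_norm = v / 255.0
    # Simple nonlinear transfer curve lifting dark regions while leaving
    # highlights nearly unchanged (a stand-in for the paper's transformation).
    v_enh = np.power(v_norm, 1.0 - strength * (1.0 - v_norm))
    hsv_enh = cv2.merge([h, s, np.clip(v_enh * 255.0, 0, 255)])
    enhanced = cv2.cvtColor(hsv_enh.astype(np.uint8), cv2.COLOR_HSV2BGR)
    # Cheap 2x upscaling as a placeholder for the learned super-resolution step.
    return cv2.resize(enhanced, None, fx=2, fy=2, interpolation=cv2.INTER_CUBIC)

frame = cv2.imread("surveillance_frame.png")     # placeholder input frame
result = enhance_intensity(frame)
```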
Procedia PDF Downloads 554
2665 Thermo-Mechanical Processing of Armor Steel Plates
Authors: Taher El-Bitar, Maha El-Meligy, Eman El-Shenawy, Almosilhy Almosilhy, Nader Dawood
Abstract:
The steel contains 0.3% C and 0.004% B, beside Mn, Cr, Mo, and Ni. The alloy was processed by using 20-ton capacity electric arc furnace (EAF), and then refined by ladle furnace (LF). Liquid steel was cast as rectangular ingots. Dilatation test showed the critical transformation temperatures Ac1, Ac3, Ms and Mf as 716, 835, 356, and 218 °C. The ingots were austenitized and soaked and then rough rolled to thin slabs with 80 mm thickness. The thin slabs were then reheated and soaked for finish rolling to 6.0 mm thickness plates. During the rough rolling, the roll force increases as a result of rolling at temperatures less than recrystallization temperature. However, during finish rolling, the steel reflects initially continuous static recrystallization after which it shows strain hardening due to fall of temperature. It was concluded that, the steel plates were successfully heat treated by quenching-tempering at 250 ºC for 20 min.Keywords: armor steel, austenitizing, critical transformation temperatures (CTTs), dilatation curve, martensite, quenching, rough and finish rolling processes, soaking, tempering, thermo-mechanical processing
Procedia PDF Downloads 347
2664 The Customization of 3D Last Form Design Based on Weighted Blending
Authors: Shih-Wen Hsiao, Chu-Hsuan Lee, Rong-Qi Chen
Abstract:
The last is regarded as the critical foundation of shoe design and development. Not only does the last relate to wearing comfort, but it also supports shoe styling and manufacturing. In order to enhance the efficiency and application of last development, a computer-aided methodology for customized last form design is proposed in this study. Reverse engineering is applied mainly to the scanning of the last form. Minimum energy is then used to revise surface continuity, and the surface of the last is reconstructed from the feature curves of the scanned last. Once the surface of a last is reconstructed on the basis of the proposed last form reconstruction module, the weighted arithmetic mean method is applied to the shape morphing calculation, which differs from grading the control mesh of the last, and a subdivision algorithm is used to create the surface of the last mesh; thus, a foot-fitting 3D last form of different sizes is generated from the original form features while retaining its functions. Finally, the practicability of the proposed methodology is verified through case studies. Keywords: 3D last design, customization, reverse engineering, weighted morphing, shape blending
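The weighted arithmetic mean step can be sketched directly in NumPy, assuming the source lasts have already been brought into vertex correspondence; the meshes and weights below are placeholders.

```python
import numpy as np

def blend_lasts(vertex_sets, weights):
    """Weighted arithmetic mean of corresponding control vertices.

    vertex_sets: list of (N, 3) arrays sampled from topologically identical
    last meshes; weights: one blending weight per source last.
    """
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()              # normalise to sum to 1
    stacked = np.stack(vertex_sets, axis=0)        # (n_lasts, N, 3)
    return np.tensordot(weights, stacked, axes=1)  # (N, 3) blended control mesh

# Placeholder: two scanned last meshes with matching vertex ordering, blended
# 70/30 towards a customer's foot-fitting target form.
last_a = np.random.rand(1000, 3)
last_b = np.random.rand(1000, 3)
custom_last = blend_lasts([last_a, last_b], weights=[0.7, 0.3])
```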
Procedia PDF Downloads 339
2663 Cooling Profile Analysis of Hot Strip Coil Using Finite Volume Method
Authors: Subhamita Chakraborty, Shubhabrata Datta, Sujay Kumar Mukherjea, Partha Protim Chattopadhyay
Abstract:
The manufacturing of multiphase high-strength steel in a hot strip mill has drawn significant attention due to the possibility of forming low-temperature transformation products of austenite under continuous cooling conditions. In such an endeavor, reliable prediction of the temperature profile of the hot strip coil is essential in order to assess the evolution of microstructure at different locations of the coil on the basis of the corresponding Continuous Cooling Transformation (CCT) diagram. The temperature distribution profile of the hot strip coil has been determined using the finite volume method (FVM) vis-à-vis the finite difference method (FDM). It has been demonstrated that FVM offers greater computational reliability in the estimation of the contact pressure distribution, and hence the temperature distribution, for curved and irregular profiles, owing to the flexibility in the selection of grid geometry and discrete point positions. Moreover, use of the finite volume concept allows enforcing the conservation of mass, momentum and energy, leading to enhanced prediction accuracy. Keywords: simulation, modeling, thermal analysis, coil cooling, contact pressure, finite volume method
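For orientation, a minimal explicit finite-volume update for one-dimensional transient conduction is sketched below; the material properties, geometry and boundary conditions are illustrative and do not reproduce the coil model or contact-pressure coupling described in the paper.

```python
import numpy as np

# Explicit finite-volume update for 1-D transient conduction; properties and
# geometry are illustrative placeholders, not the paper's coil model.
k, rho, cp = 45.0, 7850.0, 600.0        # W/mK, kg/m3, J/kgK (plain C-Mn steel)
alpha = k / (rho * cp)
L, n_cells = 0.05, 50                   # 50 mm strip stack, 50 control volumes
dx = L / n_cells
dt = 0.4 * dx**2 / alpha                # keep the explicit scheme stable

T = np.full(n_cells, 900.0)             # initial coiling temperature, deg C
T_ambient, h = 30.0, 60.0               # convective boundary, W/m2K

for _ in range(20000):
    flux = -k * np.diff(T) / dx                          # interior face fluxes
    dTdt = np.zeros_like(T)
    dTdt[1:-1] = -(flux[1:] - flux[:-1]) / (rho * cp * dx)
    # Convective faces on the two outer control volumes.
    q_left = h * (T[0] - T_ambient)
    q_right = h * (T[-1] - T_ambient)
    dTdt[0] = (-q_left - flux[0]) / (rho * cp * dx)
    dTdt[-1] = (flux[-1] - q_right) / (rho * cp * dx)
    T += dt * dTdt
```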
Procedia PDF Downloads 472
2662 Analyzing Apposition and the Typology of Specific Reference in Newspaper Discourse in Nigeria
Authors: Monday Agbonica Bello Eje
Abstract:
The language of the print media is characterized by the use of apposition. This linguistic element functions strategically in journalistic discourse, where it is communicatively necessary to name individuals and provide information about them. Linguistic studies on the language of the print media with a bias for apposition have largely dwelt on areas other than the examination of the typology of appositive reference in newspaper discourse. Yet such examination is capable of revealing the ways writers communicate and provide the information necessary for readers to follow and understand the message. The study, therefore, analyses the patterns of appositional occurrences and the typology of reference in newspaper articles. The data were obtained from The Punch and Daily Trust newspapers. A total of six editions of these newspapers were collected randomly, spread over three months. News and feature articles were used in the analysis. Guided by the referential theory of meaning in discourse, the appositions identified were subjected to analysis. The findings show that the semantic relations of coreference and speaker coreference have the highest percentage and frequency of occurrence in the data. This is because the subject matter of news reports and feature articles focuses on humans and the events around them; as a result, readers need to be provided with some form of detail and background information in order to identify as well as follow the discourse. Also, the non-referential relations of absolute synonymy and speaker synonymy have fewer occurrences and lower percentages in the analysis. This is tied to a major feature of the language of the media: simplicity. The paper concludes that apposition is mainly used for the purpose of providing the reader with detail. In this way, the writer transmits information that helps him not only to give detailed yet concise descriptions but also, in some way, to help the reader follow the discourse. Keywords: apposition, discourse, newspaper, Nigeria, reference
Procedia PDF Downloads 173
2661 Structural Parameter-Induced Focusing Pattern Transformation in CEA Microfluidic Device
Authors: Xin Shi, Wei Tan, Guorui Zhu
Abstract:
The contraction-expansion array (CEA) microfluidic device is widely used for particle focusing and particle separation. Without the introduction of external fields, it can manipulate particles using hydrodynamic forces, including inertial lift forces and Dean drag forces. The focusing pattern of the particles in a CEA channel can be affected by the structural parameter, block ratio, and flow streamlines. Here, two typical focusing patterns with five different structural parameters were investigated, and the force mechanism was analyzed. We present nine CEA channels with different aspect ratios based on the process of changing the particle equilibrium positions. The results show that 10-15 μm particles have the potential to generate a side focusing line as the structural parameter (¬R𝓌) increases. For a determined channel structure and target particles, when the Reynolds number (Rₑ) exceeds the critical value, the focusing pattern will transform from a single pattern to a double pattern. The parameter α/R𝓌 can be used to calculate the critical Reynolds number for the focusing pattern transformation. The results can provide guidance for microchannel design and biomedical analysis.Keywords: microfluidic, inertial focusing, particle separation, Dean flow
Procedia PDF Downloads 79
2660 Rogue Waves Arising on the Standing Periodic Wave in the High-Order Ablowitz-Ladik Equation
Authors: Yanpei Zhen
Abstract:
The nonlinear Schrödinger (NLS) equation models wave dynamics in many physical problems related to fluids, plasmas, and optics. The standing periodic waves are known to be modulationally unstable, and rogue waves (localized perturbations in space and time) have been observed on their backgrounds in numerical experiments. The exact solutions for rogue waves arising on the periodic standing waves have been obtained analytically. It is natural to ask if the rogue waves persist on the standing periodic waves in the integrable discretizations of the integrable NLS equation. We study the standing periodic waves in the semidiscrete integrable system modeled by the high-order Ablowitz-Ladik (AL) equation. The standing periodic wave of the high-order AL equation is expressed by the Jacobi cnoidal elliptic function. The exact solutions are obtained by using the separation of variables and one-fold Darboux transformation. Since the cnoidal wave is modulationally unstable, the rogue waves are generated on the periodic background.Keywords: Darboux transformation, periodic wave, Rogue wave, separating the variables
Procedia PDF Downloads 183
2659 Capturing the Stress States in Video Conferences by Photoplethysmographic Pulse Detection
Authors: Jarek Krajewski, David Daxberger
Abstract:
We propose a stress detection method based on an RGB camera using heart rate detection, also known as Photoplethysmography Imaging (PPGI). This technique focuses on the measurement of the small changes in skin colour caused by blood perfusion. A stationary lab setting with simulated video conferences is chosen using constant light conditions and a sampling rate of 30 fps. The ground truth measurement of heart rate is conducted with a common PPG system. The proposed approach for pulse peak detection is based on a machine learning-based approach, applying brute force feature extraction for the prediction of heart rate pulses. The statistical analysis showed good agreement (correlation r = .79, p<0.05) between the reference heart rate system and the proposed method. Based on these findings, the proposed method could provide a reliable, low-cost, and contactless way of measuring HR parameters in daily-life environments.Keywords: heart rate, PPGI, machine learning, brute force feature extraction
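A classical signal-processing baseline for the same measurement idea — extracting a pulse wave from a skin-region colour trace and estimating heart rate from peak intervals — is sketched below; the machine-learning peak detector with brute-force features used in the study is not shown, and the input file is a placeholder.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

fps = 30.0
# Placeholder: mean green-channel value of the facial skin region per frame.
green_trace = np.load("roi_green_means.npy")

# Band-pass around plausible heart rates (0.7-4 Hz, roughly 42-240 bpm).
b, a = butter(3, [0.7 / (fps / 2), 4.0 / (fps / 2)], btype="bandpass")
pulse = filtfilt(b, a, green_trace - green_trace.mean())

# Peak-to-peak intervals give the heart rate estimate.
peaks, _ = find_peaks(pulse, distance=fps / 4.0)
ibi = np.diff(peaks) / fps                      # inter-beat intervals, seconds
print("mean HR:", 60.0 / ibi.mean(), "bpm")
```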
Procedia PDF Downloads 123
2658 Machine Learning for Feature Selection and Classification of Systemic Lupus Erythematosus
Authors: H. Zidoum, A. AlShareedah, S. Al Sawafi, A. Al-Ansari, B. Al Lawati
Abstract:
Systemic lupus erythematosus (SLE) is an autoimmune disease with genetic and environmental components. SLE is characterized by a wide variability of clinical manifestations and a course frequently subject to unpredictable flares. Despite recent progress in classification tools, the early diagnosis of SLE is still an unmet need for many patients. This study proposes an interpretable disease classification model that combines the high and efficient predictive performance of CatBoost with the model-agnostic interpretation tools of Shapley Additive exPlanations (SHAP). The CatBoost model was trained on a local cohort of 219 Omani patients with SLE as well as other control diseases. Furthermore, the SHAP library was used to generate individual explanations of the model's decisions and to rank clinical features by contribution. Overall, we achieved an AUC score of 0.945 and an F1-score of 0.92, and identified four clinical features (alopecia, renal disorders, cutaneous lupus, and hemolytic anemia) which, along with the patient's age, were shown to have the greatest contribution to the prediction. Keywords: feature selection, classification, systemic lupus erythematosus, model interpretation, SHAP, Catboost
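A hedged sketch of the CatBoost-plus-SHAP workflow follows; the cohort file, label column and hyper-parameters are placeholders, since the Omani dataset is not public.

```python
import pandas as pd
import shap
from catboost import CatBoostClassifier, Pool
from sklearn.model_selection import train_test_split

# Placeholder cohort file with numeric clinical features and a binary label.
df = pd.read_csv("sle_cohort.csv")
X, y = df.drop(columns=["diagnosis"]), df["diagnosis"]
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# Categorical columns, if any, would additionally need cat_features=...
model = CatBoostClassifier(iterations=500, depth=6, eval_metric="AUC", verbose=0)
model.fit(X_train, y_train, eval_set=(X_test, y_test))

# Model-agnostic explanations: per-patient contributions and a global ranking.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(Pool(X_test, y_test))
shap.summary_plot(shap_values, X_test)          # global feature ranking
```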
Procedia PDF Downloads 83
2657 Coastal Modelling Studies for Jumeirah First Beach Stabilization
Authors: Zongyan Yang, Gagan K. Jena, Sankar B. Karanam, Noora M. A. Hokal
Abstract:
Jumeirah First beach, a segment of coastline of length 1.5 km, is one of the popular public beaches in Dubai, UAE. The stability of the beach has been affected by several coastal developmental projects, including The World, Island 2 and La Mer. A comprehensive stabilization scheme comprising of two composite groynes (of lengths 90 m and 125m), modification to the northern breakwater of Jumeirah Fishing Harbour and beach re-nourishment was implemented by Dubai Municipality in 2012. However, the performance of the implemented stabilization scheme has been compromised by La Mer project (built in 2016), which modified the wave climate at the Jumeirah First beach. The objective of the coastal modelling studies is to establish design basis for further beach stabilization scheme(s). Comprehensive coastal modelling studies had been conducted to establish the nearshore wave climate, equilibrium beach orientations and stable beach plan forms. Based on the outcomes of the modeling studies, recommendation had been made to extend the composite groynes to stabilize the Jumeirah First beach. Wave transformation was performed following an interpolation approach with wave transformation matrixes derived from simulations of a possible range of wave conditions in the region. The Dubai coastal wave model is developed with MIKE21 SW. The offshore wave conditions were determined from PERGOS wave data at 4 offshore locations with consideration of the spatial variation. The lateral boundary conditions corresponding to the offshore conditions, at Dubai/Abu Dhabi and Dubai Sharjah borders, were derived with application of LitDrift 1D wave transformation module. The Dubai coastal wave model was calibrated with wave records at monitoring stations operated by Dubai Municipality. The wave transformation matrix approach was validated with nearshore wave measurement at a Dubai Municipality monitoring station in the vicinity of the Jumeirah First beach. One typical year wave time series was transformed to 7 locations in front of the beach to count for the variation of wave conditions which are affected by adjacent and offshore developments. Equilibrium beach orientations were estimated with application of LitDrift by finding the beach orientations with null annual littoral transport at the 7 selected locations. The littoral transport calculation results were compared with beach erosion/accretion quantities estimated from the beach monitoring program (twice a year including bathymetric and topographical surveys). An innovative integral method was developed to outline the stable beach plan forms from the estimated equilibrium beach orientations, with predetermined minimum beach width. The optimal lengths for the composite groyne extensions were recommended based on the stable beach plan forms.Keywords: composite groyne, equilibrium beach orientation, stable beach plan form, wave transformation matrix
Procedia PDF Downloads 263
2656 Transforming Maternity and Neonatal Services in a Middle Eastern Country
Authors: M. A. Brown, K. Hugill, D. Meredith
Abstract:
Since the establishment of midwifery, as a professional identity in its own right, in the early years of the 20th century, midwifery-led models of childbirth have prevailed in many parts of the world. However, in many locations midwives’ scope of practice remains underdeveloped or absent. In Qatar, all births take place in hospital and are under the professional jurisdiction of obstetricians, predominately supported by internationally trained nurse-midwives and obstetric nurses. The strategic vision for health services in Qatar endorsed a desire to provide women with the ‘Best Care Always’ and the introduction of midwifery was seen as a way to achieve this. In 2015 the process of recruiting postgraduate educated Clinical Midwife Specialists from international sources began. The midwives were brought together to initiate an in hospital and community service transformation plan. This plan set out a series of wide-ranging actions to transform maternity and neonatal services to make care safer and give women more health choices. Change in any organization is a complex and dynamic process. This is made even more complex when multifaceted professional and cross cultural factors are involved. This presentation reports upon the motivations and challenges that exist and the progress around introducing a multicultural midwifery model of childbirth care in the state of Qatar. The paper examines and reflects upon the drivers and unique features of childbirth in the country. Despite accomplishments, progress still needs to be made in order to fully implement sustainable changes to further improve care and ensure women and neonates get the ‘Best Care Always’. The progress within the transformation plan highlights how midwifery may coexist with competing models of maternity care to create an innovative, eclectic and culturally sensitive paradigm that can best serve women and neonatal health needs.Keywords: culture, managing change, midwifery, neonatal, service transformation plan
Procedia PDF Downloads 148
2655 Feature Extraction Based on Contourlet Transform and Log Gabor Filter for Detection of Ulcers in Wireless Capsule Endoscopy
Authors: Nimisha Elsa Koshy, Varun P. Gopi, V. I. Thajudin Ahamed
Abstract:
Complete visualization of the gastrointestinal (GI) tract is not possible with conventional endoscopic exams. Wireless Capsule Endoscopy (WCE) is a low-risk, painless, noninvasive procedure for diagnosing diseases such as bleeding, polyps, ulcers, and Crohn's disease within the human digestive tract, especially the small intestine, which was unreachable using traditional endoscopic methods. However, analysis of the massive number of WCE images is tedious and time consuming for physicians. Hence, researchers have developed software methods to detect these diseases automatically, so that the effectiveness of WCE can be improved. In this paper, a novel textural feature extraction method based on the Contourlet transform and the Log Gabor filter is proposed to distinguish ulcer regions from normal regions. The results show that the proposed method performs well, with a high accuracy rate of 94.16% using a Support Vector Machine (SVM) classifier in HSV colour space. Keywords: contourlet transform, log gabor filter, ulcer, wireless capsule endoscopy
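The radial log-Gabor filter can be built directly in the frequency domain as below; this sketch covers only the log-Gabor half (a standard contourlet implementation is not readily available in Python), and the filter parameters and feature statistics are illustrative, not the paper's.

```python
import numpy as np

def log_gabor_radial(shape, f0=0.1, sigma_ratio=0.65):
    """Radial component of a log-Gabor filter in the frequency domain."""
    rows, cols = shape
    u = np.fft.fftfreq(cols)
    v = np.fft.fftfreq(rows)
    radius = np.sqrt(u[None, :] ** 2 + v[:, None] ** 2)
    radius[0, 0] = 1.0                           # avoid log(0) at the DC term
    lg = np.exp(-(np.log(radius / f0) ** 2) / (2 * np.log(sigma_ratio) ** 2))
    lg[0, 0] = 0.0                               # zero DC response
    return lg

def log_gabor_response(gray):
    """Magnitude response used as a texture map for one image channel."""
    spectrum = np.fft.fft2(gray.astype(float))
    return np.abs(np.fft.ifft2(spectrum * log_gabor_radial(gray.shape)))

# Feature vector example: response statistics inside a candidate region, later
# concatenated with contourlet statistics and classified with an SVM.
patch = np.random.rand(64, 64)                   # placeholder HSV-channel patch
resp = log_gabor_response(patch)
features = [resp.mean(), resp.std()]
```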
Procedia PDF Downloads 540
2654 Hybrid Deep Learning and FAST-BRISK 3D Object Detection Technique for Bin-Picking Application
Authors: Thanakrit Taweesoontorn, Sarucha Yanyong, Poom Konghuayrob
Abstract:
Robotic arms have gained popularity in various industries due to their accuracy and efficiency. This research proposes a method for bin-picking tasks using the Cobot, combining the YOLOv5 CNNs model for object detection and pose estimation with traditional feature detection (FAST), feature description (BRISK), and matching algorithms. By integrating these algorithms and utilizing a small-scale depth sensor camera for capturing depth and color images, the system achieves real-time object detection and accurate pose estimation, enabling the robotic arm to pick objects correctly in both position and orientation. Furthermore, the proposed method is implemented within the ROS framework to provide a seamless platform for robotic control and integration. This integration of robotics, cameras, and AI technology contributes to the development of industrial robotics, opening up new possibilities for automating challenging tasks and improving overall operational efficiency.Keywords: robotic vision, image processing, applications of robotics, artificial intelligent
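A minimal OpenCV sketch of the FAST detection, BRISK description and Hamming matching stage is given below; the image files, thresholds and ratio test are placeholders, and the YOLOv5 detection, depth handling and pose-estimation steps are omitted.

```python
import cv2

# Template of a registered object view and the current scene crop returned by
# the object detector (file names are placeholders).
template = cv2.imread("object_template.png", cv2.IMREAD_GRAYSCALE)
scene = cv2.imread("detected_crop.png", cv2.IMREAD_GRAYSCALE)

fast = cv2.FastFeatureDetector_create(threshold=25)      # keypoint detection
brisk = cv2.BRISK_create()                                # binary descriptors

kp1 = fast.detect(template, None)
kp2 = fast.detect(scene, None)
kp1, des1 = brisk.compute(template, kp1)
kp2, des2 = brisk.compute(scene, kp2)

# Hamming-distance matching with a ratio test to reject ambiguous matches.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
        if m.distance < 0.8 * n.distance]

# The matched keypoint pairs would then feed a homography/PnP step to recover
# the grasp pose for the robot arm.
print(len(good), "good matches")
```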
Procedia PDF Downloads 96
2653 Sentiment Analysis of Fake Health News Using Naive Bayes Classification Models
Authors: Danielle Shackley, Yetunde Folajimi
Abstract:
As more people turn to the internet seeking health-related information, there is more risk of finding false, inaccurate, or dangerous information. Sentiment analysis is a natural language processing technique that assigns polarity scores to text, ranging from positive, neutral, and negative. In this research, we evaluate the weight of a sentiment analysis feature added to fake health news classification models. The dataset consists of existing reliably labeled health article headlines that were supplemented with health information collected about COVID-19 from social media sources. We started with data preprocessing and tested out various vectorization methods such as Count and TFIDF vectorization. We implemented 3 Naive Bayes classifier models, including Bernoulli, Multinomial, and Complement. To test the weight of the sentiment analysis feature on the dataset, we created benchmark Naive Bayes classification models without sentiment analysis, and those same models were reproduced, and the feature was added. We evaluated using the precision and accuracy scores. The Bernoulli initial model performed with 90% precision and 75.2% accuracy, while the model supplemented with sentiment labels performed with 90.4% precision and stayed constant at 75.2% accuracy. Our results show that the addition of sentiment analysis did not improve model precision by a wide margin; while there was no evidence of improvement in accuracy, we had a 1.9% improvement margin of the precision score with the Complement model. Future expansion of this work could include replicating the experiment process and substituting the Naive Bayes for a deep learning neural network model.Keywords: sentiment analysis, Naive Bayes model, natural language processing, topic analysis, fake health news classification model
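A hedged sketch of adding a sentiment score as an extra feature to Naive Bayes text classifiers follows; it assumes NLTK's VADER analyzer (with the vader_lexicon resource downloaded) as the sentiment scorer and a placeholder CSV of labelled headlines, which may differ from the authors' setup.

```python
import pandas as pd
from scipy.sparse import hstack, csr_matrix
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import train_test_split
from sklearn.metrics import precision_score, accuracy_score
from sklearn.naive_bayes import BernoulliNB, MultinomialNB, ComplementNB
from nltk.sentiment import SentimentIntensityAnalyzer   # VADER, one possible scorer

# Placeholder corpus: labelled health headlines (1 = fake, 0 = reliable).
df = pd.read_csv("health_headlines.csv")
vec = TfidfVectorizer(stop_words="english")
X_text = vec.fit_transform(df["headline"])

# Extra feature column: compound sentiment polarity in [-1, 1], shifted to be
# non-negative so it stays valid for the multinomial-style models.
sia = SentimentIntensityAnalyzer()
sentiment = df["headline"].apply(lambda h: sia.polarity_scores(h)["compound"] + 1.0)
X = hstack([X_text, csr_matrix(sentiment.to_numpy().reshape(-1, 1))])

X_tr, X_te, y_tr, y_te = train_test_split(X, df["label"], stratify=df["label"],
                                          random_state=0)
for model in (BernoulliNB(), MultinomialNB(), ComplementNB()):
    pred = model.fit(X_tr, y_tr).predict(X_te)
    print(type(model).__name__,
          "precision:", precision_score(y_te, pred),
          "accuracy:", accuracy_score(y_te, pred))
```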
Procedia PDF Downloads 97
2652 Algorithm Research on Traffic Sign Detection Based on Improved EfficientDet
Authors: Ma Lei-Lei, Zhou You
Abstract:
To address the low detection accuracy of deep learning algorithms in traffic sign detection, this paper proposes an improved EfficientDet-based traffic sign detection algorithm. Multi-head self-attention is introduced in the minimum-resolution layer of the EfficientDet backbone to achieve effective aggregation of local and global depth information. This study also proposes an improved feature fusion pyramid with additional vertical cross-layer connections, which improves the performance of the model while introducing only a small amount of complexity. Finally, the Balanced L1 Loss is introduced to replace the original regression loss function, Smooth L1 Loss, which addresses the balance problem in the loss function. Experimental results show that the algorithm proposed in this study is suitable for the task of traffic sign detection. Compared with other models, the improved EfficientDet has the best detection accuracy. Although its inference speed is not completely dominant, it still meets the real-time requirement. Keywords: convolutional neural network, transformer, feature pyramid networks, loss function
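The Balanced L1 loss referred to above originates from Libra R-CNN; a compact PyTorch version with the commonly used defaults (α = 0.5, γ = 1.5, β = 1) is sketched below — the hyper-parameters actually used in the paper are not stated in the abstract.

```python
import math
import torch

def balanced_l1_loss(pred, target, alpha=0.5, gamma=1.5, beta=1.0):
    """Element-wise Balanced L1 regression loss (Libra R-CNN)."""
    diff = torch.abs(pred - target)
    b = math.exp(gamma / alpha) - 1.0            # keeps the gradient continuous at beta
    small = alpha / b * (b * diff + 1) * torch.log(b * diff / beta + 1) - alpha * diff
    large = gamma * diff + gamma / b - alpha * beta
    return torch.where(diff < beta, small, large)

# Example: bounding-box regression residuals averaged into a scalar loss.
pred = torch.randn(8, 4)
target = torch.randn(8, 4)
loss = balanced_l1_loss(pred, target).mean()
```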
Procedia PDF Downloads 97
2651 Chemical and Physical Properties and Biocompatibility of Ti–6Al–4V Produced by Electron Beam Rapid Manufacturing and Selective Laser Melting for Biomedical Applications
Authors: Bing–Jing Zhao, Chang-Kui Liu, Hong Wang, Min Hu
Abstract:
Electron beam rapid manufacturing (EBRM) and selective laser melting (SLM) are additive manufacturing processes that use 3D CAD data as a digital information source and energy in the form of a high-power electron beam or laser beam to create three-dimensional metal parts by fusing fine metallic powders together. Objective: The present study was conducted to evaluate the mechanical properties, phase transformation, corrosion behaviour, and biocompatibility of Ti-6Al-4V produced by EBRM, SLM, and forging. Method: Ti-6Al-4V alloy standard test pieces were manufactured by EBRM, SLM, and forging according to AMS4999, GB/T228, and ISO 10993. The mechanical properties were analyzed with a universal testing machine. The phase transformation was analyzed by X-ray diffraction and scanning electron microscopy. The corrosion behaviour was analyzed by electrochemical methods. The biocompatibility was analyzed by co-culturing with mesenchymal stem cells, using scanning electron microscopy (SEM) and an alkaline phosphatase (ALP) assay to evaluate cell adhesion and differentiation, respectively. Results: The mechanical properties, phase transformation, corrosion behaviour, and biocompatibility of Ti-6Al-4V produced by EBRM and SLM were similar to those of the forged material and meet the mechanical property requirements of the AMS4999 standard. An α-phase microstructure was observed for the EBRM product, in contrast to the α′-phase microstructure of the SLM product. Mesenchymal stem cell adhesion and differentiation were good. Conclusion: The properties of the Ti-6Al-4V alloy manufactured by the EBRM and SLM techniques can meet the relevant medical standards according to this study, but further work is needed before the alloy can be applied reliably in clinical practice. Keywords: 3D printing, Electron Beam Rapid Manufacturing (EBRM), Selective Laser Melting (SLM), Computer Aided Design (CAD)
Procedia PDF Downloads 454