Search results for: piezo diaphragm based actuator
26218 Supply Air Pressure Control of HVAC System Using MPC Controller
Authors: P. Javid, A. Aeenmehr, J. Taghavifar
Abstract:
In this paper, the supply air pressure of an HVAC system is modeled as a second-order transfer function plus dead time. In an HVAC system, the desired input undergoes step changes, and the output of the proposed control system should be able to follow the input reference; therefore, a model-based predictive controller is designed in this paper. The closed-loop control system is implemented in MATLAB and simulation results are provided. The simulation results show that the model-based predictive controller is able to control the plant properly.
Keywords: air conditioning system, GPC, dead time, air supply control
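The receding-horizon idea described in the abstract can be sketched outside MATLAB as well. The snippet below is a minimal illustration, not the authors' implementation: it discretizes an assumed second-order-plus-dead-time plant (K = 1, τ1 = 10 s, τ2 = 5 s, L = 3 s are made-up parameters) and closes the loop with a single-move dynamic-matrix controller, a simple member of the model-predictive (GPC) family.

```python
import numpy as np

# Illustrative SOPDT plant G(s) = K e^{-Ls} / ((tau1 s + 1)(tau2 s + 1));
# the parameters below are assumptions, not values from the paper.
K, TAU1, TAU2 = 1.0, 10.0, 5.0
DT, DELAY = 1.0, 3  # sample time [s] and dead time [samples]

def step_model(state, u):
    """Advance the Euler-discretized plant one sample; returns (state, y)."""
    x1, x2, buf = state
    buf = buf + [u]
    ud, buf = buf[0], buf[1:]             # dead-time buffer
    x1 = x1 + DT / TAU1 * (K * ud - x1)   # first lag
    x2 = x2 + DT / TAU2 * (x1 - x2)       # second lag
    return (x1, x2, buf), x2

def init_state():
    return (0.0, 0.0, [0.0] * DELAY)

# Step-response coefficients of the model, used for prediction.
state, s = init_state(), []
for _ in range(60):
    state, y = step_model(state, 1.0)
    s.append(y)
s = np.array(s)

# Receding-horizon loop: at each sample, pick one input move du minimizing
# sum (r - y_pred)^2 + LAM * du^2 over a horizon of P samples.
P, LAM, R = 15, 1.0, 1.0
plant, model = init_state(), init_state()
u, y, ym = 0.0, 0.0, 0.0
for _ in range(600):
    free, st = [], model
    for _ in range(P):                  # free response: input held constant
        st, yf = step_model(st, u)
        free.append(yf)
    free = np.array(free) + (y - ym)    # feedback correction for mismatch
    du = s[:P] @ (R - free) / (s[:P] @ s[:P] + LAM)
    u += du
    plant, y = step_model(plant, u)     # "true" plant
    model, ym = step_model(model, u)    # internal model
print(round(y, 3))                      # output settles near the step reference
```

Because the dead time is built into the prediction model, the controller waits out the delay instead of fighting it, which is the usual argument for predictive control on SOPDT plants.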
Procedia PDF Downloads 527
26217 The Effects of Nanoemulsions Based on Commercial Oils: Sunflower, Canola, Corn, Olive, Soybean, and Hazelnut Oils for the Quality of Farmed Sea Bass at 2±2°C
Authors: Yesim Ozogul, Mustafa Durmuş, Fatih Ozogul, Esmeray Kuley Boğa, Yılmaz Uçar, Hatice Yazgan
Abstract:
The effects of oil-in-water nanoemulsions on the sensory, chemical (total volatile basic nitrogen (TVB-N), thiobarbituric acid (TBA), peroxide value (PV), and free fatty acids (FFA)), and microbiological (total viable count (TVC), total psychrophilic bacteria, and total Enterobacteriaceae) qualities of sea bass fillets stored at 2 ± 2°C were investigated. Physical properties of the emulsions (viscosity, droplet particle size, thermodynamic stability, refractive index, and surface tension) were determined. The results showed that the use of nanoemulsions extended the shelf life of the fish by 2 days compared with the control. Treatment with nanoemulsions significantly (p<0.05) decreased the values of the biochemical parameters during the storage period. Bacterial growth was inhibited by the use of nanoemulsions. Based on the results, it can be concluded that nanoemulsions based on commercial oils extended the shelf life and improved the quality of sea bass fillets during the storage period.
Keywords: lipid oxidation, nanoemulsion, sea bass, quality parameters
Procedia PDF Downloads 479
26216 Multifunctional Bismuth-Based Nanoparticles as Theranostic Agent for Imaging and Radiation Therapy
Authors: Azimeh Rajaee, Lingyun Zhao, Shi Wang, Yaqiang Liu
Abstract:
In recent years, many studies have focused on bismuth-based nanoparticles as radiosensitizers and contrast agents in radiation therapy and imaging, due to their high atomic number (Z = 83), high photoelectric absorption, low cost, and low toxicity. This study introduces a new multifunctional bismuth-based nanoparticle as a theranostic agent for radiotherapy, computed tomography (CT), and magnetic resonance imaging (MRI). We synthesized bismuth ferrite (BFO, BiFeO3) nanoparticles by the sol-gel method, and the surface of the nanoparticles was modified with polyethylene glycol (PEG). After the biocompatibility of the nanoparticles was confirmed, their ability as a contrast agent in CT and MRI was investigated. The relaxation rate (R2) in MRI and the Hounsfield unit (HU) in CT imaging increased with the concentration of the nanoparticles. Moreover, the effect of the nanoparticles on dose enhancement at low energy was investigated by clonogenic assay. According to the clonogenic assay, sensitizer enhancement ratios (SERs) of 1.35 and 1.76 were obtained for nanoparticle concentrations of 0.05 mg/ml and 0.1 mg/ml, respectively. In conclusion, our experimental results demonstrate that these multifunctional nanoparticles can be employed for multimodal imaging and therapy to enhance theranostic efficacy.
Keywords: molecular imaging, nanomedicine, radiotherapy, theranostics
Procedia PDF Downloads 317
26215 An Architecture Based on Capsule Networks for the Identification of Handwritten Signature Forgery
Authors: Luisa Mesquita Oliveira Ribeiro, Alexei Manso Correa Machado
Abstract:
A handwritten signature is a unique way of identifying an individual, used to authenticate documents and to carry out investigations in the criminal, legal, and banking areas, among other applications. Signature verification relies on large amounts of biometric data, which are simple and easy to acquire, among other characteristics. Given this scenario, signature forgery is a recurring problem worldwide, and fast, precise techniques are needed to prevent crimes of this nature. This article studies the efficiency of the Capsule Network in analyzing and recognizing signatures. The chosen architecture achieved accuracies of 98.11% and 80.15% on the CEDAR and GPDS databases, respectively.
Keywords: biometrics, deep learning, handwriting, signature forgery
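Capsule Networks are typically trained with the margin loss of Sabour et al.; the abstract does not say whether this exact loss was used, so the snippet below is only a generic sketch of how capsule output lengths are turned into a classification loss (the constants m+ = 0.9, m- = 0.1, λ = 0.5 are the commonly used defaults, assumed here).

```python
import numpy as np

def margin_loss(v, targets, m_pos=0.9, m_neg=0.1, lam=0.5):
    """Capsule-network margin loss.

    v:       (batch, classes, dim) output capsule vectors
    targets: (batch,) integer class labels
    The length ||v_k|| of each class capsule encodes the probability
    that class k is present (here: genuine vs. forged signature).
    """
    lengths = np.linalg.norm(v, axis=-1)                  # (batch, classes)
    t = np.eye(v.shape[1])[targets]                       # one-hot labels
    present = t * np.maximum(0.0, m_pos - lengths) ** 2   # short correct capsule is penalized
    absent = lam * (1 - t) * np.maximum(0.0, lengths - m_neg) ** 2
    return (present + absent).sum(axis=-1).mean()

# Toy example: 2 classes (genuine / forged), 1 sample, 4-dim capsules.
v = np.zeros((1, 2, 4))
v[0, 0] = [0.9, 0.0, 0.0, 0.0]   # "genuine" capsule, length 0.9
v[0, 1] = [0.1, 0.0, 0.0, 0.0]   # "forged" capsule, length 0.1
loss = margin_loss(v, np.array([0]))
print(loss)   # 0.0: correct capsule long enough, wrong one short enough
```

Per-class losses like this are what let a capsule verifier output confident "genuine" and "forged" lengths simultaneously during training.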
Procedia PDF Downloads 83
26214 Displacement Based Design of a Dual Structural System
Authors: Romel Cordova Shedan
Abstract:
Traditional seismic design follows the Force-Based Design (FBD) methodology. Displacement-Based Design (DBD) is a seismic design approach that accounts for structural damage in order to achieve a controlled failure mechanism before collapse. It is easier to quantify structural damage with displacements than with forces; therefore, for a structure to reach its inelastic design displacement with good ductility, it must sustain damage. The first part of this investigation discusses the differences between the DBD and FBD methodologies and some advantages of DBD. The second part presents a case study of a 5-story dual-system building, regular in plan and elevation, located in a seismic zone where the design acceleration on firm soil is 45% of the acceleration of gravity. Both methodologies are applied to the case study to compare the resulting displacements, shear forces, and overturning moments. In the third part, a Dynamic Time History Analysis (DTHA) is performed to compare the displacements obtained with the DBD and FBD methodologies. Three accelerograms were used, with their accelerations scaled to be compatible with the design spectrum; plastic hinges were then assigned to the structure following the ASCE 41-13 guidelines. Finally, the results of both methodologies for the case study are compared. It is important to note that the seismic performance level of the building is higher for DBD than for FBD: the DBD drifts are on the order of 2.0% to 2.5%, compared with FBD drifts of 0.7%, so the DBD displacements are greater than those of the FBD method. The DBD shear forces are also greater than those of the FBD methodology. These DBD strengths ensure that the structure achieves its inelastic design displacements, because they were obtained through a displacement spectrum reduction factor that depends on the damping and ductility of the dual system. The displacements of the case study obtained with DBD are also greater than those of FBD and DTHA, which confirms that the seismic performance level of the building is higher for DBD than for FBD.
Keywords: displacement-based design, displacement spectrum reduction factor, dynamic time history analysis, force-based design
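The displacement spectrum reduction factor mentioned above can be illustrated with the damping modifier commonly used in direct displacement-based design, R = sqrt(7 / (2 + ξ)) with ξ the equivalent viscous damping in percent (Priestley's form); the ductility-to-damping relation below for concrete frames is likewise a textbook expression, not a value taken from this case study.

```python
import math

def equivalent_damping(mu):
    """Equivalent viscous damping [%] for concrete frames (Priestley-type rule):
    xi = 5 + 56.5 * (mu - 1) / (mu * pi), with mu the displacement ductility."""
    return 5.0 + 56.5 * (mu - 1.0) / (mu * math.pi)

def reduction_factor(xi):
    """Displacement spectrum reduction factor R = sqrt(7 / (2 + xi)),
    applied to the 5%-damped design displacement spectrum."""
    return math.sqrt(7.0 / (2.0 + xi))

mu = 4.0                                 # assumed design ductility, for illustration
xi = equivalent_damping(mu)
print(round(xi, 1))                      # ~18.5 % equivalent viscous damping
print(reduction_factor(5.0))             # 1.0 at 5% damping (no reduction)
print(round(reduction_factor(xi), 2))    # ~0.58: damped spectrum is ~58% of the 5% one
```

The larger ξ implied by the system's ductility shrinks the design displacement spectrum, which is how the DBD strengths in the abstract come to depend on damping and ductility.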
Procedia PDF Downloads 229
26213 Evaluating the Location of Effective Product Advertising on Facebook Ads
Authors: Aulia F. Hadining, Atya Nur Aisha, Dimas Kurninatoro Aji
Abstract:
The use of social media as a marketing tool is growing rapidly, including among SMEs. Social media allow users to share product evaluations and recommendations with the public and facilitate word-of-mouth marketing communication. One such platform is Facebook, with Facebook Ads. This study evaluated the placement location of Facebook Ads in order to obtain an appropriate advertising design. Three alternative locations were considered: desktop, right-hand column, and mobile. Advertising effectiveness and efficiency were measured with metrics such as reach, clicks, Cost per Click (CPC), and Unique Click-Through Rate (UCTR). Facebook's Ads Manager was used for seven days, with targeting by age (18-24), location (Bandung), language (Indonesian), and keywords. The result was a total reach of 13,999 and 342 clicks. A comparison using ANOVA showed a significant difference between the placement locations on these advertising metrics. The mobile placement proved the most successful, producing the lowest CPC, at Rp 691 per click, and a 14% UCTR. The results of this study show that Facebook Ads is a useful and cost-effective medium for promoting SME products, because ads can be viewed by many people at the same time.
Keywords: marketing communication, social media, Facebook Ads, mobile location
Procedia PDF Downloads 354
26212 A Block World Problem Based Sudoku Solver
Authors: Luciana Abednego, Cecilia Nugraheni
Abstract:
Many approaches have been proposed for solving Sudoku puzzles. One of them is to model the puzzles as block world problems. Three models of Sudoku solvers based on this approach have been proposed, each expressing the solver as a parameterized multi-agent system. In this work, we propose a new model that improves on the existing ones. This paper presents the development of a Sudoku solver that implements all the proposed models, and some experiments were conducted to determine the performance of each model.
Keywords: Sudoku puzzle, Sudoku solver, block world problem, parameterized multi agent systems
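The block-world and multi-agent models themselves are not reproduced in the abstract; for contrast, the baseline any such model is measured against, a plain depth-first backtracking solver, fits in a few lines:

```python
def valid(grid, r, c, v):
    """Check the row, column, and 3x3 box constraints for placing v."""
    if v in grid[r]:
        return False
    if any(grid[i][c] == v for i in range(9)):
        return False
    br, bc = 3 * (r // 3), 3 * (c // 3)
    return all(grid[br + i][bc + j] != v for i in range(3) for j in range(3))

def solve(grid):
    """Solve a 9x9 Sudoku in place by depth-first backtracking.
    grid: list of 9 lists of 9 ints, 0 for an empty cell."""
    for r in range(9):
        for c in range(9):
            if grid[r][c] == 0:
                for v in range(1, 10):
                    if valid(grid, r, c, v):
                        grid[r][c] = v
                        if solve(grid):
                            return True
                        grid[r][c] = 0   # undo and try the next value
                return False             # dead end: backtrack
    return True                          # no empty cell left

puzzle = [[5,3,0,0,7,0,0,0,0],
          [6,0,0,1,9,5,0,0,0],
          [0,9,8,0,0,0,0,6,0],
          [8,0,0,0,6,0,0,0,3],
          [4,0,0,8,0,3,0,0,1],
          [7,0,0,0,2,0,0,0,6],
          [0,6,0,0,0,0,2,8,0],
          [0,0,0,4,1,9,0,0,5],
          [0,0,0,0,8,0,0,7,9]]
assert solve(puzzle)
print(puzzle[0])   # first row of the completed grid
```

A multi-agent block-world model replaces this single recursive search with cooperating agents, but both must satisfy the same row, column, and box constraints checked in `valid`.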
Procedia PDF Downloads 341
26211 Blind Data Hiding Technique Using Interpolation of Subsampled Images
Authors: Singara Singh Kasana, Pankaj Garg
Abstract:
In this paper, a blind data hiding technique based on interpolation of subsampled versions of a cover image is proposed. A subsampled image is taken as a reference image, and an interpolated image is generated from it. The difference between the original cover image and the interpolated image is then used to embed secret data. Comparisons with existing interpolation-based techniques show that the proposed technique provides higher embedding capacity and better visual quality of the marked images. Moreover, the performance of the proposed technique is more stable across different images.
Keywords: interpolation, image subsampling, PSNR, SIM
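A minimal sketch of the general interpolation-based embedding idea (not the authors' exact scheme): keep the subsampled reference pixels untouched, interpolate the remaining pixels from them, and hide bits in the interpolated positions, so the receiver can redo the same interpolation and recover the bits blindly.

```python
import numpy as np

def embed(cover, bits):
    """Hide one bit in each interpolated (odd-column) pixel of each row.
    Even columns are the subsampled reference image and stay untouched."""
    stego = cover.copy()
    k = 0
    for r in range(cover.shape[0]):
        for c in range(1, cover.shape[1] - 1, 2):
            # interpolate from the two neighboring reference pixels
            interp = (int(cover[r, c - 1]) + int(cover[r, c + 1])) // 2
            stego[r, c] = interp + bits[k]   # +0 or +1 encodes the bit
            k += 1
    return stego, k

def extract(stego, n):
    """Blind extraction: re-interpolate from the untouched reference pixels."""
    bits, k = [], 0
    for r in range(stego.shape[0]):
        for c in range(1, stego.shape[1] - 1, 2):
            if k == n:
                return bits
            interp = (int(stego[r, c - 1]) + int(stego[r, c + 1])) // 2
            bits.append(int(stego[r, c]) - interp)
            k += 1
    return bits

rng = np.random.default_rng(0)
cover = rng.integers(0, 255, size=(8, 8), dtype=np.uint8)
payload = [int(b) for b in rng.integers(0, 2, size=24)]  # 3 slots/row x 8 rows
stego, used = embed(cover, payload)
recovered = extract(stego, used)
print(recovered == payload)   # True: blind, exact recovery
```

Published schemes of this family extend the idea with multi-bit embedding whose capacity depends on the local difference magnitude, which is where the higher-capacity claims come from.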
Procedia PDF Downloads 578
26210 A Phenomenological Expression for Self-Attractive Energy of Singlelayer Graphene Sheets
Authors: Bingjie Wu, C. Q. Ru
Abstract:
The present work studies several reasonably expected candidate integral forms for the self-attractive potential energy of a free monolayer graphene sheet. The admissibility of one specific integral form for ripple formation is verified, while all other candidate integral forms are rejected based on the non-existence of stable periodic ripples. Based on the selected integral form of the self-attractive potential energy, some mechanical behaviors of a free monolayer graphene sheet, including ripple formation and buckling, are discussed in detail.
Keywords: graphene, monolayer, ripples, van der Waals energy
Procedia PDF Downloads 392
26209 A Survey of Field Programmable Gate Array-Based Convolutional Neural Network Accelerators
Authors: Wei Zhang
Abstract:
With the rapid development of deep learning, neural networks and deep learning algorithms play a significant role in various practical applications. Due to their high accuracy and good performance, Convolutional Neural Networks (CNNs) in particular have become a research hotspot in the past few years. However, network sizes are becoming increasingly large due to the demands of practical applications, which poses a significant challenge for constructing high-performance implementations of deep learning neural networks. Meanwhile, many application scenarios also place strict requirements on the performance and power consumption of hardware devices. It is therefore critical to choose a suitable computing platform for hardware acceleration of CNNs. This article surveys recent advances in Field Programmable Gate Array (FPGA)-based acceleration of CNNs. Various designs and implementations of FPGA-based accelerators across different devices and network models are reviewed, and they are compared with Graphics Processing Unit (GPU), Application Specific Integrated Circuit (ASIC), and Digital Signal Processor (DSP) counterparts to present our own critical analysis and comments. Finally, we discuss these acceleration and optimization methods on FPGA platforms from different perspectives, to further explore the opportunities and challenges for future research, and give a prospect for the future development of FPGA-based accelerators.
Keywords: deep learning, field programmable gate array, FPGA, hardware accelerator, convolutional neural networks, CNN
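The computational core that all of the surveyed accelerators optimize is the convolution loop nest; the plain, unoptimized form below makes explicit the loops that FPGA designs then tile, unroll, and pipeline. It is a generic sketch, not code from any surveyed accelerator.

```python
import numpy as np

def conv2d(x, w):
    """Direct convolution: x is (C_in, H, W), w is (C_out, C_in, K, K).
    The six nested loops are exactly what FPGA accelerators tile
    (for on-chip buffering), unroll (for parallel MACs), and pipeline."""
    c_out, c_in, k, _ = w.shape
    _, h, wd = x.shape
    y = np.zeros((c_out, h - k + 1, wd - k + 1))
    for co in range(c_out):                  # output channels
        for i in range(h - k + 1):           # output rows
            for j in range(wd - k + 1):      # output columns
                acc = 0.0
                for ci in range(c_in):       # input channels
                    for ki in range(k):      # kernel rows
                        for kj in range(k):  # kernel columns
                            acc += x[ci, i + ki, j + kj] * w[co, ci, ki, kj]
                y[co, i, j] = acc
    return y

x = np.arange(9, dtype=float).reshape(1, 3, 3)   # one 3x3 input channel
w = np.ones((1, 1, 2, 2))                        # one 2x2 summing kernel
print(conv2d(x, w))   # each output is the sum of one 2x2 window
```

Which of the six loops is unrolled in hardware, and in what order they are tiled, is the main design axis along which the surveyed FPGA accelerators differ.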
Procedia PDF Downloads 128
26208 Laboratory Scale Production of Bio-Based Chemicals from Industrial Waste Feedstock in South Africa
Authors: P. Mandree, S. O. Ramchuran, F. O'Brien, L. Sethunya, S. Khumalo
Abstract:
South Africa is identified as one of the five emerging waste management markets globally. The waste sector in South Africa influences the areas of energy, water, and food at the economic and social level. Recently, South African industries have focused on waste valorization and on diversifying their current product offerings in an attempt to reduce industrial waste, target a zero waste-to-landfill initiative, and recover energy. South Africa has a number of waste streams, including industrial and agricultural biomass, municipal waste, and marine waste. Large volumes of agricultural and forestry residues, in particular, are generated, which provides a significant opportunity for the production of bio-based fuels and chemicals and could directly impact the development of a rural economy. One of the largest agricultural industries is the sugar industry, which contributes significantly to the country's economy and job creation. However, the sugar industry is facing challenges due to fluctuations in sugar prices, increasing competition from low-cost global sugar producers, increasing energy and agricultural input costs, lower consumption, and aging facilities. This study is aimed at technology development for the production of various bio-based chemicals using feedstock from the sugar refining process. Various indigenous bacteria and yeast species were assessed for their potential to produce platform chemicals in flask studies and at 30 L fermentation scale. Quantitative analysis of the targeted bio-based chemicals was performed using either gas chromatography or high-pressure liquid chromatography to assess production yields and techno-economics, in order to compare performance with current commercial benchmark processes. The study also creates a decision platform for the research direction required for strain development using industrial synthetic biology.
Keywords: bio-based chemicals, biorefinery, industrial synthetic biology, waste valorization
Procedia PDF Downloads 119
26207 Web and Android-Based Applications as a Breakthrough in Preventing Non-System Fault Disturbances Due to Work Errors in the Transmission Unit
Authors: Dhany Irvandy, Ary Gemayel, Mohammad Azhar, Leidenti Dwijayanti, Iif Hafifah
Abstract:
Work safety is among the most important aspects of work execution. Unsafe conditions and unsafe actions are the priorities in accident prevention in the world of work, especially in the operation and maintenance of electric power transmission. Given its scope, operational work in transmission carries a very high safety risk. Various efforts have been made to avoid work accidents; however, accidents and disturbances caused by non-conformities in work execution still occur frequently, and these can be caused by unsafe conditions or actions. With the development of technology, website-based and mobile applications have become widely used as a medium for monitoring work in real time and by more people. This paper explains the use of web and Android-based applications to monitor work and work processes in the field, in order to prevent work accidents and non-system fault disturbances caused by non-conformity of work execution with predetermined work instructions. Because every job is monitored in real time, time-stamped, and documented systematically, this application can reduce possible unsafe actions by job executors that can cause disruptions or work accidents.
Keywords: work safety, unsafe action, application, non-system fault, real-time
Procedia PDF Downloads 44
26206 Survey of Hawke's Bay Tourism Based Businesses: Tsunami Understanding and Preparation
Authors: V. A. Ritchie
Abstract:
The loss of life and livelihood experienced after the magnitude 9.3 Sumatra earthquake and tsunami on 26 December 2004 and the magnitude 9 earthquake and tsunami in northeastern Japan on 11 March 2011 has raised global awareness and brought tsunami phenomenology, nomenclature, and representation into sharp focus. At the same time, travel and tourism continue to grow, contributing around 1 in 11 jobs worldwide. This growth is especially pronounced in coastal zones, placing pressure on decision-makers to downplay tsunami risks while at the same time providing adequate tsunami warning so that holidaymakers will feel confident enough to visit places of high tsunami risk. This study investigates how well tsunami preparedness messages are getting through to tourism-based businesses in Hawke's Bay, New Zealand, a region of frequent seismic activity and a high probability of experiencing a nearshore tsunami. The aim of this study is to investigate whether tourism-based businesses are well informed about tsunamis, how well they understand that information, and to what extent their clients are included in awareness-raising and evacuation processes. In high-risk tsunami zones such as Hawke's Bay, tourism-based businesses face competitive tension between short-term profitability and longer-term reputational issues related to preventable loss of life from natural hazards such as tsunamis. This study will address ways to accommodate culturally and linguistically relevant tourist awareness measures without discouraging tourists or being too costly to implement.
Keywords: tsunami risk and response, travel and tourism, business preparedness, cross cultural knowledge transfer
Procedia PDF Downloads 152
26205 Multiaxial Fatigue Analysis of a High Performance Nickel-Based Superalloy
Authors: P. Selva, B. Lorraina, J. Alexis, A. Seror, A. Longuet, C. Mary, F. Denard
Abstract:
Over the past four decades, the fatigue behavior of nickel-based alloys has been widely studied. In recent years, however, significant advances in the fabrication process leading to grain size reduction have been made in order to improve the fatigue properties of aircraft turbine discs. Indeed, a change in particle size affects the initiation mode of fatigue cracks as well as the fatigue life of the material. The present study investigates the fatigue behavior of a newly developed nickel-based superalloy under biaxial-planar loading. Low Cycle Fatigue (LCF) tests are performed at different stress ratios to study the influence of the multiaxial stress state on the fatigue life of the material. Full-field displacement and strain measurements as well as crack initiation detection are obtained using Digital Image Correlation (DIC) techniques. The aim of this presentation is, first, to provide an in-depth description of both the experimental set-up and protocol: the multiaxial testing machine, the specific design of the cruciform specimen, and the performance of the DIC code are introduced. Second, results for sixteen specimens at different load ratios are presented: crack detection, strain amplitude, and number of cycles to crack initiation versus triaxial stress ratio are given for each loading case. Third, fractographic investigations by scanning electron microscopy show that the mechanism of fatigue crack initiation does not depend on the triaxial stress ratio and that most fatigue cracks initiate from subsurface carbides.
Keywords: cruciform specimen, multiaxial fatigue, nickel-based superalloy
Procedia PDF Downloads 295
26204 Grid Pattern Recognition and Suppression in Computed Radiographic Images
Authors: Igor Belykh
Abstract:
Anti-scatter grids used in radiographic imaging for contrast enhancement leave specific artifacts. These artifacts may be directly visible, or they may cause a Moiré effect when a digital image is resized on a diagnostic monitor. In this paper, we propose an automated algorithm for grid artifact detection and suppression, which remains an open problem. Grid artifact detection is based on a statistical approach in the spatial domain. Grid artifact suppression is based on the design and application of a Kaiser bandstop filter transfer function that avoids ringing artifacts. Experimental results are discussed, and the advantages over existing approaches are described.
Keywords: grid, computed radiography, pattern recognition, image processing, filtering
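The suppression step can be sketched in one dimension with a Kaiser-windowed FIR band-stop filter, designed here with plain NumPy; the grid frequency of 0.25 cycles/pixel, the band edges, the tap count, and β are illustrative values, not the paper's.

```python
import numpy as np

def kaiser_bandstop(numtaps, f1, f2, beta):
    """FIR band-stop filter: a delta minus an ideal band-pass (f1..f2,
    in cycles/sample), shaped by a Kaiser window to limit ringing."""
    n = np.arange(numtaps) - (numtaps - 1) / 2
    bandpass = 2 * f2 * np.sinc(2 * f2 * n) - 2 * f1 * np.sinc(2 * f1 * n)
    h = -bandpass * np.kaiser(numtaps, beta)
    h[(numtaps - 1) // 2] += 1.0          # add the delta -> band-stop
    return h

def amplitude(x, f):
    """Amplitude of the sinusoidal component at frequency f (cycles/sample)."""
    k = np.arange(len(x))
    return 2 * abs(np.dot(x, np.exp(-2j * np.pi * f * k))) / len(x)

k = np.arange(1000)
anatomy = np.sin(2 * np.pi * 0.01 * k)       # slowly varying image content
grid = 0.7 * np.sin(2 * np.pi * 0.25 * k)    # periodic grid artifact
profile = anatomy + grid

h = kaiser_bandstop(numtaps=101, f1=0.20, f2=0.30, beta=6.0)
clean = np.convolve(profile, h, mode="same")

mid = slice(200, 800)                         # avoid filter edge effects
print(amplitude(profile[mid], 0.25))          # ~0.7 before filtering
print(amplitude(clean[mid], 0.25))            # grid line strongly attenuated
```

The Kaiser window trades stop-band depth against transition width through β, which is why it is a natural choice when ringing around the grid frequency must be controlled.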
Procedia PDF Downloads 283
26203 A Long Range Wide Area Network-Based Smart Pest Monitoring System
Authors: Yun-Chung Yu, Yan-Wen Wang, Min-Sheng Liao, Joe-Air Jiang, Yuen-Chung Lee
Abstract:
This paper proposes using a Long Range Wide Area Network (LoRaWAN) for a smart pest monitoring system targeting the oriental fruit fly (Bactrocera dorsalis), in order to improve the communication efficiency of the system. The oriental fruit fly is one of the main pests in Southeast Asia and the Pacific Rim. Different smart pest monitoring systems based on the Internet of Things (IoT) architecture have been developed to replace manual measurement. These systems often use Octopus II, a communication module following the 2.4 GHz IEEE 802.15.4 ZigBee specification, for the sensor nodes. Octopus II is commonly used for low-power, short-distance communication; however, its energy consumption increases as the logical topology becomes more complicated in order to provide enough coverage over a large area. By comparison, LoRaWAN follows the Low Power Wide Area Network (LPWAN) specification, which targets the key requirements of IoT technology, such as secure bi-directional communication, mobility, and localization services. A LoRaWAN network has the advantages of long-range communication, high stability, and low energy consumption, and the 433 MHz LoRaWAN model has two advantages over the 2.4 GHz ZigBee model: greater diffraction and less interference. In this paper, the Octopus II module is replaced by a LoRa module to increase the coverage of the monitoring system, improve communication performance, and prolong the network lifetime. The performance of the LoRa-based system is compared with a ZigBee-based system using three indexes: packet reception rate, delay time, and energy consumption, with experiments conducted in different settings (e.g., distances and environmental conditions). In the distance experiment, a pest monitoring system using the two communication specifications is deployed in an area with various obstacles, such as buildings and living creatures, and the performance of the two specifications is examined. The experimental results show that the packet reception rate of the LoRa-based system is 96%, which is much higher than that of the ZigBee system when the distance between any two modules is about 500 m. These results demonstrate the capability of a LoRaWAN-based monitoring system for long-range transmission and confirm the stability of the system.
Keywords: LoRaWAN, oriental fruit fly, IoT, Octopus II
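The range-versus-airtime trade-off behind these results can be made concrete with LoRa's time-on-air formula from Semtech's LoRa modem design guide; the payload size and radio settings below are illustrative, not the values used in this deployment.

```python
import math

def lora_time_on_air(payload_bytes, sf, bw=125e3, cr=1, preamble=8,
                     header=True, crc=True, low_dr_opt=False):
    """LoRa packet time-on-air [s] per Semtech's modem design guide.
    sf: spreading factor 7..12, bw: bandwidth [Hz], cr: coding rate 4/(4+cr)."""
    t_sym = (2 ** sf) / bw                       # symbol duration
    de = 1 if low_dr_opt else 0                  # low data-rate optimization
    ih = 0 if header else 1                      # implicit header flag
    num = 8 * payload_bytes - 4 * sf + 28 + 16 * (1 if crc else 0) - 20 * ih
    n_payload = 8 + max(math.ceil(num / (4 * (sf - 2 * de))) * (cr + 4), 0)
    return (preamble + 4.25 + n_payload) * t_sym

# A 10-byte sensor report (e.g., a trap count) at two spreading factors:
fast = lora_time_on_air(10, sf=7)
slow = lora_time_on_air(10, sf=12, low_dr_opt=True)
print(round(fast * 1e3, 1), "ms at SF7")    # tens of milliseconds
print(round(slow * 1e3, 1), "ms at SF12")   # near a second: longer range, more airtime
```

Higher spreading factors buy the link budget that makes 500 m through obstacles feasible, at the cost of airtime and hence node energy, which is exactly the budget the paper's energy-consumption index captures.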
Procedia PDF Downloads 352
26202 Water Quality Management Based on Hydrodynamic Approach, Landuse, and Human Intervention in Wulan Delta Central Java Indonesia: Problems Identification and Review
Authors: Lintang Nur Fadlillah, Muh Aris Marfai, M. Widyastuti
Abstract:
A delta is a dynamic area influenced by both the river and the sea. An increasing human population in the coastal area and the needs of daily life exert pressure on a delta that provides various resources. The Wulan Delta is one of the active deltas in Central Java, Indonesia, and it has experienced multiple pressures from natural and human factors. In order to provide a scientific solution and to analyze the main driving forces in the river delta, we collected evidence from news reports, papers, and publications related to the Wulan Delta. This paper presents a review and problem identification for the Wulan Delta based on a hydrodynamic approach, land use, and the human activities that influence water quality in the delta. A comprehensive overview is needed to support the best policies for local communities and government. The analysis is based on the driving forces that affect the delta estuary and river mouth. The natural factors, in particular the hydrodynamics, are governed by tides, waves, runoff, and sediment transport, and the hydrodynamics affect the mixing process in the river estuaries. The main problem is human intervention on land: land use change leads to several problems, such as decreasing water quality. Almost 90% of the delta has been transformed into fish ponds by local communities, yet no water management has been applied to treat wastewater before it is flushed into the sea and estuary. To understand the environmental condition, the water quality of the river delta must be assessed. The assessment is based on land use as a non-point source of pollution. In the Wulan Delta there are no industries; land use consists of fish ponds, settlements, and agriculture. The samples must represent the land use in order to estimate which land use most influences pollution in the river delta. Hydrodynamic conditions such as high tides and runoff must also be considered, because they affect the mixing process and water quality as well. To determine the sampling sites, the local community needs to be involved, in order to give them insight into the process. Finally, based on this review and problem identification, recommendations and strategies for water management are formulated.
Keywords: delta, land use, water quality, management, hydrodynamics
Procedia PDF Downloads 250
26201 Developing a Customizable Serious Game and Its Applicability in the Classroom
Authors: Anita Kéri
Abstract:
Recent developments in the field of education have led to renewed interest in teaching methodologies and practices. Gamification is fast becoming a key instrument in the education of new generations, and among other methods, serious games have become the center of attention. Ready-built serious games are available for most higher education institutions to buy and implement. However, monetary constraints and the unalterable nature of these games may deter many institutions from applying them. There is therefore a continuously growing need for a customizable serious game developed on the basis of a concrete needs analysis and expert opinion. There has been little evidence so far of serious games created on the basis of a relevant and current needs analysis involving higher education teachers, professional practitioners, and students themselves. The aim of this paper is therefore to analyze the needs of higher education educators, with special emphasis on the applicability of serious games in their classrooms, and to explore options for the development of a customizable serious game framework. The paper analyzes workshop discussions on implementing serious games in education and proposes a customizable serious game framework applicable to the education of the new generation. The results show that the most important feature of a serious game is its customizability: the fact that practitioners are able to manage different scenarios and upload their own content to a game seems to be key to the increasingly widespread application of serious games in the classroom.
Keywords: education, gamification, game-based learning, serious games
Procedia PDF Downloads 158
26200 Collaboration-Based Islamic Financial Services: Case Study of Islamic Fintech in Indonesia
Authors: Erika Takidah, Salina Kassim
Abstract:
Digital transformation has accelerated in the new millennium, reshaping the financial services industry from a traditional system to financial technology. Moreover, the financial inclusion rate in Indonesia is less than 60%, and an innovative model is needed to address this national problem. On the other hand, the Islamic financial services industry and financial technology are growing fast as new drivers of economic development. Islamic banks, takaful, Islamic microfinance, Islamic financial technology, and Islamic social finance institutions could collaborate to raise the financial inclusion rate in Indonesia. The primary motive of this paper is to examine a strategy for collaboration-based Islamic financial services to enhance financial inclusion in Indonesia, particularly in the digital era. The fundamental findings concern the foundations and the key ecosystem actors involved in the development of collaboration-based Islamic financial services. Using the Interpretive Structural Model (ISM) approach, the core problems identified in the development of the model are the lack of policy instruments governing the collaboration of Islamic financial services with fintech work processes and the limited availability of human resources for fintech. The core strategy, or foundation, needed in the framework of collaboration-based Islamic financial services is the ability to manage and analyze data in the big data era. Regarding the ecosystem, or the actors involved in the development of this model, the important actors are the government or regulator, educational institutions, and the existing industries (Islamic financial services). The outcome of the study indicates that a collaboration strategy for Islamic financial services institutions requires robust technology, the legal and regulatory commitment of the regulators and policymakers of Islamic financial institutions, and extensive public awareness of financial inclusion in Indonesia. The study limited itself to realizing financial inclusion, particularly in Islamic finance development in Indonesia. The study has implications for the concerned professional bodies, regulators, policymakers, stakeholders, and practitioners of Islamic financial service institutions.
Keywords: collaboration, financial inclusion, Islamic financial services, Islamic fintech
Procedia PDF Downloads 142
26199 Methodologies for Crack Initiation in Welded Joints Applied to Inspection Planning
Authors: Guang Zou, Kian Banisoleiman, Arturo González
Abstract:
Crack initiation and propagation threaten the structural integrity of welded joints, and inspections are normally scheduled based on crack propagation models. However, the approach based on crack propagation models may not be applicable to some high-quality welded joints, because their initial flaws may be so small that it takes a long time for the flaws to develop into a detectable size. This raises a concern regarding the inspection planning of high-quality welded joints, as there is no generally accepted approach for modeling the whole fatigue process, including the crack initiation period. In order to address this issue, this paper reviews treatment methods for the crack initiation period and the initial crack size in crack propagation models applied to inspection planning. Generally, there are four approaches: 1) neglecting the crack initiation period and fitting a probabilistic distribution for the initial crack size based on statistical data; 2) extrapolating the crack propagation stage to a very small fictitious initial crack size, so that the whole fatigue process can be modeled by crack propagation models; 3) assuming a fixed detectable initial crack size and fitting a probabilistic distribution for the crack initiation time based on specimen tests; and 4) modeling the crack initiation and propagation stages separately, using small-crack growth theories together with the Paris law or similar models. The conclusion is that, in view of the trade-off between accuracy and computational effort, calibration of a small fictitious initial crack size to S-N curves is the most efficient approach.
Keywords: crack initiation, fatigue reliability, inspection planning, welded joints
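The propagation side of these models (approach 2 above, with a fictitious initial crack) reduces to integrating the Paris law; the material constants, geometry factor, and stress range below are invented for illustration, not calibrated to any real joint.

```python
import math

# Paris law: da/dN = C * (dK)^m, with dK = Y * dS * sqrt(pi * a).
# Illustrative values (units: MPa, m); not from any real weld detail.
C, m = 1e-12, 3.0      # Paris constants
Y, dS = 1.0, 100.0     # geometry factor and stress range [MPa]
a0, ac = 1e-3, 1e-2    # (fictitious) initial and critical crack sizes [m]

def cycles_numeric(n=20000):
    """Trapezoidal integration of dN = da / (C * dK^m) from a0 to ac."""
    total, prev_a = 0.0, a0
    prev_f = 1.0 / (C * (Y * dS * math.sqrt(math.pi * prev_a)) ** m)
    for i in range(1, n + 1):
        a = a0 * (ac / a0) ** (i / n)   # log-spaced crack sizes
        f = 1.0 / (C * (Y * dS * math.sqrt(math.pi * a)) ** m)
        total += 0.5 * (prev_f + f) * (a - prev_a)
        prev_a, prev_f = a, f
    return total

def cycles_closed_form():
    """Analytic fatigue life for m != 2."""
    p = 1.0 - m / 2.0
    return (ac ** p - a0 ** p) / (C * (Y * dS * math.sqrt(math.pi)) ** m * p)

print(round(cycles_closed_form()))   # propagation life in cycles
```

Calibrating the fictitious a0 (approach 2) amounts to choosing a0 so that this propagation life reproduces the S-N life of the detail, which is why only the closed-form integral needs to be inverted.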
Procedia PDF Downloads 35326198 Implementation of Tissue Engineering Technique to Nursing of Unhealed Diabetic Foot Lesion
Authors: Basuki Supartono
Abstract:
Introduction: Diabetic wounds risk limb amputation, and healing remains challenging. Chronic hyperglycemia causes an insufficient inflammatory response and impairs the ability of the cells to regenerate, so a tissue engineering technique is required. Methods: Tissue engineering (TE)-based therapy utilizing mononuclear cells, platelet-rich plasma, and collagen was applied to the damaged tissue. Results: The TE technique yielded acceptable outcomes. The wound healed completely in 2 months, with no adverse effects, no allergic reaction, and no morbidity or mortality. Discussion: TE-based therapy utilizing mononuclear cells, platelet-rich plasma, and collagen is a safe and convenient way to repair damaged tissues. These components stop the chronic inflammatory process and increase the cells' capacity for regeneration and restoration of damaged tissues, allowing the wound to regenerate and heal. Conclusion: TE-based therapy is safe and effectively treats unhealed diabetic lesions.Keywords: diabetic foot lesion, tissue engineering technique, wound healing, stem cells
Procedia PDF Downloads 7926197 Kernel-Based Double Nearest Proportion Feature Extraction for Hyperspectral Image Classification
Authors: Hung-Sheng Lin, Cheng-Hsuan Li
Abstract:
Over the past few years, kernel-based algorithms have been widely used to extend linear feature extraction methods such as principal component analysis (PCA), linear discriminant analysis (LDA), and nonparametric weighted feature extraction (NWFE) to their nonlinear versions: kernel principal component analysis (KPCA), generalized discriminant analysis (GDA), and kernel nonparametric weighted feature extraction (KNWFE), respectively. These nonlinear feature extraction methods can detect the nonlinear directions with the largest nonlinear variance or the largest class separability based on the given kernel function, and they have been applied to improve target detection and image classification for hyperspectral images. Double nearest proportion feature extraction (DNP) can effectively reduce the overlap effect and performs well in hyperspectral image classification. The DNP structure is an extension of the k-nearest neighbor technique: for each sample, there are two corresponding nearest proportions of samples, the self-class nearest proportion and the other-class nearest proportion. The term “nearest proportion” used here considers both local information and more global information, and with these settings the effect of the overlap between the sample distributions can be reduced. Usually, the maximum likelihood estimator and the related unbiased estimator are not ideal estimators in high-dimensional inference problems, particularly in small-sample situations; hence, an improved estimator based on shrinkage estimation (regularization) is proposed. Based on the DNP structure, LDA is included as a special case. In this paper, the kernel method is applied to extend DNP to kernel-based DNP (KDNP). KDNP retains the advantages of DNP and surpasses it in the experimental results.
According to the experiments on real hyperspectral image data sets, the classification performance of KDNP is better than that of PCA, LDA, NWFE, and their kernel versions, KPCA, GDA, and KNWFE.Keywords: feature extraction, kernel method, double nearest proportion feature extraction, kernel double nearest proportion feature extraction
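The kernel extension shared by these methods can be sketched for the simplest case, KPCA: a kernel matrix is computed, centered in feature space, and eigendecomposed. This is a minimal sketch only, with an RBF kernel and parameters chosen for illustration; it does not reproduce the paper's KDNP formulation:

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Pairwise squared distances, mapped through the Gaussian kernel.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kernel_pca(X, n_components=2, gamma=1.0):
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one   # center the kernel in feature space
    vals, vecs = np.linalg.eigh(Kc)              # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:n_components]  # keep the largest components
    alphas = vecs[:, idx] / np.sqrt(np.maximum(vals[idx], 1e-12))
    return Kc @ alphas                           # nonlinear projections of the samples

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 5))
Z = kernel_pca(X, n_components=2, gamma=0.5)
```

Replacing the PCA eigenproblem with the DNP scatter matrices, expressed through the same centered kernel, is what yields KDNP.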
Procedia PDF Downloads 34426196 Indigenous Children Doing Better through Mother Tongue Based Early Childhood Care and Development Center in Chittagong Hill Tracts, Bangladesh
Authors: Meherun Nahar
Abstract:
Background: The Chittagong Hill Tracts (CHT) is one of the most diverse regions in Bangladesh in terms of geography, ethnicity, culture, and traditions, and it is home to thirteen indigenous ethnic peoples. In Bangladesh, many indigenous children aged 6-10 years remain out of school, and the majority of those who do enroll drop out before completing primary school. Different studies indicate that the dropout rate of indigenous children is much higher than the estimated national rate, with children dropping out especially in the early years of primary school. One of the most critical barriers for these children is that they do not understand the national language used in government pre-primary schools, which also slows their school readiness and development. In this situation, indigenous children are excluded from mainstream quality education. To address this issue, Save the Children in Bangladesh and other organizations are implementing a community-based Mother Tongue-Based Multilingual Education (MTBMLE) program in the Chittagong Hill Tracts (CHT) to improve the enrolment rate in Government Primary Schools (GPS), reduce the dropout rate, and improve the quality of education. In this connection, Save the Children conducted comparative research in the CHT on children's readiness in mother tongue-based and non-mother tongue ECCD centers. Objectives of the Study: To assess the school readiness and development of children in mother language-based and non-mother language-based ECCD centers, and to assess community perceptions of mother language-based and non-mother language-based ECCD centers. Methodology: The study used FGD, KII, in-depth interview, and observation techniques, following both qualitative and quantitative research methods. The quantitative part has three components, school readiness, classroom observation, and head teacher interview, and the qualitative part followed the FGD technique.
Findings: The interviews with children under the school readiness component showed that, in general, children from mother language (ML)-based ECCD centers are doing noticeably better in all four areas (knowledge, numeracy, fine motor skill, and communication) than their peers from non-mother language-based centers. ML students appear far more skilled in concepts about print, as most of them could identify the cover and title of the book that was shown to them. They also knew where to begin reading the book and could correctly point to the letter that was read. A big difference was found in letter identification: 89.3% of ML students could identify letters correctly, whereas only 30% of non-mother language students could do the same. The classroom observation data show that ML children were more active and remained more engaged in the classroom than NML students. Also, teachers in ML centers were more engaged in explaining issues relating to general knowledge and in leading children in rhyming and singing, rather than only reading from textbooks. The participants of the FGDs were very enthusiastic about using the mother language as the medium of teaching in pre-schools. They opined that this initiative motivates children to attend school and enables them to continue primary schooling without facing any language barrier.Keywords: Chittagong hill tracts, early childhood care and development (ECCD), indigenous, mother language
Procedia PDF Downloads 11726195 Effect of Cooking Process on the Antioxidant Activity of Different Variants of Tomato-Based Sofrito
Authors: Ana Beltran Sanahuja, A. Valdés García, Saray Lopez De Pablo Gallego, Maria Soledad Prats Moya
Abstract:
Tomato consumption has greatly increased worldwide in the last few years, mostly due to a growing demand for products like sofrito, and regular consumption of tomato-based products has been consistently associated with a reduction in the incidence of chronic degenerative diseases. Sofrito is a homemade tomato sauce typical of the Mediterranean area, which contains tomato, onion, garlic, and olive oil as its main ingredients. There are also variations of sofrito that add other spices, which contribute not only color, flavor, and aroma but also medicinal properties due to their antioxidant power. This protective effect has mainly been attributed to the predominant bioactive compounds present in sofrito, such as lycopene and other carotenoids, as well as more than 40 different polyphenols. Regarding the cooking process, it is known that cooking can modify the properties and the availability of nutrients in sofrito; however, there is not enough information on this issue. For this reason, the aim of the present work is to evaluate the effect of cooking on the antioxidant capacity of different variants of tomato-based sofrito combined with other spices, through the analysis of total phenolic content (TPC) and of antioxidant capacity using the 2,2-diphenyl-1-picrylhydrazyl (DPPH) free radical method. Based on the results obtained, it can be confirmed that the basic sofrito composed of tomato, onion, garlic, and olive oil and the sofrito with 1 g of added rosemary have the highest phenolic content, presenting greater antioxidant power than an industrial sofrito and than the other sofrito variants with added thyme or higher amounts of garlic.
Moreover, it has been observed that the tomato-based sofrito can be cooked for up to 60 minutes, since the cooking process increases the bioavailability of the carotenoids by breaking the cell walls, which weakens the binding forces between the carotenoids and increases the levels of antioxidants present, as confirmed by both the TPC and DPPH methods. It can be concluded that the cooking process of different variants of tomato-based sofrito, including spices, can improve the antioxidant capacity. The synergistic effects of different antioxidants may have a greater protective effect, also increasing the digestibility of proteins. In addition, the antioxidants help to deactivate the free radicals involved in diseases such as atherosclerosis, aging, immune suppression, cancer, and diabetes.Keywords: antioxidants, cooking process, phenols, sofrito
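The DPPH assay result is conventionally reported as percent radical scavenging computed from the control and sample absorbances. A minimal sketch of that arithmetic; the absorbance values below are made-up examples, not data from the study:

```python
def dpph_inhibition(abs_control, abs_sample):
    """Radical scavenging activity (%) = (A_control - A_sample) / A_control * 100."""
    return (abs_control - abs_sample) / abs_control * 100.0

# Hypothetical absorbances at 517 nm: control (DPPH alone) vs. extract-treated sample.
inhibition = dpph_inhibition(abs_control=0.80, abs_sample=0.20)
```

A higher percentage indicates stronger antioxidant (radical-quenching) capacity of the extract.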
Procedia PDF Downloads 14026194 Humeral Head and Scapula Detection in Proton Density Weighted Magnetic Resonance Images Using YOLOv8
Authors: Aysun Sezer
Abstract:
Magnetic Resonance Imaging (MRI) is one of the advanced diagnostic tools for evaluating shoulder pathologies. Proton Density (PD)-weighted MRI sequences are highly effective in detecting edema; however, they are deficient in the anatomical identification of bones due to a trauma-induced decrease in signal-to-noise ratio and blurring of the traumatized cortices. Computer-based diagnostic systems require precise segmentation, identification, and localization of anatomical regions in medical imagery, and deep learning-based object detection algorithms exhibit remarkable proficiency in real-time object identification and localization. In this study, the YOLOv8 model was employed to detect the humeral head and scapular regions in 665 axial PD-weighted MR images. The YOLOv8 configuration achieved overall success rates of 99.60% and 89.90% for detecting the humeral head and scapula, respectively, at an intersection over union (IoU) threshold of 0.5. Our findings indicate significant promise for YOLOv8-based detection of the humerus and scapula regions, particularly in the context of PD-weighted images affected by both noise and intensity inhomogeneity.Keywords: YOLOv8, object detection, humerus, scapula, MRI
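The IoU threshold of 0.5 used to score the detections is the standard overlap measure between a predicted and a ground-truth axis-aligned bounding box; a minimal sketch (the box coordinates below are illustrative, not from the study's data):

```python
def iou(box_a, box_b):
    """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)         # overlap area (0 if disjoint)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

perfect = iou((0, 0, 10, 10), (0, 0, 10, 10))   # identical boxes
partial = iou((0, 0, 10, 10), (5, 0, 15, 10))   # half-shifted box
```

A detection counts as a success when its IoU with the annotated humeral head or scapula box reaches at least 0.5.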
Procedia PDF Downloads 6626193 Maximum Entropy Based Image Segmentation of Human Skin Lesion
Authors: Sheema Shuja Khattak, Gule Saman, Imran Khan, Abdus Salam
Abstract:
Image segmentation plays an important role in medical imaging applications; therefore, accurate methods are needed for the successful segmentation of medical images for the diagnosis and detection of various diseases. In this paper, we have used maximum entropy to achieve image segmentation, with the maximum entropy calculated using the Shannon, Renyi, and Tsallis entropies. The novelty of this work lies in the detection of skin lesions caused by the bite of the sand fly, which transmits the parasite responsible for the disease cutaneous leishmaniasis.Keywords: Shannon entropy, maximum entropy, Renyi entropy, Tsallis entropy
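For the Shannon case, maximum entropy thresholding (Kapur's method) selects the gray level that maximizes the summed entropies of the background and foreground partitions of the image histogram. A sketch over a 16-bin histogram; the bin counts below are invented for illustration:

```python
import math

def max_entropy_threshold(hist):
    """Return the threshold t maximizing H(background) + H(foreground) (Kapur)."""
    total = sum(hist)
    p = [h / total for h in hist]           # normalized histogram
    best_t, best_h = 0, -1.0
    for t in range(1, len(p)):
        w0 = sum(p[:t])                      # background probability mass
        w1 = 1.0 - w0                        # foreground probability mass
        if w0 <= 0 or w1 <= 0:
            continue
        h0 = -sum(pi / w0 * math.log(pi / w0) for pi in p[:t] if pi > 0)
        h1 = -sum(pi / w1 * math.log(pi / w1) for pi in p[t:] if pi > 0)
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, t
    return best_t

# Bimodal toy histogram: dark lesion pixels in bins 0-3, bright skin in bins 12-15.
hist = [10, 10, 10, 10, 0, 0, 0, 0, 0, 0, 0, 0, 10, 10, 10, 10]
t = max_entropy_threshold(hist)
```

The Renyi and Tsallis variants replace the Shannon entropy terms h0 and h1 with their generalized forms, parameterized by an order q.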
Procedia PDF Downloads 46326192 Pseudo Modal Operating Deflection Shape Based Estimation Technique of Mode Shape Using Time History Modal Assurance Criterion
Authors: Doyoung Kim, Hyo Seon Park
Abstract:
Studies of System Identification (SI) based on Structural Health Monitoring (SHM) have been actively conducted for structural safety. Recently, SI techniques have developed rapidly under the output-only paradigm for estimating modal parameters. Representative output-only SI methods, such as Frequency Domain Decomposition (FDD) and Stochastic Subspace Identification (SSI), rely on algorithms based on orthogonal decompositions such as singular value decomposition (SVD), but the SVD leads to a high level of computational complexity in estimating modal parameters. This paper proposes a technique to estimate the mode shape at lower computational cost. The technique obtains pseudo modal Operating Deflection Shapes (ODS) through bandpass filtering and introduces a time history Modal Assurance Criterion (MAC); the mode shape can then be estimated from the pseudo modal ODS and the time history MAC. Analytical simulations of vibration measurement were performed, and the mode shapes and computation times of a representative SI method and the proposed method were compared.Keywords: modal assurance criterion, mode shape, operating deflection shape, system identification
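The MAC between two real mode shape vectors is their normalized squared inner product: 1 for collinear shapes, 0 for orthogonal ones. A minimal sketch of the criterion (the vectors below are arbitrary examples, not the paper's simulated shapes):

```python
def mac(phi_1, phi_2):
    """Modal Assurance Criterion: |phi_1 . phi_2|^2 / ((phi_1 . phi_1)(phi_2 . phi_2))."""
    dot = sum(a * b for a, b in zip(phi_1, phi_2))
    return dot ** 2 / (sum(a * a for a in phi_1) * sum(b * b for b in phi_2))

same_shape = mac([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])   # scaled copy of the same shape
orthogonal = mac([1.0, 0.0], [0.0, 1.0])             # completely unrelated shapes
```

In the proposed scheme, evaluating this scalar over time histories of the bandpass-filtered responses avoids the SVD step used by FDD and SSI.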
Procedia PDF Downloads 41026191 Alternative Key Exchange Algorithm Based on Elliptic Curve Digital Signature Algorithm Certificate and Usage in Applications
Authors: A. Andreasyan, C. Connors
Abstract:
Elliptic Curve Digital Signature Algorithm (ECDSA)-based X.509v3 certificates are becoming more popular due to their short public and private key sizes. Moreover, these certificates can be stored in resource-limited Internet of Things (IoT) devices using less memory, and transmitted in network security protocols, such as Internet Key Exchange (IKE), Transport Layer Security (TLS), and Secure Shell (SSH), using less bandwidth. The proposed method gives a further advantage, in that it increases the performance of the above-mentioned protocols in terms of key exchange by saving one scalar multiplication operation.Keywords: cryptography, elliptic curve digital signature algorithm, key exchange, network security protocol
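Scalar multiplication, the operation whose count the proposed method reduces, can be sketched with a Diffie-Hellman-style exchange over a small textbook curve (y² = x³ + 2x + 2 over F₁₇). This toy curve is deliberately insecure and is an illustrative assumption, unrelated to any curve discussed in the paper:

```python
P_MOD, A = 17, 2          # small prime field and curve coefficient a
G = (5, 1)                # generator point on y^2 = x^3 + 2x + 2 (mod 17)

def ec_add(P, Q):
    """Add two curve points; None represents the point at infinity."""
    if P is None: return Q
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % P_MOD == 0:
        return None       # inverse points sum to infinity
    if P == Q:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, P_MOD) % P_MOD
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, P_MOD) % P_MOD
    x3 = (lam * lam - x1 - x2) % P_MOD
    return (x3, (lam * (x1 - x3) - y1) % P_MOD)

def scalar_mult(k, P):
    """Double-and-add: compute k * P."""
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

# Both parties derive the same shared point from each other's public key.
a, b = 3, 7                                   # toy private keys
shared_a = scalar_mult(a, scalar_mult(b, G))  # A computes a * (bG)
shared_b = scalar_mult(b, scalar_mult(a, G))  # B computes b * (aG)
```

Each scalar multiplication is the dominant cost of the exchange, which is why eliminating one of them improves the handshake performance of protocols like IKE, TLS, and SSH.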
Procedia PDF Downloads 14626190 Feature Selection of Personal Authentication Based on EEG Signal for K-Means Cluster Analysis Using Silhouettes Score
Authors: Jianfeng Hu
Abstract:
Personal authentication based on electroencephalography (EEG) signals is an important area of biometric technology, and more and more researchers have used EEG signals as a data source for biometrics. However, EEG-based biometrics also have some disadvantages. The proposed method employs entropy measures for feature extraction from EEG signals. Four types of entropy measures were deployed as the feature set: sample entropy (SE), fuzzy entropy (FE), approximate entropy (AE), and spectral entropy (PE). In a silhouette calculation, the distances from each data point to every other point within the same cluster and to all data points in the closest cluster are determined. Silhouettes thus provide a measure of how well a data point was classified when it was assigned to a cluster and of the separation between clusters, which makes them potentially well suited for assessing cluster quality in personal authentication methods. In this study, silhouette scores were used to assess the cluster quality of the k-means clustering algorithm and to compare the performance of each EEG dataset. The main goals of this study are: (1) to represent each target as a tuple of multiple feature sets, (2) to assign a suitable measure to each feature set, (3) to combine different feature sets, and (4) to determine the optimal feature weighting. Using precision/recall evaluations, the effectiveness of feature weighting in clustering was analyzed. EEG data from 22 subjects were collected. Results showed that: (1) it is possible to use fewer electrodes (3-4) for personal authentication; (2) there were differences between electrodes for personal authentication (p<0.01); and (3) there was no significant difference in authentication performance among the feature sets (except PE).
Conclusion: The combination of the k-means clustering algorithm and the silhouette approach proved to be an accurate method for personal authentication based on EEG signals.Keywords: personal authentication, k-means clustering, electroencephalogram, EEG, silhouettes
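For a single point, the silhouette is s = (b - a) / max(a, b), where a is the mean distance to the point's own cluster and b the mean distance to the closest other cluster. A minimal one-dimensional sketch, assuming every cluster has at least two members; the points and labels below are invented examples, not EEG features:

```python
def silhouette_score(points, labels):
    """Mean of s(i) = (b - a) / max(a, b) over all points (1-D, Euclidean)."""
    def mean_dist(p, group):
        return sum(abs(p - q) for q in group) / len(group)
    scores = []
    for i, p in enumerate(points):
        own = [q for j, q in enumerate(points) if labels[j] == labels[i] and j != i]
        a = mean_dist(p, own)                         # cohesion with own cluster
        b = min(mean_dist(p, [q for j, q in enumerate(points) if labels[j] == l])
                for l in set(labels) - {labels[i]})   # separation from nearest cluster
        scores.append((b - a) / max(a, b))
    return sum(scores) / len(scores)

pts = [0.0, 0.1, 0.2, 10.0, 10.1, 10.2]
good = silhouette_score(pts, [0, 0, 0, 1, 1, 1])   # labels match the true grouping
bad = silhouette_score(pts, [0, 1, 0, 1, 0, 1])    # labels mix the two groups
```

Scores near 1 indicate tight, well-separated clusters, which is how the study compares the k-means partitions obtained from different electrodes and feature sets.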
Procedia PDF Downloads 28526189 Artificial Neural Network Based Approach for Estimation of Individual Vehicle Speed under Mixed Traffic Condition
Authors: Subhadip Biswas, Shivendra Maurya, Satish Chandra, Indrajit Ghosh
Abstract:
Developing a speed model is a challenging task, particularly under mixed traffic conditions, where the traffic composition plays a significant role in determining vehicular speed. The present research models individual vehicular speed in the context of mixed traffic on an urban arterial. Traffic speed and volume data were collected from three midblock arterial road sections in New Delhi. Using the field data, a volume-based speed prediction model was developed following the methodology of Artificial Neural Networks (ANN). The model developed in this work is capable of estimating speed for each vehicle category individually. Validation results show a great deal of agreement between the observed speeds and the values predicted by the model. It was also observed that the ANN-based model performs better than other existing models in terms of accuracy. Finally, a sensitivity analysis was performed using the model to examine the effects of traffic volume and its composition on individual speeds.Keywords: speed model, artificial neural network, arterial, mixed traffic
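The prediction step of such a network is a simple feedforward pass from volume and composition inputs to a speed estimate. A sketch of one hidden layer with tanh activation; the architecture, weights, and input features below are invented for illustration and do not reproduce the paper's trained model:

```python
import math

def mlp_predict(x, W1, b1, W2, b2):
    """One hidden tanh layer followed by a linear output; returns a scalar speed."""
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W1, b1)]
    return sum(w * hi for w, hi in zip(W2, h)) + b2

# Hypothetical scaled inputs: [total volume, share of cars, share of two-wheelers].
x = [0.6, 0.5, 0.3]
W1 = [[-0.8, 0.4, 0.2], [0.5, -0.3, 0.1]]  # hidden layer weights (2 neurons)
b1 = [0.1, -0.2]
W2 = [20.0, 15.0]                          # output layer weights
b2 = 30.0                                  # output bias, roughly a base speed (km/h)
speed = mlp_predict(x, W1, b1, W2, b2)
```

In training, the weights would be fitted to the field data so that a separate output (or a separate network) yields the speed of each vehicle category.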
Procedia PDF Downloads 388