Search results for: soft computing techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8447

6407 Development of Protein-based Emulsion Gels For Food Structuring

Authors: Diana Baigts-Allende, Iveta Klojdová, Ali Kozlu, Jorge Metri-Ojeda

Abstract:

Emulsion gels are constituted by a colloidal system (emulsion) stabilized by a polymeric gel matrix. These systems are more homogeneous and stable than conventional emulsions and can behave as either gel-like or soft-solid materials. Protein-based emulsion gels (PEG) have been used as carrier systems for bioactive compounds and as food-structuring agents to improve texture and consistency, mainly in the production of low-fat products. This work studied the effect of protein:polysaccharide ratios of 0.75:1.25, 1:1, and 1.25:0.75 (levels -1, 0, and +1) and pH values (2-9) on the stability of protein-based emulsion gels prepared from soy protein isolate and sodium alginate. Protein emulsion capacity was enhanced at increased pH values (6, 7, 8, and 9) compared to acidic pH values. The smallest particle size for PEG was obtained at pH 9 (~23µm); however, with increasing protein ratio (level +1), a larger particle size was observed (~23µm). The same trend was observed for the rheological measurements; the consistency index (K) at pH 9 was higher for level -1 (1.17) than for level +1 (0.45). The studied PEG showed good thermal stability (~98%) at neutral pH and pH 9 for all biopolymer ratios. Optimal pH values and biopolymer ratios were thus determined for PEG based on soy protein and sodium alginate, with potential use in elaborating stable systems for broad application in the food sector.

Keywords: emulsion gels, food structuring, biopolymers, food systems

Procedia PDF Downloads 74
6406 A Reinforcement Learning Based Method for Heating, Ventilation, and Air Conditioning Demand Response Optimization Considering Few-Shot Personalized Thermal Comfort

Authors: Xiaohua Zou, Yongxin Su

Abstract:

The reasonable operation of heating, ventilation, and air conditioning (HVAC) systems is of great significance in improving the security, stability, and economy of power system operation. However, the uncertainty of the operating environment, user-dependent thermal comfort, and the need for rapid decision-making pose challenges for HVAC demand response (DR) optimization. In this regard, this paper proposes a reinforcement learning-based method for HVAC demand response optimization considering few-shot personalized thermal comfort (PTC). First, an HVAC DR optimization framework based on a few-shot PTC model and deep reinforcement learning (DRL) is designed, in which the output of the few-shot PTC model is regarded as the input of the DRL agent. Then, a few-shot PTC model that distinguishes between awake and asleep states is established, which has excellent engineering usability. Next, based on the soft actor-critic algorithm, an HVAC DR optimization algorithm considering the user's PTC is designed to deal with uncertainty and make decisions rapidly. Experimental results show that the proposed method can efficiently obtain the user's PTC temperature, reduce energy costs while ensuring the user's PTC, and achieve rapid decision-making under uncertainty.
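
To make the control loop concrete, here is a minimal sketch, not the authors' implementation: the soft actor-critic agent is replaced by a simple tabular Q-learning placeholder, and the PTC model output, thermal dynamics, tariff, and sleep schedule are hypothetical stand-ins, chosen only to show how the comfort model's output feeds the agent's reward.

```python
import numpy as np

rng = np.random.default_rng(0)

def ptc_comfort_temp(asleep: bool) -> float:
    """Hypothetical few-shot PTC model output: preferred temperature (deg C)."""
    return 22.0 if asleep else 24.0

def step_temp(T_in: float, cooling_on: int, T_out: float = 32.0) -> float:
    """Toy first-order thermal model: cooling pulls the room temperature down."""
    return T_in + 0.1 * (T_out - T_in) - 2.0 * cooling_on

n_bins = 20
Q = np.zeros((n_bins, 2))                        # states x actions (off/on)
to_bin = lambda T: int(np.clip((T - 18.0) / 0.5, 0, n_bins - 1))
alpha, gamma, eps, price = 0.1, 0.95, 0.1, 0.2   # learning rate, discount, exploration, tariff

for episode in range(2000):
    T = 28.0
    for t in range(48):                          # one day in 30-minute slots
        asleep = t < 14 or t >= 44               # crude fixed sleep schedule
        s = to_bin(T)
        a = int(rng.integers(2)) if rng.random() < eps else int(Q[s].argmax())
        T_next = step_temp(T, a)
        comfort = ptc_comfort_temp(asleep)       # PTC output enters the reward
        reward = -price * a - abs(T_next - comfort)   # energy cost + discomfort
        Q[s, a] += alpha * (reward + gamma * Q[to_bin(T_next)].max() - Q[s, a])
        T = T_next

print("greedy action per temperature bin:", Q.argmax(axis=1))
```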

Keywords: HVAC, few-shot personalized thermal comfort, deep reinforcement learning, demand response

Procedia PDF Downloads 86
6405 New Variational Approach for Contrast Enhancement of Color Image

Authors: Wanhyun Cho, Seongchae Seo, Soonja Kang

Abstract:

In this work, we propose a variational technique for image contrast enhancement that utilizes both global and local information around each pixel. The energy functional is defined as a weighted linear combination of three terms: a local contrast term, a global contrast term, and a dispersion term. The first, the local contrast term, improves the contrast of an input image by increasing the grey-level differences between each pixel and its neighbours, thereby exploiting the contextual information around each pixel. The second, the global contrast term, enhances contrast by minimizing the difference between the image's empirical distribution function and a target cumulative distribution function, so that the distribution of pixel values becomes symmetric about the median. The third, the dispersion term, controls the departure of the new pixel values from those of the original image, preserving the original image characteristics as far as possible. Second, we derive the Euler-Lagrange equation whose solution achieves the minimum of the proposed functional, using the fundamental lemma of the calculus of variations. This equation is then solved by a gradient descent method, one of the dynamic approximation techniques. Finally, through various experiments, we demonstrate that the proposed method enhances the contrast of colour images better than existing techniques.
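
A minimal sketch of how such a three-term functional can be minimized by gradient descent is given below; the weights, the 7x7 neighbourhood, and the use of plain histogram equalization as the median-symmetric global target are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def enhance(f, alpha=1.5, beta=0.5, gamma=0.5, dt=0.1, iters=50):
    """Gradient-descent sketch of a three-term contrast-enhancement functional.

    f: greyscale image in [0, 1]. The local term pushes each pixel away from
    its neighbourhood mean, the global term pulls the histogram toward a
    target (here: plain histogram equalization as a stand-in), and the
    dispersion term tethers the result to the original image."""
    # Histogram-equalized image as a hypothetical global-contrast target.
    ranks = np.argsort(np.argsort(f.ravel())).reshape(f.shape)
    u_eq = ranks / float(f.size - 1)

    u = f.copy()
    for _ in range(iters):
        local_mean = uniform_filter(u, size=7)        # neighbourhood context
        grad = (-alpha * (u - local_mean)             # increase local contrast
                + beta * (u - u_eq)                   # match global target
                + gamma * (u - f))                    # stay near original
        u = np.clip(u - dt * grad, 0.0, 1.0)          # descent step
    return u

# Usage on a synthetic low-contrast image: the std (contrast) should increase.
img = 0.4 + 0.2 * np.random.default_rng(1).random((64, 64))
out = enhance(img)
print(img.std(), out.std())
```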

Keywords: color image, contrast enhancement technique, variational approach, Euler-Lagrange equation, dynamic approximation method, EME measure

Procedia PDF Downloads 450
6404 Contextual Toxicity Detection with Data Augmentation

Authors: Julia Ive, Lucia Specia

Abstract:

Understanding and detecting toxicity is an important problem in supporting safer human interactions online. Our work focuses on the important problem of contextual toxicity detection, where automated classifiers are tasked with determining whether a short textual segment (usually a sentence) is toxic within its conversational context. We use “toxicity” as an umbrella term to denote a number of variants commonly named in the literature, including hate, abuse, and offence, among others. Detecting toxicity in context is a non-trivial problem and has been addressed by very few previous studies. These previous studies have analysed the influence of conversational context on human perception of toxicity in controlled experiments and concluded that humans rarely change their judgements in the presence of context. They have also evaluated contextual detection models based on state-of-the-art Deep Learning and Natural Language Processing (NLP) techniques. Counterintuitively, they reached the general conclusion that computational models tend to suffer performance degradation in the presence of context. We challenge these empirical observations by devising better contextual predictive models that also rely on NLP data augmentation techniques to create larger and better data. In our study, we start by further analysing the human perception of toxicity in conversational data (i.e., tweets), in the absence versus presence of context, in this case the previous tweets in the same conversational thread. We observed that the conclusions of previous work on human perception are mainly due to data issues: the contextual data available does not provide sufficient evidence that context is indeed important (even for humans). The data problem is common in current toxicity datasets: cases labelled as toxic are either obviously toxic (i.e., overt toxicity with swear words, racist slurs, etc.), so that context is not needed for a decision, or are ambiguous, vague, or unclear even in the presence of context; in addition, the data contains labelling inconsistencies. To address this problem, we propose to automatically generate contextual samples where toxicity is not obvious without context (i.e., covert cases) or where different contexts can lead to different toxicity judgements for the same tweet. We generate toxic and non-toxic utterances conditioned on the context or on target tweets using a range of techniques for controlled text generation (e.g., Generative Adversarial Networks and steering techniques). On the contextual detection models, we posit that their poor performance is due to limitations in both the data they are trained on (the same problems stated above) and the architectures they use, which are not able to leverage context in effective ways. To improve on that, we propose text classification architectures that take the hierarchy of conversational utterances into account. In experiments benchmarking our models against previous ones on existing and automatically generated data, we show that both data and architectural choices are very important. Our model achieves substantial performance improvements compared to baselines that are non-contextual, or contextual but agnostic of the conversation structure.
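
The following is a minimal sketch of a conversation-structure-aware classifier in PyTorch; it is a generic two-level GRU hierarchy with illustrative hyperparameters, not the authors' architecture, included only to show how utterance-level encodings can be composed over a thread before classifying the target tweet.

```python
import torch
import torch.nn as nn

class HierarchicalToxicityClassifier(nn.Module):
    """Two-level sketch: a word-level GRU encodes each utterance, and an
    utterance-level GRU runs over the conversation, so the target tweet is
    classified in the context of its thread."""

    def __init__(self, vocab_size=30000, emb=128, hid=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb, padding_idx=0)
        self.word_rnn = nn.GRU(emb, hid, batch_first=True)
        self.utt_rnn = nn.GRU(hid, hid, batch_first=True)
        self.head = nn.Linear(hid, 2)       # toxic vs. non-toxic

    def forward(self, conv):                # conv: (batch, n_utts, n_words)
        b, n, w = conv.shape
        words = self.embed(conv.view(b * n, w))
        _, h = self.word_rnn(words)         # h: (1, b*n, hid) per-utterance code
        utts = h.squeeze(0).view(b, n, -1)
        _, h = self.utt_rnn(utts)           # last state summarizes thread + target
        return self.head(h.squeeze(0))

# Toy batch: 2 conversations, 3 utterances each (the last is the target tweet).
model = HierarchicalToxicityClassifier()
batch = torch.randint(1, 30000, (2, 3, 12))
print(model(batch).shape)                   # torch.Size([2, 2])
```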

Keywords: contextual toxicity detection, data augmentation, hierarchical text classification models, natural language processing

Procedia PDF Downloads 170
6403 Road Traffic Accidents Analysis in Mexico City through Crowdsourcing Data and Data Mining Techniques

Authors: Gabriela V. Angeles Perez, Jose Castillejos Lopez, Araceli L. Reyes Cabello, Emilio Bravo Grajales, Adriana Perez Espinosa, Jose L. Quiroz Fabian

Abstract:

Road traffic accidents are among the principal causes of traffic congestion, causing human losses, damage to health and the environment, economic losses, and material damage. Traditional studies of road traffic accidents in urban zones require a very high investment of time and money, and, additionally, their results are not current. Nowadays, however, in many countries the crowdsourced GPS-based traffic and navigation apps have emerged as an important low-cost source of information for studies of road traffic accidents and the urban congestion they cause. In this article, we identify the zones, roads, and specific times in Mexico City (CDMX) in which the largest number of road traffic accidents were concentrated during 2016. We built a database compiling information obtained from the social network known as Waze. The methodology employed was Knowledge Discovery in Databases (KDD) for the discovery of patterns in the accident reports, applying data mining techniques with the help of Weka. The selected algorithms were Expectation Maximization (EM), to obtain the ideal number of clusters for the data, and k-means as the grouping method. Finally, the results were visualized with the Geographic Information System QGIS.
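
A compact sketch of the clustering stage is shown below; synthetic coordinates stand in for the geocoded Waze reports, and BIC-based model selection over Gaussian mixtures stands in for Weka's EM procedure for choosing the number of clusters.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
# Hypothetical stand-in for geocoded accident reports: (lat, lon) pairs
# drawn around three synthetic hotspots.
centers = np.array([[19.43, -99.13], [19.36, -99.18], [19.48, -99.10]])
coords = np.vstack([c + 0.01 * rng.standard_normal((200, 2)) for c in centers])

# Step 1: choose the number of clusters with EM (BIC here stands in for
# Weka's cross-validated likelihood procedure).
bics = {k: GaussianMixture(k, random_state=0).fit(coords).bic(coords)
        for k in range(1, 8)}
best_k = min(bics, key=bics.get)

# Step 2: group the reports with k-means using that cluster count.
labels = KMeans(n_clusters=best_k, n_init=10, random_state=0).fit_predict(coords)
print(f"chosen k = {best_k}; cluster sizes = {np.bincount(labels)}")
```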

Keywords: data mining, k-means, road traffic accidents, Waze, Weka

Procedia PDF Downloads 418
6402 The Osteocutaneous Distal Tibia Turn-over Fillet Flap: A Novel Spare-parts Orthoplastic Surgery Option for Functional Below-knee Amputation

Authors: Harry Burton, Alexios Dimitrios Iliadis, Neil Jones, Aaron Saini, Nicola Bystrzonowski, Alexandros Vris, Georgios Pafitanis

Abstract:

This article portrays the authors' experience with a complex lower limb bone and soft tissue defect, following chronic osteomyelitis and pathological fracture, which was managed by the multidisciplinary orthoplastic team. The decision between functional amputation and limb salvage was deemed necessary, informed by the principles of 'spare parts' reconstructive microsurgery. This case describes the successful use of the osteocutaneous distal tibia turn-over fillet flap, which allowed 'lowering the level of the amputation' from a through-knee to the conventional level of a below-knee amputation (BKA) and thereby preserved knee joint function. This case demonstrates the value of 'spare parts' surgery principles and how these concepts refine complex orthoplastic approaches to enhance function when limb salvage is not possible. The osteocutaneous distal tibia turn-over fillet flap is a robust technique for modified BKA reconstructions, providing sufficient bone length to achieve a tough, sensate stump and a functional knee joint.

Keywords: osteocutaneous flap, fillet flap, spare-parts surgery, below-knee amputation

Procedia PDF Downloads 166
6401 Fat-Tail Test of Regulatory DNA Sequences

Authors: Jian-Jun Shu

Abstract:

The statistical properties of cis-regulatory modules (CRMs) are explored by estimating the occurrence distribution of similar-word sets. It is observed that CRMs tend to have a fat-tail distribution for similar-word set occurrence. Thus, a fat-tail test with two fatness coefficients is proposed to distinguish CRMs from non-CRMs, especially from exons. For the first fatness coefficient, the separation accuracy between CRMs and exons is increased as compared with the existing content-based CRM prediction method, the fluffy-tail test. For the second fatness coefficient, the computing time is reduced as compared with the fluffy-tail test, making it very suitable for long sequences and large-database analysis in the post-genome era. Moreover, these indexes may be used to predict CRMs that have not yet been observed experimentally. This can serve as a valuable filtering process for experiments.
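
As an illustration of the general idea, the sketch below computes the occurrence counts of k-mer 'words' in a sequence and measures tail heaviness with sample excess kurtosis; kurtosis is a generic stand-in, since the paper's two fatness coefficients are not specified here.

```python
from collections import Counter

def kmer_counts(seq: str, k: int = 6) -> list[int]:
    """Occurrence counts of every k-mer ('word') in a DNA sequence."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    return list(counts.values())

def excess_kurtosis(xs: list[int]) -> float:
    """Sample excess kurtosis: a generic heavy-tail indicator used here as a
    stand-in for the paper's (unspecified) fatness coefficients."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m4 / (m2 ** 2) - 3.0

seq = "ACGT" * 50 + "TATATATATATA" * 10   # toy sequence with a repeated motif
print(excess_kurtosis(kmer_counts(seq)))  # larger values = fatter tail
```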

Keywords: statistical approach, transcription factor binding sites, cis-regulatory modules, DNA sequences

Procedia PDF Downloads 290
6400 Silymarin Loaded Mesoporous Silica Nanoparticles: Preparation, Optimization, Pharmacodynamic and Oral Multi-Dose Safety Assessment

Authors: Sarah Nasr, Maha M. A. Nasra, Ossama Y. Abdallah

Abstract:

The present work aimed to prepare Silymarin-loaded MCM-41-type mesoporous silica nanoparticles (MSNs) and to assess the effect of the system's solubility enhancement on the pharmacodynamic performance of Silymarin as a hepatoprotective agent. MSNs, prepared by a soft-templating technique, were loaded with Silymarin and characterized for particle size, zeta potential, surface properties, DSC, and XRPD. DSC and specific surface area data confirmed the deposition of Silymarin in an amorphous state in the MSNs' pores. In-vitro drug dissolution testing displayed an enhanced dissolution rate of Silymarin upon loading onto MSNs. High-dose Acetaminophen was then used to inflict hepatic injury upon albino male Wistar rats simultaneously receiving either free Silymarin, Silymarin-loaded MSNs, or blank MSNs. Plasma AST, ALT, albumin, and total protein, together with the liver homogenate content of TBARs and LDH, were assessed for all animal groups as measures of the antioxidant drug action. The results showed a significant superiority of Silymarin-loaded MSNs over the free drug in almost all parameters, while prolonged administration of blank MSNs showed no evident toxicity in rats.

Keywords: mesoporous silica nanoparticles, safety, solubility enhancement, silymarin

Procedia PDF Downloads 332
6399 Exploration of Various Metrics for Partitioning of Cellular Automata Units for Efficient Reconfiguration of Field Programmable Gate Arrays (FPGAs)

Authors: Peter Tabatt, Christian Siemers

Abstract:

Using FPGA devices to improve the behavior of time-critical parts of embedded systems has been a proven concept for years. With reconfigurable FPGA devices, the logical blocks can be partitioned and grouped into static and dynamic parts, and the dynamic parts can be reloaded 'on demand' at runtime. This work uses cellular automata, constructed through compilation from (partially restricted) ANSI-C sources, to determine the suitability of various metrics for optimal partitioning. Significant metrics in this case are, for example, the area the partition occupies on the FPGA device, the pass count for loop constructs, and the communication characteristics to other partitions. With successful partitioning, it is possible to use smaller FPGA devices for the same requirements as with non-reconfigurable FPGA devices or, vice versa, to use the same FPGAs for larger programs.
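
A toy sketch of how such metrics could be combined to rank candidate partitions follows; the linear weighted cost, the weights, and the example metric values are assumptions for illustration only, not the paper's actual evaluation.

```python
from dataclasses import dataclass

@dataclass
class Partition:
    """Candidate partition of cellular-automata units with illustrative metrics."""
    name: str
    area_luts: int        # logic area the partition occupies on the FPGA
    loop_pass_count: int  # how often loop constructs re-enter the partition
    comm_signals: int     # signals crossing to other partitions

def partition_cost(p: Partition, w_area=1.0, w_pass=0.5, w_comm=2.0) -> float:
    """Hypothetical weighted cost: lower is better. The weights and the linear
    form are assumptions, standing in for the metrics compared in the paper."""
    return w_area * p.area_luts + w_pass * p.loop_pass_count + w_comm * p.comm_signals

candidates = [
    Partition("inner-loop body", area_luts=800, loop_pass_count=1000, comm_signals=12),
    Partition("whole loop nest", area_luts=2400, loop_pass_count=1, comm_signals=4),
]
best = min(candidates, key=partition_cost)
print(f"preferred dynamic partition: {best.name}")
```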

Keywords: reconfigurable FPGA, cellular automata, partitioning, metrics, parallel computing

Procedia PDF Downloads 272
6398 TQM Framework Using Notable Authors Comparative

Authors: Redha M. Elhuni

Abstract:

This paper presents an analysis of the essential characteristics of the TQM philosophy by comparing the work of five notable authors in the field. A framework is produced which gathers the identified TQM enablers under the well-known operations management dimensions of process, business, and people. These enablers are linked with sustainable development via balanced-scorecard-type economic and non-economic measures. In order to capture a picture of Libyan companies' efforts to implement TQM, a questionnaire survey was designed and implemented. Results of the survey are presented, showing the main differentiating factors between the sample companies and a way of assessing the difference between the theoretical underpinning and the practitioners' undertakings. Survey results indicate that companies are experiencing much difficulty in translating TQM theory into practice. Only a few companies have successfully adopted a holistic approach to the TQM philosophy, and most of these put relatively high emphasis on hard elements compared with the soft issues of TQM. Moreover, even where companies realize the economic outputs, non-economic benefits such as workflow management, skills development, and team learning are not realized. In addition, overall, non-economic measures have secured low weightings compared with the economic measures. We believe that the framework presented in this paper can help a company to concentrate its TQM implementation efforts in terms of the process, system, and people management dimensions.

Keywords: TQM, balance scorecard, EFQM excellence model, oil sector, Libya

Procedia PDF Downloads 406
6397 A Review: Detection and Classification Defects on Banana and Apples by Computer Vision

Authors: Zahow Muoftah

Abstract:

Traditional manual visual grading of fruits has been one of the agricultural industry's major challenges due to its laborious nature as well as inconsistency in the inspection and classification process. The main requirements for computer vision and visual processing are effective techniques for identifying defects and estimating defect areas. Automated defect detection using computer vision and machine learning has emerged as a promising area of research with a high and direct impact on the visual inspection domain. Grading, sorting, and disease detection are important factors in determining the quality of fruits after harvest, and many studies have used computer vision to evaluate the quality level of fruits during post-harvest. Many studies have also been conducted to identify diseases and pests that affect the fruits of agricultural crops; however, most previous studies concentrated solely on the diagnosis of a single lesion or disease. This article instead provides a comprehensive review of the detection and classification of pests, diseases, and defects of banana and apple fruits by computer vision, drawing together research from these domains. Finally, various pattern recognition techniques for detecting apple and banana defects are discussed.

Keywords: computer vision, banana, apple, detection, classification

Procedia PDF Downloads 106
6396 Innovative Housing Construction Technologies in Slum Upgrading

Authors: Edmund M. Muthigani

Abstract:

Innovation in the construction industry has been characterized by new products and processes, especially in slum upgrading. The need for low-cost housing has motivated stakeholders to think outside the box in coming up with solutions. This paper explored innovative construction technologies that have been used in slum upgrading. The main objectives of the paper were to examine innovations in the housing construction sector and to show how the incremental derived demand for decent housing has led to the adoption of innovative technologies and materials. A systematic literature review was used to review studies on innovative construction technologies in slum upgrading. The review revealed a slow process of innovation in the construction industry, due to risk aversion and hesitance to adopt on the part of firms and individuals. Low profit margins in low-cost housing and a lack of sufficient political support remain the major hurdles to the adoption of innovative techniques that could actualize the right to decent housing. Conventional construction materials have remained unaffordable to many people, denying them decent housing, and this has necessitated the exploration of innovative materials to realize low-cost housing. Stabilized soil blocks and sisal-cement roofing blocks are some of the innovative construction materials that have been utilized in slum upgrading. These innovative materials have not only lowered the cost of producing building elements but also eased transport costs, as the raw materials to produce them are readily available in or near the slum sites. Despite their shortcomings in durability and compressive strength, they have proved worthwhile in slum upgrading. The production of innovative construction materials and the use of innovative techniques in slum upgrading have also provided employment to the locals.

Keywords: construction, housing, innovation, slum, technology

Procedia PDF Downloads 208
6395 Investigating the Form of the Generalised Equations of Motion of the N-Bob Pendulum and Computing Their Solution Using MATLAB

Authors: Divij Gupta

Abstract:

Pendular systems have a range of both mathematical and engineering applications, ranging from modelling the behaviour of a continuous mass-density rope to utilisation as Tuned Mass Dampers (TMD). Thus, it is of interest to study the differential equations governing the motion of such systems. Here we attempt to generalise these equations of motion for the plane compound pendulum with a finite number N of point masses. A Lagrangian approach is taken, and we attempt to find the generalised form of the Euler-Lagrange equations of motion for the i-th bob of the N-bob pendulum. The coordinates are parameterized as angular quantities to reduce the number of degrees of freedom from 2N to N and to simplify the form of the equations. We analyse the form of these equations up to N = 4 to determine their general form. We also develop a MATLAB program to compute a solution to the system for a given input value of N and a given set of initial conditions.
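
The derivation can be sketched symbolically, for example with SymPy; unit masses, unit link lengths, and the simplified energies below are assumptions made to keep the sketch short, and the code mirrors the Lagrangian approach rather than reproducing the authors' MATLAB program.

```python
import sympy as sp

def n_bob_equations(N: int):
    """Derive the Euler-Lagrange equations of a planar N-bob pendulum
    symbolically, with unit masses and link lengths assumed."""
    t, g = sp.symbols("t g")
    th = [sp.Function(f"theta{i}")(t) for i in range(N)]

    # Cartesian position of bob i from the angular coordinates.
    x = [sum(sp.sin(th[j]) for j in range(i + 1)) for i in range(N)]
    y = [-sum(sp.cos(th[j]) for j in range(i + 1)) for i in range(N)]

    T = sum(sp.Rational(1, 2) * (sp.diff(xi, t) ** 2 + sp.diff(yi, t) ** 2)
            for xi, yi in zip(x, y))                  # kinetic energy
    V = sum(g * yi for yi in y)                       # potential energy
    L = sp.simplify(T - V)

    # Euler-Lagrange: d/dt(dL/dq') - dL/dq = 0 for each angle q.
    return [sp.simplify(sp.diff(sp.diff(L, sp.diff(q, t)), t) - sp.diff(L, q))
            for q in th]

for eq in n_bob_equations(2):   # N = 2 recovers the double-pendulum equations
    sp.pprint(sp.Eq(eq, 0))
```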

Keywords: classical mechanics, differential equation, Lagrangian analysis, pendulum

Procedia PDF Downloads 209
6394 Making of Alloy Steel by Direct Alloying with Mineral Oxides during Electro-Slag Remelting

Authors: Vishwas Goel, Kapil Surve, Somnath Basu

Abstract:

In-situ alloying of steel during the electro-slag remelting (ESR) process has already been achieved by the addition of the necessary ferroalloys into the electro-slag remelting mold. However, the use of commercially available ferroalloys during ESR processing is often found to be financially less favorable in comparison with conventional alloying techniques. A process of alloying steel with elements like chromium and manganese via the electro-slag remelting route, without any ferrochrome addition, is therefore under development. The process utilizes the in-situ reduction of refined mineral chromite (Cr₂O₃) and the resultant enrichment of chromium in the steel ingot produced. It was established in the course of this work that this process can become more advantageous than conventional alloying techniques, both economically and environmentally, for applications which inherently demand the electro-slag remelting process, such as the manufacturing of superalloys. A key advantage is the lower overall CO₂ footprint of this process relative to the conventional route of production, storage, and addition of ferrochrome. In addition to experimentally validating the feasibility of the envisaged reactions, a mathematical model to simulate the reduction of chromium(III) oxide and the transfer of chromium to the molten steel droplets was also developed as part of the current work. The developed model helps to correlate the amount of chromite input with the magnitude of chromium alloying that can be achieved through this process. Experiments are in progress to validate the predictions made by this model and to fine-tune its parameters.
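
The stoichiometric core of such a correlation can be sketched as follows; the reduction Cr₂O₃ → 2Cr and the molar masses are standard chemistry, while the recovery factor and the example inputs are hypothetical placeholders for the quantities the authors' model is built to predict.

```python
# Standard molar masses (g/mol).
M_CR2O3 = 151.99
M_CR = 52.00

def chromium_pickup(chromite_kg: float, steel_kg: float, recovery: float) -> float:
    """Hypothetical mass balance linking Cr2O3 input to the Cr content of the
    ingot: Cr2O3 -> 2 Cr, scaled by an assumed metallurgical recovery factor."""
    cr_kg = chromite_kg * (2 * M_CR / M_CR2O3) * recovery
    return 100.0 * cr_kg / (steel_kg + cr_kg)   # wt% Cr in the ingot

# E.g., 30 kg of refined Cr2O3 per tonne of steel at an assumed 85% recovery:
print(f"{chromium_pickup(30.0, 1000.0, 0.85):.2f} wt% Cr")
```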

Keywords: alloying element, chromite, electro-slag remelting, ferrochrome

Procedia PDF Downloads 223
6393 Optimum Design of Steel Space Frames by Hybrid Teaching-Learning Based Optimization and Harmony Search Algorithms

Authors: Alper Akin, Ibrahim Aydogdu

Abstract:

This study presents a hybrid metaheuristic algorithm to obtain optimum designs for steel space buildings. The optimum design problem of three-dimensional steel frames is mathematically formulated according to the provisions of LRFD-AISC (Load and Resistance Factor Design of the American Institute of Steel Construction). Design constraints such as the strength requirements of structural members, the displacement limitations, the inter-story drift, and the other structural constraints are derived from the LRFD-AISC specification. In this study, a hybrid algorithm combining teaching-learning based optimization (TLBO) and harmony search (HS) is employed to solve the stated optimum design problem. These algorithms are two of the recent additions to the metaheuristic techniques of numerical optimization and have been efficient tools for solving discrete programming problems. Using the two algorithms in collaboration creates a more powerful tool in which each mitigates the other's weaknesses. To demonstrate the powerful performance of the presented hybrid algorithm, the optimum design of a large-scale steel building is presented, and the results are compared to previously obtained results available in the literature.
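
One possible hybridization strategy is sketched below on a toy objective standing in for the frame weight; the alternation of a TLBO teacher phase with an HS improvisation step, and all parameter values, are illustrative assumptions, with the LRFD-AISC constraint handling omitted.

```python
import numpy as np

rng = np.random.default_rng(7)

def objective(x):                    # toy stand-in for the frame-weight function
    return float(np.sum(x ** 2))

dim, pop_size, iters = 5, 20, 200
lo, hi = -10.0, 10.0
pop = rng.uniform(lo, hi, (pop_size, dim))
fit = np.array([objective(x) for x in pop])

hmcr, par, bw = 0.9, 0.3, 0.5        # harmony search parameters (assumed values)

for _ in range(iters):
    # TLBO teacher phase: move every design toward the current best.
    teacher = pop[fit.argmin()]
    tf = rng.integers(1, 3)          # teaching factor in {1, 2}
    cand = pop + rng.random((pop_size, dim)) * (teacher - tf * pop.mean(axis=0))
    cand = np.clip(cand, lo, hi)
    cand_fit = np.array([objective(x) for x in cand])
    better = cand_fit < fit
    pop[better], fit[better] = cand[better], cand_fit[better]

    # HS improvisation: compose one new harmony from the population memory.
    new = np.empty(dim)
    for d in range(dim):
        if rng.random() < hmcr:                      # draw from memory...
            new[d] = pop[rng.integers(pop_size), d]
            if rng.random() < par:                   # ...with pitch adjustment
                new[d] += bw * rng.uniform(-1, 1)
        else:
            new[d] = rng.uniform(lo, hi)             # or improvise randomly
    new = np.clip(new, lo, hi)
    worst = fit.argmax()
    if objective(new) < fit[worst]:                  # replace the worst design
        pop[worst], fit[worst] = new, objective(new)

print(f"best objective after hybrid TLBO/HS: {fit.min():.3e}")
```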

Keywords: optimum structural design, hybrid techniques, teaching-learning based optimization, harmony search algorithm, minimum weight, steel space frame

Procedia PDF Downloads 545
6392 Rhythm-Reading Success Using Conversational Solfege

Authors: Kelly Jo Hollingsworth

Abstract:

Conversational Solfege, a research-based, 12-step music literacy instructional method using the sound-before-sight approach, was used to teach rhythm-reading to 128 second-grade students at a public school in the southeastern United States. For each step, multiple scripted techniques are supplied to teach each skill. Unit one, covering quarter-note and barred eighth-note rhythms, was the focus of this study. During regular weekly music instruction, students completed method steps one through five, which include aural discrimination, decoding familiar and unfamiliar rhythm patterns, and improvising rhythmic phrases using quarter notes and barred eighth notes. Intact classes were randomly assigned to two treatment groups for teaching steps six through eight, which cover the visual presentation and identification of quarter notes and barred eighth notes, visually presenting and decoding familiar patterns, and visually presenting and decoding unfamiliar patterns using said notation. For three weeks, students practiced steps six through eight during regular weekly music class. One group spent five minutes of class time on steps six through eight technique work, while the other group spent ten minutes of class time practicing the same techniques. A pretest and posttest were administered, and ANOVA results reveal that both the five-minute (p < .001) and ten-minute groups (p < .001) reached statistical significance, suggesting Conversational Solfege is an efficient, effective approach for teaching rhythm-reading to second-grade students. After two weeks of no instruction, students were retested to measure retention. Using a repeated-measures ANOVA, both groups reached statistical significance (p < .001) on the second posttest, suggesting both the five-minute and ten-minute groups retained rhythm-reading skill after two weeks of no instruction. Statistical significance was not reached between groups (p = .252), suggesting five minutes is equally as effective as ten minutes of rhythm-reading practice using Conversational Solfege techniques. Future research includes replicating the study with other grades and units in the text.

Keywords: conversational solfege, length of instructional time, rhythm-reading, rhythm instruction

Procedia PDF Downloads 157
6391 Cytotoxic Effect of Crude Extract of Sea Pen Virgularia gustaviana on HeLa and MDA-MB-231 Cancer Cell Lines

Authors: Sharareh Sharifi, Pargol Ghavam Mostafavi, Ali Mashinchian Moradi, Mohammad Hadi Givianrad, Hassan Niknejad

Abstract:

Marine organisms such as soft corals, sponges, ascidians, and tunicates, which contain rich sources of natural compounds, have been studied in recent decades because of their special chemical compounds with anticancer properties. The aim of this study was to investigate the anti-cancer properties of the ethyl acetate extract of the marine sea pen Virgularia gustaviana collected from the Persian Gulf coast (Bandar Abbas). The extraction process was carried out with ethyl acetate for five days. Thin-layer chromatography (TLC) and high-performance liquid chromatography (HPLC) were used for qualitative identification of the crude extract. The viability of HeLa and MDA-MB-231 cancer cells was investigated using the MTT assay at concentrations of 25, 50, and 100 µl/ml of the ethyl acetate extract. The crude extract of Virgularia gustaviana demonstrated ten fractions with different retention factors (Rf) by TLC and retention times (Rt) by HPLC. The crude extract dose-dependently decreased cancer cell viability compared to the control group. According to these results, the ethyl acetate extract of Virgularia gustaviana inhibits the growth of cancer cells, an effect which needs to be further investigated in future studies.

Keywords: anti-cancer, HeLa cancer cell, MDA-MB-231 cancer cell, Virgularia gustaviana

Procedia PDF Downloads 431
6390 Low-Cost Fog Edge Computing for Smart Power Management and Home Automation

Authors: Belkacem Benadda, Adil Benabdellah, Boutheyna Souna

Abstract:

The Internet of Things (IoT) is an unprecedented creation. Electronic objects are now able to interact, share, respond, and adapt to their environment on a much larger basis. The actual spread of these modern means of connectivity, and of solutions with high data-volume exchange, is affecting our ways of life. Accommodation is becoming an intelligent living space, suited not only to the circumstances and desires of the people living in it but also to systems constraints, in order to make daily life simpler and cheaper, increase possibilities, and achieve a higher level of services and luxury (Internet access, teleworking, consumption monitoring, information search, etc.). This paper addresses the design and integration of a smart home and proposes an IoT solution that allows smart power consumption based on measurements from the power grid and deep learning analysis.

Keywords: array sensors, IoT, power grid, FPGA, embedded

Procedia PDF Downloads 116
6389 Geomatic Techniques to Filter Vegetation from Point Clouds

Authors: M. Amparo Núñez-Andrés, Felipe Buill, Albert Prades

Abstract:

More and more frequently, geomatic techniques such as terrestrial laser scanning and digital photogrammetry, either terrestrial or from drones, are being used to obtain digital terrain models (DTMs) for the monitoring of geological phenomena that cause natural disasters, such as landslides, rockfalls, and debris flows. One of the main multitemporal analyses developed from these models is the quantification of volume changes in slopes and hillsides, whether caused by erosion, fall, or land movement in the source area or by sedimentation in the deposition zone. To carry out this task, it is necessary to filter from the point clouds all those elements that do not belong to the slopes. Among these elements, vegetation stands out, as it has the greatest presence and changes constantly, both seasonally and daily, being affected by factors such as wind. One of the best-known indices for detecting vegetation in an image is the NDVI (Normalized Difference Vegetation Index), which is obtained from the combination of the infrared and red channels and therefore requires a multispectral camera. These cameras generally have lower resolution than conventional RGB cameras, while their cost is much higher, so we have to look for alternative indices based on RGB alone. In this communication, we present the results obtained in the Georisk project (PID2019-103974RB-I00/MCIN/AEI/10.13039/501100011033) by using the GLI (Green Leaf Index) and ExG (Excess Green) indices, as well as the change to the Hue-Saturation-Value (HSV) colour space, in which the H coordinate gives the most information for vegetation filtering. These filters are applied both to the images, creating binary masks to be used when applying the SfM algorithms, and to the point cloud obtained directly by the photogrammetric process without any previous filtering, or to the one obtained by TLS (Terrestrial Laser Scanning). In this last case, we have also worked with a Riegl VZ400i sensor that allows the reception, as in aerial LiDAR, of several returns of the signal, information that can be used for classification of the point cloud. After applying all the techniques in different locations, the results show that the colour-based filters allow correct filtering in those areas where the presence of shadows is not excessive and there is contrast between the colour of the slope lithology and the vegetation. As noted, when using the HSV colour space, it is the H coordinate that responds best for this filtering. Finally, the use of the various returns of the TLS signal allows filtering with some limitations.
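
The RGB indices involved can be sketched compactly; the ExG and GLI formulas are standard, while the thresholds, the hue window, and the toy image below are illustrative assumptions that would need per-scene tuning (NumPy and Matplotlib assumed available).

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv

def vegetation_mask(rgb, exg_thr=0.10, hue_lo=0.17, hue_hi=0.45):
    """Sketch of RGB-only vegetation filtering: the Excess Green index
    (ExG = 2g - r - b on chromatic coordinates) combined with a hue window
    in HSV space, where greens fall mid-range."""
    rgb = rgb.astype(float) / 255.0
    total = rgb.sum(axis=-1, keepdims=True) + 1e-9
    r, g, b = np.moveaxis(rgb / total, -1, 0)            # chromatic coordinates
    exg = 2.0 * g - r - b                                # Excess Green
    gli = (2.0 * g - r - b) / (2.0 * g + r + b + 1e-9)   # Green Leaf Index

    hue = rgb_to_hsv(rgb)[..., 0]                        # H coordinate in [0, 1]
    mask = (exg > exg_thr) & (hue_lo < hue) & (hue < hue_hi)
    return mask, exg, gli

# Toy image: left half greenish (vegetation), right half grey (rock).
img = np.zeros((4, 8, 3), dtype=np.uint8)
img[:, :4] = (60, 140, 50)
img[:, 4:] = (120, 120, 120)
mask, _, _ = vegetation_mask(img)
print(mask.astype(int))
```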

Keywords: RGB index, TLS, photogrammetry, multispectral camera, point cloud

Procedia PDF Downloads 154
6388 Analytical Study and Conservation Processes of Scribe Box from Old Kingdom

Authors: Mohamed Moustafa, Medhat Abdallah, Ramy Magdy, Ahmed Abdrabou, Mohamed Badr

Abstract:

The scribe box under study dates back to the Old Kingdom. It was excavated by the Italian expedition in Qena (1935-1937). The box consists of two pieces, the lid and the body. The inner side of the lid is decorated with ancient Egyptian inscriptions written in a black pigment. The box was made from several panels assembled together by wooden dowels and secured with plant ropes. The entire box is covered with a red pigment. This study aims to use analytical techniques to identify and gain a deep understanding of the box components. Moreover, the authors were significantly interested in using infrared reflectance transformation imaging (RTI-IR) to enhance the hidden inscriptions on the lid. The identification of wood species is included in this study. Visual observation and assessment were performed to understand the condition of the box, and 3D and 2D programs were used to illustrate the wood joinery techniques. Optical microscopy (OM), X-ray diffraction (XRD), portable X-ray fluorescence (XRF), and Fourier Transform Infrared spectroscopy (FTIR) were used in order to identify the wood species, the remains of insect bodies, the red pigment, the plant fibres, and previous conservation adhesives; the RTI-IR technique was also very effective in enhancing the hidden inscriptions. The analysis results proved that the wooden panels and dowels were Acacia nilotica and the wooden rail was Salix sp.; the insects were identified as Lasioderma serricorne and Gibbium psylloides; the red pigment was hematite; the plant fibres were linen; and the previous adhesive was cellulose nitrate. The historical study of the inscriptions showed that they are hieratic writings of a funerary text. After its transportation from the Egyptian Museum storage to the wood conservation laboratory of the Grand Egyptian Museum Conservation Center (GEM-CC), conservation techniques were applied with high accuracy in order to restore the object, including cleaning, consolidation of friable pigments and writings, removal of the previous adhesive, and reassembly. The applied conservation processes were extremely effective, and the box is now ready for display or storage in the Grand Egyptian Museum.

Keywords: scribe box, hieratic, 3D program, Acacia nilotica, XRD, cellulose nitrate, conservation

Procedia PDF Downloads 271
6387 Wearable Music: Generation of Costumes from Music and Generative Art and Wearing Them by 3-Way Projectors

Authors: Noriki Amano

Abstract:

The final goal of this study is to create another way for people to enjoy music, through the performance of 'Wearable Music'. Concretely speaking, we generate colorful costumes from music in real time and realize their wearing by projecting them onto a person. For this purpose, we propose three methods in this study: first, a method of giving color to music in a three-dimensional way; second, a method of generating images of costumes from music; and third, a method of wearing the images of music. In particular, this study stands out from other related work in that we generate images of unique costumes from music and realize wearing them. We use the technique of generative art to generate the images of unique costumes and project the images onto fog generated around a person from three directions using projectors. From this study, we obtain a way to enjoy music as something 'wearable'. Furthermore, we also gain the prospect of unconventional entertainment based on the fusion of music and costumes.

Keywords: entertainment computing, costumes, music, generative programming

Procedia PDF Downloads 173
6386 DAG Design and Tradeoff for Full Live Virtual Machine Migration over XIA Network

Authors: Dalu Zhang, Xiang Jin, Dejiang Zhou, Jianpeng Wang, Haiying Jiang

Abstract:

The traditional TCP/IP network is showing many shortcomings, and research on future networks is becoming a hotspot. FIA (Future Internet Architecture) and FIA-NP (Next Phase) are supported by the US NSF for future Internet design. Moreover, virtual machine migration is a significant technique in cloud computing. As a network application, it should also be supported in XIA (eXpressive Internet Architecture), which is part of both the FIA and FIA-NP projects. This paper is an experimental study aimed at verifying the feasibility of VM migration over XIA. We present three ways to maintain VM connectivity and communication states, concerning DAG design and routing table modification. VM migration experiments are conducted intra-AD and inter-AD with KVM instances. The procedure is achieved by a migration control protocol suited to the characteristics of XIA. Evaluation results show that our solutions can well support full live VM migration over the XIA network, keeping services seamless.

Keywords: DAG, downtime, virtual machine migration, XIA

Procedia PDF Downloads 855
6385 A Dynamic Solution Approach for Heart Disease Prediction

Authors: Walid Moudani

Abstract:

The healthcare environment is generally perceived as being information rich yet knowledge poor, as there is a lack of effective analysis tools to discover hidden relationships and trends in the data. In fact, valuable knowledge can be discovered from the application of data mining techniques to healthcare systems. In this study, a proficient methodology is presented for the extraction of significant patterns from coronary heart disease data warehouses for heart attack prediction, a condition which unfortunately continues to be a leading cause of mortality worldwide. For this purpose, we propose to enumerate dynamically the optimal subsets of the reduced features of high interest by using the rough sets technique associated with dynamic programming. We then propose to validate the classification using a Random Forest (RF) decision tree classifier to identify the risky heart disease cases. This work is based on a large amount of data collected from several clinical institutions, based on the medical profiles of patients. Moreover, experts' knowledge in this field has been taken into consideration in order to define the disease, its risk factors, and significant knowledge relationships among the medical factors. A computer-aided system was developed for this purpose based on a population of 525 adults. The performance of the proposed model is analyzed and evaluated against a set of benchmark techniques applied to this classification problem.
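
A compact sketch of the classification stage follows; synthetic data stands in for the 525-patient dataset, and mutual-information feature selection is a generic stand-in for the rough-sets-plus-dynamic-programming feature reduction described above.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for the 525-patient medical-profile dataset.
X, y = make_classification(n_samples=525, n_features=30, n_informative=8,
                           random_state=0)

# Feature reduction (generic stand-in) followed by Random Forest validation.
model = make_pipeline(
    SelectKBest(mutual_info_classif, k=8),
    RandomForestClassifier(n_estimators=200, random_state=0),
)
scores = cross_val_score(model, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```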

Keywords: multi-classifier decisions tree, features reduction, dynamic programming, rough sets

Procedia PDF Downloads 410
6384 SciPaaS: A Scientific Execution Platform for the Cloud

Authors: Wesley H. Brewer, John C. Sanford

Abstract:

SciPaaS is a prototype execution platform/middleware designed to make it easy for scientists to rapidly deploy their scientific applications (apps) to the cloud. It provides all the necessary infrastructure for running typical IXP (Input-eXecute-Plot) style apps, including a web interface, post-processing and plotting capabilities, job scheduling, real-time monitoring of running jobs, and even a file/case manager. In this paper, the system architecture is first described and then demonstrated for two scientific applications: (1) a simple finite-difference solver of the inviscid Burgers' equation, and (2) Mendel's Accountant, a forward-time population genetics simulation model. The implications of the prototype are discussed in terms of ease of use and deployment options, especially in cloud environments.
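
The first demo application can itself be sketched in a few lines; the grid, time step, initial condition, and first-order upwind discretization below are illustrative choices, not necessarily those used on the platform.

```python
import numpy as np

def burgers_upwind(nx=200, nt=100, dt=0.002):
    """Minimal first-order upwind solver for the inviscid Burgers' equation
    u_t + u u_x = 0, the kind of IXP-style demo app the platform hosts."""
    x = np.linspace(0.0, 1.0, nx)
    dx = x[1] - x[0]
    u = 1.0 + 0.5 * np.sin(2 * np.pi * x)     # smooth initial profile, u > 0
    for _ in range(nt):
        # Upwind difference (valid here since u > 0 everywhere).
        u[1:] = u[1:] - dt / dx * u[1:] * (u[1:] - u[:-1])
        u[0] = u[-1]                          # crude periodic closure
    return x, u

x, u = burgers_upwind()
print(f"solution range after time stepping: [{u.min():.3f}, {u.max():.3f}]")
```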

Keywords: web-based simulation, cloud computing, Platform-as-a-Service (PaaS), rapid application development (RAD), population genetics

Procedia PDF Downloads 590
6383 Autism Disease Detection Using Transfer Learning Techniques: Performance Comparison between Central Processing Unit vs. Graphics Processing Unit Functions for Neural Networks

Authors: Mst Shapna Akter, Hossain Shahriar

Abstract:

Neural network approaches are machine learning methods used in many domains, such as healthcare and cyber security, and are mostly known for dealing with image datasets. While training with images, several fundamental mathematical operations are carried out in the neural network, including a number of algebraic functions such as derivatives, convolutions, and matrix inversion and transposition. Such operations require higher processing power than is typically needed for ordinary computer usage. The Central Processing Unit (CPU) is not appropriate for large image datasets, as it is built for serial processing, while the Graphics Processing Unit (GPU) has parallel processing capabilities and therefore higher speed. This paper uses advanced neural network techniques, such as VGG16, ResNet50, DenseNet, InceptionV3, Xception, MobileNet, XGBoost-VGG16, and our proposed models, to compare CPU and GPU resources. A system for classifying autism disease using face images of autistic and non-autistic children was used to compare performance during testing. We used evaluation metrics such as accuracy, F1 score, precision, recall, and execution time. It was observed that the GPU ran faster than the CPU in all tests performed. Moreover, the performance of the neural network models in terms of accuracy increased on GPU compared to CPU.
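
A minimal sketch of the CPU-versus-GPU timing comparison is shown below using PyTorch and VGG16; the batch size, repetition count, and use of untrained weights are illustrative assumptions, since only relative execution time is of interest here.

```python
import time
import torch
from torchvision.models import vgg16

def time_forward(device: str, batch=8, reps=5) -> float:
    """Time VGG16 forward passes on one device; pretrained weights are
    omitted since only the arithmetic cost is being compared."""
    model = vgg16(weights=None).eval().to(device)
    x = torch.randn(batch, 3, 224, 224, device=device)
    with torch.no_grad():
        model(x)                             # warm-up pass
        if device == "cuda":
            torch.cuda.synchronize()
        start = time.perf_counter()
        for _ in range(reps):
            model(x)
        if device == "cuda":
            torch.cuda.synchronize()         # wait for queued GPU kernels
    return (time.perf_counter() - start) / reps

print(f"CPU: {time_forward('cpu'):.3f} s/batch")
if torch.cuda.is_available():
    print(f"GPU: {time_forward('cuda'):.3f} s/batch")
```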

Keywords: autism disease, neural network, CPU, GPU, transfer learning

Procedia PDF Downloads 118
6382 Effect of Plasma Treatment on UV Protection Properties of Fabrics

Authors: Sheila Shahidi

Abstract:

UV protection by fabrics has recently become a focus of great interest, particularly in connection with environmental degradation and ozone layer depletion. Fabrics provide simple and convenient protection against UV radiation (UVR), but not all fabrics offer sufficient UV protection. To describe the degree of UVR protection offered by clothing materials, the ultraviolet protection factor (UPF) is commonly used. A UV-protective fabric can be produced by applying a chemical finish using normal wet-processing methodologies. However, traditional wet-processing techniques are known to consume large quantities of water and energy and may lead to adverse alterations of the bulk properties of the substrate. Recently, the use of plasmas to generate physicochemical surface modifications of textile substrates has become an intriguing approach to replace or enhance conventional wet-processing techniques. In this research work, the effect of plasma treatment on the UV protection properties of fabrics was investigated. DC magnetron sputtering was used, and plasma parameters such as gas type, electrodes, time of exposure, and power were studied. The morphological and chemical properties of the samples were analyzed using Scanning Electron Microscopy (SEM) and Fourier Transform Infrared Spectroscopy (FTIR), respectively. The transmittance and UPF values of the original and plasma-treated samples were measured using a Shimadzu UV3101 PC (UV-Vis-NIR scanning spectrophotometer, 190-2,100 nm range). It was concluded that plasma, an eco-friendly, cost-effective, and dry technique, is being used in different branches of industry and will conquer the textile industry in the near future. It is also a promising method for the preparation of UV-protective textiles.

Keywords: fabric, plasma, textile, UV protection

Procedia PDF Downloads 520
6381 Performance Analysis and Optimization for Diagonal Sparse Matrix-Vector Multiplication on Machine Learning Unit

Authors: Qiuyu Dai, Haochong Zhang, Xiangrong Liu

Abstract:

Diagonal sparse matrix-vector multiplication is a well-studied topic in the fields of scientific computing and big data processing. However, when diagonal sparse matrices are stored in the DIA format, there can be a significant number of padded zero elements and scattered points, which can lead to degradation in the performance of the current DIA kernel and to excessive consumption of computational and memory resources. In order to address these issues, the authors propose the DIA-Adaptive scheme and its kernel, which leverage the parallel instruction sets of the Machine Learning Unit (MLU). The researchers analyze the effect of allocating varying numbers of threads and clusters, and of different hardware architectures, on the performance of SpMV using different formats. The experimental results indicate that the proposed DIA-Adaptive scheme performs well and offers excellent parallelism.
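
For reference, the DIA storage scheme itself can be sketched in plain NumPy; the row-aligned layout convention below is one common choice, and the example matrix is illustrative. The padded zero slots visible in the data array are exactly what the DIA-Adaptive scheme aims to avoid processing.

```python
import numpy as np

def dia_spmv(data: np.ndarray, offsets: np.ndarray, x: np.ndarray) -> np.ndarray:
    """Reference DIA-format sparse matrix-vector product for an n x n matrix.

    data[k, i] holds A[i, i + offsets[k]]; out-of-range slots are the padded
    zeros the paper identifies as a performance problem. This plain NumPy
    version only illustrates the storage scheme, not the MLU kernel."""
    n = x.shape[0]
    y = np.zeros(n)
    for k, off in enumerate(offsets):
        rows = np.arange(max(0, -off), min(n, n - off))  # valid rows, this diagonal
        y[rows] += data[k, rows] * x[rows + off]
    return y

# Tridiagonal example: offsets -1 (sub), 0 (main), +1 (super).
n = 5
offsets = np.array([-1, 0, 1])
data = np.zeros((3, n))
data[0, 1:] = -1.0        # subdiagonal entries A[i, i-1]; slot 0 is padding
data[1, :] = 2.0          # main diagonal
data[2, :-1] = -1.0       # superdiagonal entries A[i, i+1]; last slot is padding
x = np.ones(n)
print(dia_spmv(data, offsets, x))   # [1, 0, 0, 0, 1] for this 1-D Laplacian
```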

Keywords: adaptive method, DIA, diagonal sparse matrices, MLU, sparse matrix-vector multiplication

Procedia PDF Downloads 136
6380 Numerical Simulation and Experimental Verification of Mechanical Displacements in Piezoelectric Transformer

Authors: F. Boukazouha, G. Poulin-Vittrant, M. Rguiti, M. Lethiecq

Abstract:

Since its invention, by virtue of its remarkable features, the piezoelectric transformer (PT) has drawn the attention of the scientific community. In past years, it has been extensively studied and its performance has been continuously improved. Nowadays, such devices are designed in more and more sophisticated architectures, with associated models describing their behavior quite accurately. However, the various studies carried out on such devices mainly focus on their electrical characteristics induced by the direct piezoelectric effect, such as voltage gain, efficiency, or supplied power. In this work, we are particularly interested in the characterization of the mechanical displacements induced by the inverse piezoelectric effect in a vibrating PT. For this purpose, a detailed three-dimensional finite element analysis is proposed to examine the mechanical behavior of a Rosen-type transformer made of a single bar of soft PZT (P191) with dimensions 22 mm × 2.35 mm × 2.5 mm. At the first three modes of vibration, the output voltage and the mechanical displacements ux, uy, and uz along the length, width, and thickness, respectively, are calculated. The amplitude of the displacements varies in a range from a few nanometers to a few hundred nanometers. The validity of the simulations was successfully confirmed by experiments carried out on a prototype using a laser interferometer, with a good match observed between simulation and experimental results, especially for the displacements at the second mode. Such 3D simulations thus appear to be a helpful tool for a better understanding of the mechanical phenomena in Rosen-type PTs.

Keywords: piezoelectricity, gain, displacement, simulations

Procedia PDF Downloads 33
6379 A Conceptual Framework of Digital Twin for Homecare

Authors: Raja Omman Zafar, Yves Rybarczyk, Johan Borg

Abstract:

This article proposes a conceptual framework for the application of digital twin technology in home care. The main goal is to bridge the gap between advanced digital twin concepts and their practical implementation in home care. This study uses a literature review and thematic analysis approach to synthesize existing knowledge and proposes a structured framework suitable for homecare applications. The proposed framework integrates key components such as IoT sensors, data-driven models, cloud computing, and user interface design, highlighting the importance of personalized and predictive homecare solutions. This framework can significantly improve the efficiency, accuracy, and reliability of homecare services. It paves the way for the implementation of digital twins in home care, promoting real-time monitoring, early intervention, and better outcomes.

Keywords: digital twin, homecare, older adults, healthcare, IoT, artificial intelligence

Procedia PDF Downloads 72
6378 Dynamic Modeling of the Exchange Rate in Tunisia: Theoretical and Empirical Study

Authors: Chokri Slim

Abstract:

The relative failure of simultaneous-equation models in the seventies led researchers to turn to other approaches that take into account the dynamics of economic and financial systems. In this paper, we use an approach based on the vector autoregressive (VAR) model, which has been widely used in recent years. Its popularity is due to its flexible nature and the ease with which it produces models with useful descriptive characteristics; it is also easy to use for testing economic hypotheses. Standard econometric techniques assume that the series studied are stable over time (the stationarity hypothesis). Most economic series do not satisfy this hypothesis, which means that specific techniques must be implemented when one wishes to study the relationships that bind them. Cointegration, which characterizes non-stationary (integrated) series of which some linear combination is stationary, will also be presented in this paper. Since the work of Johansen, this approach has generally been presented as part of a multivariate analysis that specifies stable long-term relationships while analyzing the short-term dynamics of the variables considered. In the empirical part, we apply these concepts to study the dynamics of the exchange rate in Tunisia, one of the most important economic policy variables of a country open to the outside. According to the results of the empirical study using the cointegration method, there is a cointegrating relationship between the exchange rate and its determinants, which shows that these variables have a significant influence in determining the exchange rate in Tunisia.
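
A minimal sketch of the Johansen-plus-VECM workflow with statsmodels follows; the synthetic series, the lag order, and the deterministic-term choice are illustrative assumptions standing in for the Tunisian exchange-rate data and its determinants.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import coint_johansen, VECM

rng = np.random.default_rng(3)
n = 300

# Synthetic stand-in: an exchange rate sharing a common stochastic trend
# with two hypothetical determinants (e.g., relative price levels).
trend = np.cumsum(rng.standard_normal(n))
data = pd.DataFrame({
    "exchange_rate": trend + 0.3 * rng.standard_normal(n),
    "determinant_1": 0.8 * trend + 0.3 * rng.standard_normal(n),
    "determinant_2": -0.5 * trend + 0.3 * rng.standard_normal(n),
})

# Johansen trace test: how many cointegrating relationships link the series?
jres = coint_johansen(data, det_order=0, k_ar_diff=1)
rank = int(np.sum(jres.trace_stat > jres.trace_stat_crit_vals[:, 1]))  # 5% level
print(f"estimated cointegration rank: {rank}")

# Fit a VECM with that rank: the long-run relation plus short-term dynamics.
res = VECM(data, k_ar_diff=1, coint_rank=max(rank, 1), deterministic="co").fit()
print(res.beta)   # cointegrating vector(s)
```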

Keywords: stationarity, cointegration, dynamic models, causality, VECM models

Procedia PDF Downloads 365