Search results for: information extraction evaluation method
31090 Influence of the Cooking Technique on the Iodine Content of Frozen Hake
Authors: F. Deng, R. Sanchez, A. Beltran, S. Maestre
Abstract:
The high nutritional value associated with seafood is related to the presence of essential trace elements. Moreover, seafood is considered an important source of energy, proteins, and long-chain polyunsaturated fatty acids. Generally, seafood is consumed cooked; consequently, its nutritional value can be degraded. Seafood, such as fish, shellfish, and seaweed, can be considered one of the main iodine sources. Deficient or excessive consumption of iodine can cause dysfunction and pathologies related to the thyroid gland. The main objective of this work is to evaluate iodine stability in hake (Merluccius) subjected to different culinary techniques. The culinary processes considered were: boiling, steaming, microwave cooking, baking, cooking en papillote (a twisted cover with the shape of a sweet wrapper), and coating with a batter of flour and deep-frying. The determination of iodine was carried out by Inductively Coupled Plasma Mass Spectrometry (ICP-MS). Regarding sample handling strategies, liquid-liquid extraction has been demonstrated to be a powerful pre-concentration and clean-up approach for trace metal analysis by ICP techniques. Extraction with tetramethylammonium hydroxide (the TMAH reagent) was used as the sample preparation method in this work. Based on the results, it can be concluded that iodine stability was degraded by the cooking processes. The greatest degradation was observed for the boiling and microwave cooking processes, where the iodine content in hake decreased by up to 60% and 52%, respectively. However, if the cooking liquid from boiling is preserved, the loss generated during cooking is reduced. The iodine content was preserved only when the fish was cooked en papillote.
Keywords: cooking process, ICP-MS, iodine, hake
Procedia PDF Downloads 141
31089 Valorization of Seafood and Poultry By-Products as Gelatin Source and Quality Assessment
Authors: Elif Tugce Aksun Tumerkan, Umran Cansu, Gokhan Boran, Fatih Ozogul
Abstract:
Gelatin is a mixture of peptides obtained from collagen by partial thermal hydrolysis. It is an important and useful biopolymer used in food, pharmacy, and photographic products. Generally, gelatins are sourced from pig skin and bones and from beef bone and hide, but within the last decade, alternative gelatin sources have attracted interest. In this study, the functional properties of gelatin extracted from seafood and poultry by-products were evaluated. For this purpose, the skins of skipjack tuna (Katsuwonus pelamis) and frog (Rana esculenta) were used as seafood by-products, and chicken skin as a poultry by-product, as raw materials for gelatin extraction. Following extraction, all gelatin samples were lyophilized and stored in plastic bags at room temperature. To compare the gelatins obtained, the chemical composition and common quality parameters, including bloom value, gel strength, and viscosity, as well as melting and gelling temperatures, hydroxyproline content, and colorimetric parameters, were determined. The results showed that the highest protein content was obtained in frog gelatin, at 90.1%, and the highest hydroxyproline content in chicken gelatin, at 7.6%. Frog gelatin showed a significantly higher (P < 0.05) melting point (42.7°C) than fish (29.7°C) and chicken (29.7°C) gelatins. The bloom value of gelatin from frog skin was also found to be higher (363 g) than those of chicken and fish gelatins (352 and 336 g, respectively) (P < 0.05). While fish gelatin had a higher lightness (L*) value (92.64) than chicken and frog gelatins, the redness/greenness (a*) value was significantly higher in frog skin gelatin. Based on these results, it can be concluded that the skins of different animals with high commercial value may be utilized as alternative sources to produce gelatin with high yield and desirable functional properties. The functional and quality analysis of gelatin from frog, chicken, and tuna skin showed that poultry and seafood by-products can be used as alternatives to mammalian gelatin. The functional properties, including bloom strength, melting point, and viscosity, of gelatin from frog skin were more admirable than those of the chicken and tuna skin gelatins. Among the gelatin groups, significant differences in characteristics such as gel strength and physicochemical properties were observed, based not only on the raw material but also on the extraction method.
Keywords: chicken skin, fish skin, food industry, frog skin, gel strength
Procedia PDF Downloads 163
31088 Enhanced Multi-Scale Feature Extraction Using a DCNN by Proposing Dynamic Soft Margin SoftMax for Face Emotion Detection
Authors: Armin Nabaei, M. Omair Ahmad, M. N. S. Swamy
Abstract:
Many facial expression and emotion recognition methods based on traditional approaches such as LDA, PCA, and EBGM have been proposed. In recent years, deep learning models have provided a unique platform for the detection of facial expressions and emotions by automatically extracting features. However, deep networks require large training datasets to extract features effectively. In this work, we propose an efficient emotion detection algorithm using face images when only small datasets are available for training. We design a deep network whose feature extraction capability is enhanced by several parallel modules between the input and output of the network, each focusing on extracting different types of coarse features with fine-grained details to break the symmetry of the produced information. In effect, we leverage long-range dependencies, the lack of which is one of the main drawbacks of CNNs. We develop this work further by introducing a Dynamic Soft-Margin SoftMax. The conventional SoftMax suffers from reaching gold labels too soon, which drives the model toward over-fitting, because it is not able to determine adequately discriminant feature vectors for some variant class labels. We reduce the risk of over-fitting by using a dynamic rather than static shape of the input tensor in the SoftMax layer, together with a specified soft margin. The margin acts as a controller of how hard the model must work to push dissimilar embedding vectors apart. The proposed categorical loss has the objective of compacting same-class labels and separating different-class labels in the normalized log domain: we penalize those predictions with high divergence from the ground-truth labels, shortening correct feature vectors and enlarging false prediction tensors, that is, assigning more weight to classes lying close to one another (the 'hard labels to learn'). By doing so, we constrain the model to generate more discriminative feature vectors for variant class labels. Finally, for the proposed optimizer, our focus is on remedying the weak convergence of the Adam optimizer on non-convex problems. Our optimizer works by an alternative gradient-updating procedure with an exponentially weighted moving average function for faster convergence, and exploits a weight decay method to reduce the learning rate drastically near optima and reach the dominant local minimum. We demonstrate the superiority of the proposed work by surpassing the first rank on three widely used facial expression recognition datasets: 93.30% on FER-2013, a 16% improvement over the previous first rank after 10 years; 90.73% on RAF-DB; and 100% k-fold average accuracy on CK+, a top performance compared to networks that require much larger training datasets.
Keywords: computer vision, facial expression recognition, machine learning, algorithms, deep learning, neural networks
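The exact dynamic-margin formulation is not spelled out in the abstract; the sketch below, a simplification using a fixed additive margin, illustrates the core idea of penalizing the true-class logit so that embeddings must be separated by more than the margin before the loss vanishes:

```python
import numpy as np

def soft_margin_softmax_loss(logits, labels, margin=0.35):
    """Cross-entropy with an additive soft margin on the true-class logit.

    Subtracting a margin from the correct class forces the network to
    separate embeddings by more than the margin before the loss vanishes.
    A fixed margin is assumed here; the paper makes it dynamic.
    """
    z = logits.copy()
    z[np.arange(len(labels)), labels] -= margin   # penalize the true class
    z -= z.max(axis=1, keepdims=True)             # numerical stability
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

# toy batch: 3 samples, 4 emotion classes
logits = np.array([[2.0, 0.1, 0.3, 0.2],
                   [0.2, 1.5, 0.1, 0.4],
                   [0.3, 0.2, 0.1, 1.8]])
labels = np.array([0, 1, 3])
print(soft_margin_softmax_loss(logits, labels))
```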
Procedia PDF Downloads 74
31087 Remote Sensing and GIS-Based Environmental Monitoring by Extracting Land Surface Temperature of Abbottabad, Pakistan
Authors: Malik Abid Hussain Khokhar, Muhammad Adnan Tahir, Hisham Bin Hafeez Awan
Abstract:
Continuous environmental change and climatic change across the entire globe due to increasing land surface temperature (LST) have become a vital phenomenon nowadays. LST is accelerating because of increasing greenhouse gases in the environment, which results in the melting of ice caps, ice sheets, and glaciers. This not only has adverse effects on the vegetation and water bodies of a region but also has severe impacts on monsoon areas in the form of capricious rainfall, monsoon failure, and extensive precipitation. The environment can be monitored with the help of various geographic information system (GIS) based algorithms, i.e., SC (single channel), DA (dual angle), Mao, Sobrino, and SW (split window). Estimation of LST is very much possible from digital image processing of satellite imagery. This paper encompasses the extraction of the LST of Abbottabad over the last ten years using the SW technique of GIS and remote sensing, by means of Landsat 7 ETM+ (Enhanced Thematic Mapper Plus) and Landsat 8 via their Thermal Infrared (TIRS) and Operational Land Imager (OLI; absent on Landsat 7 ETM+) sensors, having 100 m TIR and 30 m optical spatial resolutions. These sensors have two TIR bands each; their emissivity and spectral radiance will be used as input statistics in the SW algorithm for LST extraction. Emissivity will be derived from Normalized Difference Vegetation Index (NDVI) threshold methods using bands 2-5 of OLI with the help of eCognition software, and spectral radiance will be extracted from the TIR bands (bands 10-11 of Landsat 8 and band 6 of Landsat 7 ETM+). The accuracy of the results will be evaluated against weather data as well. The ensuing research will have a significant role for all tiers of governing bodies related to climate change departments.
Keywords: environment, Landsat 8, SW Algorithm, TIR
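As a rough illustration of an NDVI-threshold emissivity step followed by a split-window computation, a minimal sketch; the coefficients are published split-window values for TIRS quoted here as plausible placeholders, not the coefficients calibrated in this paper:

```python
import numpy as np

def ndvi(red, nir):
    """Normalized Difference Vegetation Index from the optical bands."""
    return (nir - red) / (nir + red + 1e-12)

def emissivity_from_ndvi(v, e_soil=0.97, e_veg=0.99, v_min=0.2, v_max=0.5):
    """NDVI-threshold emissivity: bare soil below v_min, full vegetation
    above v_max, a proportion-of-vegetation mix in between."""
    pv = np.clip((v - v_min) / (v_max - v_min), 0.0, 1.0) ** 2
    return e_soil * (1 - pv) + e_veg * pv

def split_window_lst(t10, t11, eps10, eps11, w):
    """Generic split-window form over the two TIR brightness temperatures
    (K), mean emissivity, emissivity difference, and water vapor w
    (g/cm^2). Coefficients follow a published TIRS variant and are
    placeholders, not this paper's values."""
    c = [-0.268, 1.378, 0.183, 54.30, -2.238, -129.20, 16.40]
    eps = 0.5 * (eps10 + eps11)
    deps = eps10 - eps11
    dt = t10 - t11
    return (t10 + c[1]*dt + c[2]*dt**2 + c[0]
            + (c[3] + c[4]*w)*(1 - eps) + (c[5] + c[6]*w)*deps)

# toy pixel: brightness temps 295.2 K / 293.8 K, moderate vegetation
e = emissivity_from_ndvi(np.array(0.42))
print(split_window_lst(295.2, 293.8, e, e - 0.003, w=1.6))
```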
Procedia PDF Downloads 355
31086 Recognition of Tifinagh Characters with Missing Parts Using Neural Network
Authors: El Mahdi Barrah, Said Safi, Abdessamad Malaoui
Abstract:
In this paper, we present an algorithm for the reconstruction of Tifinagh characters from incomplete 2D scans. The algorithm is based on the correlation between the lost block and its neighbors. The proposed system contains three main parts: pre-processing, feature extraction, and recognition. In the first step, we construct a database of Tifinagh characters. In the second step, we apply a shape analysis algorithm. In the classification part, we use a neural network. The simulation results demonstrate that the proposed method gives good results.
Keywords: Tifinagh character recognition, neural networks, local cost computation, ANN
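The abstract does not detail the correlation step; one plausible reading, sketched below under stated assumptions (equal-size grayscale images, a known hole away from the border), fills the lost block with the candidate from intact reference characters whose surrounding context correlates best:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally sized vectors."""
    a = a - a.mean(); b = b - b.mean()
    denom = np.sqrt((a**2).sum() * (b**2).sum()) + 1e-12
    return (a * b).sum() / denom

def fill_missing_block(img, top, left, h, w, references):
    """Fill img[top:top+h, left:left+w] with the block from the intact
    reference image whose 2-pixel context ring around the hole correlates
    best with the damaged scan (a simplification of the paper's method;
    assumes the hole does not touch the image border)."""
    pad = 2
    ctx = img[top-pad:top+h+pad, left-pad:left+w+pad]
    mask = np.ones_like(ctx, bool)
    mask[pad:pad+h, pad:pad+w] = False          # compare context only
    best, best_score = None, -np.inf
    for ref in references:
        cand_ctx = ref[top-pad:top+h+pad, left-pad:left+w+pad]
        score = ncc(ctx[mask], cand_ctx[mask])
        if score > best_score:
            best_score, best = score, ref[top:top+h, left:left+w]
    out = img.copy()
    out[top:top+h, left:left+w] = best
    return out
```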
Procedia PDF Downloads 334
31085 The Rayleigh Quotient for Structural Element Vibration Analysis with Finite Element Method
Authors: Falek Kamel
Abstract:
Various approaches are usually used in the dynamic analysis of transversally vibrating beams. For this, numerical methods allowing the solution of the general eigenvalue problem are utilized. The equilibrium equations describe the movement resulting from the solution of a fourth-order differential equation. Our investigation is based on the finite element method. The findings of these investigations are the vibration frequencies obtained by the Jacobi method. Two types of elementary mass matrix are considered: one representing a uniform distribution of the mass along the element, and one with concentrated masses located at fixed points, separated by equal distances, whose number is increased progressively at each evaluation stage. The studied beams have different boundary constraints representing several classical situations. Comparisons are made for beams where the distributed mass is replaced by n concentrated masses. As expected, the first calculation stage is to obtain the lowest number of beam parts that gives a frequency comparable to that issued from the Rayleigh formula. The obtained values are then compared to theoretical results based on the assumptions of the Bernoulli-Euler theory. The same steps are followed for the second type of mass representation.
Keywords: structural elements, beam vibration, dynamic analysis, finite element method, Jacobi method
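To make the generalized eigenvalue problem concrete, a minimal sketch using the standard consistent-mass Euler-Bernoulli beam element; a cantilever is assumed for the boundary conditions, and the paper's Jacobi solver is replaced by a library eigensolver:

```python
import numpy as np
from scipy.linalg import eigh

def beam_frequencies(E, I, rho, A, L, n_el=10):
    """Lowest natural frequencies (Hz) of a cantilever Euler-Bernoulli beam,
    assembled from standard 2-node elements with consistent mass matrices
    and solved as the generalized eigenproblem K v = lambda M v."""
    le = L / n_el
    k = (E * I / le**3) * np.array(
        [[ 12,    6*le,   -12,    6*le],
         [ 6*le,  4*le**2, -6*le, 2*le**2],
         [-12,   -6*le,    12,   -6*le],
         [ 6*le,  2*le**2, -6*le, 4*le**2]])
    m = (rho * A * le / 420) * np.array(
        [[ 156,   22*le,    54,  -13*le],
         [ 22*le,  4*le**2, 13*le, -3*le**2],
         [ 54,    13*le,   156,  -22*le],
         [-13*le, -3*le**2, -22*le, 4*le**2]])
    ndof = 2 * (n_el + 1)
    K, M = np.zeros((ndof, ndof)), np.zeros((ndof, ndof))
    for e in range(n_el):
        s = slice(2*e, 2*e + 4)
        K[s, s] += k
        M[s, s] += m
    K, M = K[2:, 2:], M[2:, 2:]          # clamp the first node (cantilever)
    lam = eigh(K, M, eigvals_only=True)  # ascending eigenvalues
    return np.sqrt(lam[:3]) / (2 * np.pi)

# steel beam, 1 m long, 10 mm x 10 mm square section (illustrative values)
print(beam_frequencies(E=210e9, I=8.33e-10, rho=7850, A=1e-4, L=1.0))
```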
Procedia PDF Downloads 163
31084 A Hybrid Block Multistep Method for Direct Numerical Integration of Fourth Order Initial Value Problems
Authors: Adamu S. Salawu, Ibrahim O. Isah
Abstract:
A direct solution to several forms of fourth-order ordinary differential equations is not easily obtained without first reducing them to a system of first-order equations. Thus, numerical methods are being developed with the underlying techniques in the literature, which seek to approximate some classes of fourth-order initial value problems within admissible error bounds. Multistep methods present the great advantage of ease of implementation, but with the setback of several function evaluations at every stage of implementation. However, hybrid methods conventionally show a slightly higher order of truncation than any k-step linear multistep method, with the possibility of obtaining solutions at off-mesh points within the interval of solution. In the light of the foregoing, we propose the continuous form of a hybrid multistep method with a Chebyshev polynomial as basis function for the numerical integration of fourth-order initial value problems of ordinary differential equations. The basis function is interpolated and collocated at some points on the interval [0, 2] to yield a system of equations, which is solved to obtain the unknowns of the approximating polynomial. The continuous form obtained and its first and second derivatives are evaluated at carefully chosen points to obtain the proposed block method, which directly approximates fourth-order initial value problems. The method is analyzed for convergence. Implementation of the method is done by conducting numerical experiments on some test problems. The outcome of the implementation suggests that the method performs well on problems with oscillatory or trigonometric terms, since the approximations at several points on the solution domain did not deviate too far from the theoretical solutions. The method also shows better performance than an existing hybrid method when implemented on a larger interval of solution.
Keywords: Chebyshev polynomial, collocation, hybrid multistep method, initial value problems, interpolation
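Not the authors' hybrid block scheme, but a minimal single-interval illustration of Chebyshev-basis collocation for y'''' = f(x) on [0, 2], with the four initial conditions supplying the interpolation constraints:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def collocate_fourth_order(f, y0, interval=(0.0, 2.0), degree=8):
    """Approximate y'''' = f(x) with y(0), y'(0), y''(0), y'''(0) given in
    y0, using a Chebyshev basis: 4 interpolation conditions at the left
    endpoint plus collocation of the 4th derivative at interior nodes.
    A one-shot illustration only, not the authors' block method."""
    a, b = interval
    n = degree + 1
    xs = np.linspace(a, b, n - 4 + 2)[1:-1]   # n-4 interior collocation nodes
    A = np.zeros((n, n)); rhs = np.zeros(n)
    for j in range(n):
        cj = np.zeros(n); cj[j] = 1.0
        for k in range(4):                    # initial conditions at x = a
            A[k, j] = C.chebval(a, C.chebder(cj, k))
        A[4:, j] = C.chebval(xs, C.chebder(cj, 4))
    rhs[:4] = y0
    rhs[4:] = f(xs)
    return C.Chebyshev(np.linalg.solve(A, rhs))

# test: y'''' = 24 with zero initial data has exact solution y = x**4
y = collocate_fourth_order(lambda x: 24.0 + 0 * x, [0, 0, 0, 0])
print(y(1.5), 1.5**4)
```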
Procedia PDF Downloads 122
31083 Cephalometric Changes of Patient with Class II Division 1 Malocclusion Post Orthodontic Treatment with Growth Stimulation: A Case Report
Authors: Pricillia Priska Sianita
Abstract:
An aesthetic facial profile is one of the goals of orthodontic treatment. However, this is not easily achieved, especially in patients with Class II Division 1 malocclusion, who have the clinical characteristics of a convex profile and a significant skeletal discrepancy due to mandibular growth deficiency. Malocclusions with skeletal problems require proper treatment timing for growth stimulation; the treatment must be done at an early age and needs good cooperation from the patient. If this is not done and the patient has passed the growth period, the ideal treatment is orthognathic surgery, which is more complicated and more painful. Growth stimulation of a skeletal malocclusion requires careful cephalometric evaluation, ranging from diagnosis, to determine the parts that require stimulation, to post-treatment evaluation, to see the success achieved through changes in the measurements of the skeletal parameters shown in the cephalometric analysis. This case report aims to describe, cephalometrically, the skeletal changes achieved through orthodontic treatment in the growth period. Material and method: pre-treatment and post-treatment lateral cephalograms of a case of Class II Division 1 malocclusion were selected from a collection of cephalometric radiographs in a private clinic. The cephalograms were then traced and measured for the skeletal parameters, and the results were noted as skeletal data for the pre-treatment and post-treatment conditions. Furthermore, superimposition was done to see the changes achieved. The results show that growth stimulation through orthodontic treatment can solve the skeletal problem of Class II Division 1 malocclusion, and the skeletal changes that occur can be verified through cephalometric analysis. The skeletal changes have an impact on the improvement of the patient's facial profile. To sum up, treatment timing in a skeletal malocclusion is very important for obtaining satisfactory results in improving the aesthetic facial profile, and the skeletal changes can be verified through cephalometric evaluation pre- and post-treatment.
Keywords: cephalometric evaluation, class II division 1 malocclusion, growth stimulation, skeletal changes, skeletal problems
Procedia PDF Downloads 249
31082 Analyzing the Efficiency of Several Gum Extraction Tapping Systems for Wood Apple Trees
Authors: K. M. K. D Weerasekara, R. M. K. M Rathnayake, R. U. Halwatura, G. Y. Jayasinghe
Abstract:
Wood apple (Limonia acidissima L.) trees are native to Sri Lanka and India. Wood apple gum is widely used in the food, coating, and pharmaceutical industries, and it was a major component in ancient Sri Lankan coating technology as well. It is also used as a suspending agent in liquid syrups and in food ingredients such as sauces, emulsifiers, and stabilizers. Industrial applications include adhesives for labeling and packaging, as well as paint binders. It is also used in the production of paper and cosmetics. Extraction of wood apple gum is an important step in ensuring maximum benefits for its various uses. It is apparent that an abundance of untapped potential lies in wood apple gum if it can be mass-produced. Hence, the current study uses a two-factor factorial design, with two major variables and four replications, to investigate the best gum-extracting tapping system for wood apple gum. The findings of this study will be useful to wood apple cultivators, researchers, and gum-based industries alike.
Keywords: wood apple gum, Limonia acidissima L., tapping, tapping cuts
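A two-factor factorial design with replications is typically analyzed with a two-way ANOVA; a minimal sketch follows, with hypothetical factor names and yield figures (the abstract does not name the two variables):

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# hypothetical layout: two factors (tapping cut type, tree girth class),
# four replications per cell, gum yield in grams
df = pd.DataFrame({
    "cut":   ["V", "V", "V", "V", "spiral", "spiral", "spiral", "spiral"] * 2,
    "girth": ["small"] * 8 + ["large"] * 8,
    "yield_g": [12.1, 11.4, 13.0, 12.6, 15.2, 14.8, 16.1, 15.5,
                13.9, 14.2, 13.5, 14.8, 18.3, 17.6, 19.0, 18.1],
})

# two-way ANOVA with interaction, matching a 2x2 factorial with 4 replicates
model = ols("yield_g ~ C(cut) * C(girth)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```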
Procedia PDF Downloads 74
31081 Roof and Road Network Detection through Object Oriented SVM Approach Using Low Density LiDAR and Optical Imagery in Misamis Oriental, Philippines
Authors: Jigg L. Pelayo, Ricardo G. Villar, Einstine M. Opiso
Abstract:
The advances of aerial laser scanning in the Philippines have opened up entire fields of research in remote sensing and machine vision that aspire to provide accurate and timely information for the government and the public. Rapid mapping of polygonal roads and roof boundaries is one such utilization, offering applications to disaster risk reduction, mitigation, and development. The study uses low-density LiDAR data and high-resolution aerial imagery in an object-oriented approach, considering the theoretical concepts of data analysis and subjecting the data to a machine learning algorithm to minimize the constraints of feature extraction. Since separating one class from another in distinct regions of a multi-dimensional feature space requires non-trivial computation, distribution-fitting procedures were implemented to formulate the learned ideal hyperplane. Customized hybrid features were generated and then used to improve the classifier's findings. Supplemental algorithms for filtering and reshaping object features were developed in the rule set to enhance the final product. The methodology offers several noticeable advantages in terms of simplicity, applicability, and process transferability. The algorithm was tested in different random locations of Misamis Oriental province in the Philippines, demonstrating robust performance with an overall accuracy greater than 89% and potential for semi-automation. The extracted results will become a vital requirement for decision-makers, urban planners, and even the commercial sector in various assessment processes.
Keywords: feature extraction, machine learning, OBIA, remote sensing
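A minimal sketch of the object-based SVM classification stage; the per-object feature list and the random placeholder data are assumptions standing in for features actually derived from the LiDAR and imagery segments:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# hypothetical per-object features from LiDAR + imagery segments:
# [mean height, height stddev, NDVI, rectangularity, mean intensity]
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))                 # placeholder feature matrix
y = rng.integers(0, 3, size=300)              # 0 = roof, 1 = road, 2 = other

# RBF-kernel SVM with feature scaling, scored by 5-fold cross-validation
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
print(cross_val_score(clf, X, y, cv=5).mean())
```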
Procedia PDF Downloads 362
31080 A Validated UPLC-MS/MS Assay Using Negative Ionization Mode for High-Throughput Determination of Pomalidomide in Rat Plasma
Authors: Muzaffar Iqbal, Essam Ezzeldin, Khalid A. Al-Rashood
Abstract:
Pomalidomide is a second-generation oral immunomodulatory agent used for the treatment of multiple myeloma in patients with disease refractory to lenalidomide and bortezomib. In this study, a sensitive UPLC-MS/MS assay was developed and validated for high-throughput determination of pomalidomide in rat plasma, using celecoxib as an internal standard (IS). Liquid-liquid extraction with dichloromethane as the extracting agent was employed to extract pomalidomide and the IS from 200 µL of plasma. Chromatographic separation was carried out on an Acquity BEH™ C18 column (50 × 2.1 mm, 1.7 µm) using an isocratic mobile phase of acetonitrile:10 mM ammonium acetate (80:20, v/v) at a flow rate of 0.250 mL/min. Pomalidomide and the IS eluted at 0.66 ± 0.03 and 0.80 ± 0.03 min, respectively, with a total run time of only 1.5 min. Detection was performed on a triple quadrupole tandem mass spectrometer using electrospray ionization in negative mode. The precursor-to-product ion transitions of m/z 272.01 → 160.89 for pomalidomide and m/z 380.08 → 316.01 for the IS were used to quantify them, respectively, in multiple reaction monitoring mode. The developed method was validated according to regulatory guidelines for bioanalytical method validation. Linearity in plasma samples was achieved over the concentration range of 0.47–400 ng/mL (r² ≥ 0.997). The intra- and inter-day precision values were ≤ 11.1% (RSD, %), whereas accuracy values ranged from -6.8 to 8.5% (RE, %). In addition, the other validation results were within the acceptance criteria, and the method was successfully applied in a pharmacokinetic study of pomalidomide in rats.
Keywords: pomalidomide, pharmacokinetics, LC-MS/MS, celecoxib
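The linearity, precision (RSD), and accuracy (RE) figures quoted above follow standard definitions; a minimal sketch of how such validation metrics are computed, with hypothetical calibration and QC numbers:

```python
import numpy as np

def validation_metrics(nominal, measured):
    """Precision (%RSD) and accuracy (%RE) for replicate QC measurements
    at one nominal concentration, as used in bioanalytical validation."""
    measured = np.asarray(measured, float)
    rsd = 100 * measured.std(ddof=1) / measured.mean()
    re = 100 * (measured.mean() - nominal) / nominal
    return rsd, re

# calibration linearity: r^2 of a first-order fit over the assay range
conc = np.array([0.47, 1, 5, 25, 100, 400])              # ng/mL
resp = np.array([0.012, 0.026, 0.13, 0.64, 2.55, 10.2])  # peak area ratio
slope, intercept = np.polyfit(conc, resp, 1)
pred = slope * conc + intercept
r2 = 1 - ((resp - pred)**2).sum() / ((resp - resp.mean())**2).sum()
print(r2, validation_metrics(25.0, [24.1, 26.3, 25.2, 23.8, 25.9]))
```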
Procedia PDF Downloads 391
31079 Taking Learning beyond Kirkpatrick’s Levels: Applying Return on Investment Measurement in Training
Authors: Charles L. Sigmund, M. A. Aed, Lissa Graciela Rivera Picado
Abstract:
One critical component of the training development process is the evaluation of the impact and value of the program. Oftentimes, however, learning organizations bypass this phase, either because they are unfamiliar with effective methods for measuring the success or effect of the training or because they believe the effort to be too time-consuming or cumbersome. As a result, most organizations that do conduct evaluations limit their scope to Kirkpatrick L1 (reaction) and L2 (learning), or at most carry through to L4 (results). In 2021, Microsoft made a strategic decision to assess the measurable and monetized impact of all training launches and designed a scalable, program-agnostic tool for providing full-scale L5 return on investment (ROI) estimates for each. In producing this measurement tool, the learning and development organization built a framework for making business prioritizations and resource allocations based on the projected ROI of a course. The analysis and measurement posed by this process use a combination of training data and operational metrics to calculate the effective net benefit derived from a given training effort. Business experts in the learning field generally consider a 10% ROI to be an outstanding demonstration of a project's value. Initial findings from this work, applied to a critical customer-facing program, yielded an estimated ROI of more than 49%. This information directed the organization to make a more concerted and concentrated effort in this specific line of business and resulted in additional investment in the training methods and technologies being used.
Keywords: evaluation, measurement, return on investment, value
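The L5 calculation reduces to the standard ROI formula of net benefit over cost; a tiny sketch with hypothetical figures chosen to reproduce the roughly 49% result cited above:

```python
def training_roi(monetized_benefit, program_cost):
    """Level-5 ROI as commonly defined in training evaluation:
    net benefit over cost, expressed as a percentage."""
    return 100 * (monetized_benefit - program_cost) / program_cost

# hypothetical figures: a program costing $200k that yields $298k in
# monetized operational benefit gives the ~49% ROI cited in the abstract
print(training_roi(298_000, 200_000))  # 49.0
```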
Procedia PDF Downloads 185
31078 Trace Network: A Probabilistic Relevant Pattern Recognition Approach to Attribution Trace Analysis
Authors: Jian Xu, Xiaochun Yun, Yongzheng Zhang, Yafei Sang, Zhenyu Cheng
Abstract:
Network attack prevention is a critical research area of information security. Network attacks would be suppressed if attribution techniques were capable of tracing back to the attackers after a hacking event. Attributing these attacks to a particular identification therefore becomes one of the important tasks when analysts attempt to differentiate and profile the attacker behind a piece of attack trace. To assist analysts in exposing the attackers behind the scenes, this paper investigates the connections between attribution traces and proposes probabilistic-relevance-based attribution patterns. This method facilitates the evaluation of the plausibility of relevance between different traceable identifications. Furthermore, by analyzing the connections among traces, it can confirm the existence probability of a certain organization as well as discover its affinitive partners by means of drawing a relevance matrix from attribution traces.
Keywords: attribution trace, probabilistic relevance, network attack, attacker identification
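The paper's exact relevance measure is not given in the abstract; the sketch below shows one simple probabilistic relevance matrix, estimated from co-occurrence of traceable identifications across attack events (the incidence data is hypothetical):

```python
import numpy as np

# hypothetical binary incidence matrix: rows are attack events, columns are
# traceable identifications (IPs, tool fingerprints, handles) seen in them
events = np.array([[1, 1, 0, 0],
                   [1, 1, 1, 0],
                   [0, 1, 1, 1],
                   [0, 0, 1, 1]])

# one simple probabilistic relevance: P(identification j | identification i),
# estimated from co-occurrence counts across events
co = events.T @ events                        # co-occurrence counts
counts = np.diag(co).astype(float)
relevance = co / counts[:, None]              # row i holds P(. | i)
print(np.round(relevance, 2))
```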
Procedia PDF Downloads 366
31077 Cross Matching: An Improved Method to Obtain Comprehensive and Consolidated Evidence
Authors: Tuula Heinonen, Wilhelm Gaus
Abstract:
At present, safety assessment starts with animal tests, although their predictivity is often poor. Even after extended human use, experimental data are often judged to be the core information for risk assessment. However, the best opportunity to generate true evidence is to match all available information. The cross-matching methodology combines different fields of knowledge and types of data (e.g., in-vitro and in-vivo experiments, clinical observations, clinical and epidemiological studies, and daily-life observations) and gives adequate weight to individual findings. To achieve a consolidated outcome, the information from all available sources is analysed and compared. If single pieces of information fit together, a clear picture becomes visible; if pieces of information are inconsistent or contradictory, careful consideration is necessary. 'Cross' can be understood as 'orthogonal' in geometry or as 'independent' in mathematics. Results coming from different sources are independent and therefore contribute new information. Independent information makes a larger contribution to evidence than results coming repeatedly from the same source. A successful example of cross matching is the assessment of Ginkgo biloba, where we were able to reach the conclusive result: Ginkgo biloba leaf extract is well tolerated and safe for humans.
Keywords: cross-matching, human use, safety assessment, Ginkgo biloba leaf extract
Procedia PDF Downloads 287
31076 Evaluation of External Costs of Traffic Accident in Slovak Republic
Authors: Anna Dolinayova, Jozef Danis, Juraj Camaj
Abstract:
The report deals with a comparison of traffic accidents in the Slovak Republic in road and rail transport from 2009 to 2014, with an evaluation of external costs and, consequently, the possibilities for their internalization. The analysis of road traffic accidents is carried out according to the after-effects they caused, the main cause, the place of origin (within or outside of town), and the age of the people killed or seriously or slightly injured in the accidents. The evaluation of the individual after-effects is carried out in terms of the probability of traffic accident occurrence.
Keywords: external costs, traffic accident, rail transport, road transport
Procedia PDF Downloads 594
31075 The Effect of Different Concentrations of Extracting Solvent on the Polyphenolic Content and Antioxidant Activity of Gynura procumbens Leaves
Authors: Kam Wen Hang, Tan Kee Teng, Huang Poh Ching, Chia Kai Xiang, H. V. Annegowda, H. S. Naveen Kumar
Abstract:
Gynura procumbens (G. procumbens) leaves, commonly known as 'sambung nyawa' in Malaysia, come from a well-known medicinal plant commonly used in folk medicine for controlling blood glucose and cholesterol levels as well as for treating cancer. These medicinal properties are believed to be related to the polyphenolic content of G. procumbens extract; therefore, optimization of its extraction process is vital to obtain the highest possible antioxidant activities. The current study was conducted to investigate the effect of different concentrations of extracting solvent (ethanol) on the polyphenolic content and antioxidant activities of G. procumbens leaf extract. The concentrations of ethanol used were 30-70%, with the temperature and time kept constant at 50°C and 30 minutes, respectively, using ultrasound-assisted extraction. The polyphenolic content of the extracts was quantified by the Folin-Ciocalteu colorimetric method, and results were expressed as milligrams of gallic acid equivalent (mg GAE)/g. The phosphomolybdenum method and the 1,1-diphenyl-2-picrylhydrazyl (DPPH) radical scavenging assay were used to investigate the antioxidant properties of the extract, with results expressed as milligrams of ascorbic acid equivalent (mg AAE)/g and as the effective concentration (EC50), respectively. Among the three different (30%, 50%, and 70%) concentrations of ethanol studied, the 50% ethanolic extract showed a total phenolic content of 31.565 ± 0.344 mg GAE/g and a total antioxidant activity of 78.839 ± 0.199 mg AAE/g, while the 30% ethanolic extract showed 29.214 ± 0.645 mg GAE/g and 70.701 ± 1.394 mg AAE/g, respectively. With respect to the DPPH radical scavenging assay, the 50% ethanolic extract exhibited a slightly lower EC50 (314.3 ± 4.0 μg/ml) than the 30% ethanolic extract (340.4 ± 5.3 μg/ml). Of all the tested extracts, the 70% ethanolic extract exhibited the significantly (p < 0.05) highest total phenolic content (38.000 ± 1.009 mg GAE/g) and total antioxidant capacity (95.874 ± 2.422 mg AAE/g) and demonstrated the lowest EC50 in the DPPH assay (244.2 ± 5.9 μg/ml). Excellent correlations were drawn between total phenolic content and both total antioxidant capacity and DPPH radical scavenging activity (R² = 0.949 and R² = 0.978, respectively). It was concluded from this study that 70% ethanol should be used as the optimal-polarity solvent to obtain G. procumbens leaf extract with maximum polyphenolic content and antioxidant properties.
Keywords: antioxidant activity, DPPH assay, Gynura procumbens, phenolic compounds
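An EC50 such as those quoted above is the concentration at which 50% radical scavenging is reached; a minimal sketch of estimating it by linear interpolation over a dose-response series (the scavenging data below is hypothetical, and full curve fitting would be used in practice):

```python
import numpy as np

def ec50(concentrations, inhibition_pct):
    """EC50 by linear interpolation between the two points that bracket 50%
    scavenging; assumes inhibition increases with concentration."""
    c = np.asarray(concentrations, float)
    y = np.asarray(inhibition_pct, float)
    i = np.searchsorted(y, 50.0)              # first point at or above 50%
    return c[i-1] + (50.0 - y[i-1]) * (c[i] - c[i-1]) / (y[i] - y[i-1])

# hypothetical DPPH scavenging data for an extract (μg/mL vs % inhibition)
print(ec50([50, 100, 200, 400, 800], [12.0, 24.5, 41.0, 63.2, 81.5]))
```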
Procedia PDF Downloads 411
31074 Biophysically Motivated Phylogenies
Authors: Catherine Felce, Lior Pachter
Abstract:
Current methods for building phylogenetic trees from gene expression data consider mean expression levels. With single-cell technologies, we can leverage more information about cell dynamics by considering the entire distribution of gene expression across cells. Using biophysical modeling, we propose a method for constructing phylogenetic trees from scRNA-seq data, building on Felsenstein's method of continuous characters. This method can highlight genes whose level of expression may be unchanged between species, but whose rates of transcription/decay may have evolved over time.
Keywords: phylogenetics, single-cell, biophysical modeling, transcription
Procedia PDF Downloads 50
31073 Development of a Triangular Evaluation Protocol in a Multidisciplinary Design Process of an Ergometric Step
Authors: M. B. Ricardo De Oliveira, A. Borghi-Silva, E. Paravizo, F. Lizarelli, L. Di Thomazzo, D. Braatz
Abstract:
Prototypes are a critical feature in the product development process, as they help the project team visualize early concept flaws, communicate ideas, and introduce initial product testing. Involving stakeholders, such as consumers and users, in prototype tests allows the gathering of valuable feedback, contributing to a better product and making the design process more participatory. Even though recent studies have shown that user evaluation of prototypes is valuable, few articles provide a method or protocol on how designers should conduct it. This multidisciplinary study (involving the areas of physiotherapy, engineering, and computer science) aims to develop an evaluation protocol, using an ergometric step prototype as the product to be assessed. The protocol consisted of performing two tests (the 2-Minute Step Test and the Portability Test) to allow users (patients) and consumers (physiotherapists) to have an experience with the prototype. Furthermore, the protocol contained four Likert-scale questionnaires (one for users and three for consumers) that asked participants how they perceived the design characteristics of the product (performance, safety, materials, maintenance, portability, usability, and ergonomics) in their use of the prototype. Additionally, the protocol called for interviews with the product designers, in order to link their feedback to that of the consumers and users. Both the tests and the interviews were recorded for further analysis. The participation criteria for the study were gender and age for patients, gender and experience with the 2-Minute Step Test for physiotherapists, and level of involvement in the product development project for designers. The questionnaires' reliability was validated using Cronbach's alpha, and the quantitative questionnaire data were analyzed using non-parametric hypothesis tests with a significance level of 0.05 (p < 0.05) and descriptive statistics. As a result, this study provides a concise evaluation protocol which can assist designers in their development process, collecting quantitative feedback from consumers and users and qualitative feedback from designers.
Keywords: product design, product evaluation, prototypes, step
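Cronbach's alpha, used above for questionnaire reliability, is a short computation over the respondents-by-items score matrix; a minimal sketch with hypothetical Likert scores:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) matrix of Likert scores:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total)."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

# hypothetical questionnaire: 6 respondents answering 4 Likert items
scores = [[4, 5, 4, 4], [3, 3, 4, 3], [5, 5, 5, 4],
          [2, 3, 2, 3], [4, 4, 5, 4], [3, 2, 3, 3]]
print(round(cronbach_alpha(scores), 3))
```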
Procedia PDF Downloads 118
31072 Developing an Out-of-Distribution Generalization Model Selection Framework through Impurity and Randomness Measurements and a Bias Index
Authors: Todd Zhou, Mikhail Yurochkin
Abstract:
Out-of-distribution (OOD) detection is receiving increasing attention in the machine learning research community, boosted by recent technologies such as autonomous driving and image processing. This newly burgeoning field has called for more effective and efficient out-of-distribution generalization methods. Without access to label information, deploying machine learning models to out-of-distribution domains becomes extremely challenging, since it is impossible to evaluate model performance on unseen domains. To tackle this difficulty, we designed a model selection pipeline algorithm and developed a model selection framework with different impurity and randomness measurements to evaluate and choose the best-performing models for out-of-distribution data. By exploring different randomness scores based on predicted probabilities, we adopted the out-of-distribution entropy and developed a custom-designed score, 'CombinedScore,' as the evaluation criterion. This proposed score was created by adding labeled source information into the judging space of the uncertainty entropy score using the harmonic mean. Furthermore, prediction bias was explored through an equality-of-opportunity violation measurement. We also improved machine learning model performance through model calibration. The effectiveness of the framework with the proposed evaluation criteria was validated on the Folktables American Community Survey (ACS) datasets.
Keywords: model selection, domain generalization, model fairness, randomness measurements, bias index
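The abstract names the ingredients (an entropy-based randomness score, labeled source information, a harmonic-mean combination) without giving the exact formula; the sketch below is one plausible reading under those assumptions, not the paper's definition:

```python
import numpy as np

def predictive_entropy(probs):
    """Mean Shannon entropy of predicted class probabilities on a target
    domain: higher entropy means more uncertainty on the unseen data."""
    p = np.clip(probs, 1e-12, 1.0)
    return (-p * np.log(p)).sum(axis=1).mean()

def combined_score(source_accuracy, target_probs):
    """A 'CombinedScore'-style criterion: harmonic mean of labeled-source
    accuracy and a confidence term derived from normalized target entropy.
    Illustrative only; the exact formulation is not in the abstract."""
    n_classes = target_probs.shape[1]
    confidence = 1.0 - predictive_entropy(target_probs) / np.log(n_classes)
    return 2 * source_accuracy * confidence / (source_accuracy + confidence)

probs = np.array([[0.7, 0.2, 0.1], [0.5, 0.3, 0.2], [0.9, 0.05, 0.05]])
print(combined_score(0.85, probs))
```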
Procedia PDF Downloads 124
31071 Developing a Web-Based Tender Evaluation System Based on Fuzzy Multi-Attributes Group Decision Making for Nigerian Public Sector Tendering
Authors: Bello Abdullahi, Yahaya M. Ibrahim, Ahmed D. Ibrahim, Kabir Bala
Abstract:
Public sector tendering has traditionally been conducted using manual paper-based processes, which are known to be inefficient, less transparent, and more prone to manipulation and errors. The advent of the Internet and the World Wide Web has led to the development of numerous e-Tendering systems that have addressed some of the problems associated with the manual paper-based tendering system. However, most of these systems rarely support the evaluation of tenders, and where they do, the evaluation is mostly based on a single decision maker, which is not suitable in public sector tendering, where, for the sake of objectivity, transparency, and fairness, the evaluation must be conducted by a tender evaluation committee. Currently, in Nigeria, the public tendering process in general, and the evaluation of tenders in particular, are largely conducted using manual paper-based processes. Automating these manual processes into digital ones can help enhance the proficiency of public sector tendering in Nigeria. This paper is part of a larger study to develop an electronic tendering system that supports the whole tendering lifecycle based on Nigerian procurement law. Specifically, it presents the design and implementation of the part of the system that supports group evaluation of tenders based on a technique called fuzzy multi-attributes group decision making. The system was developed using object-oriented methodologies and the Unified Modelling Language, and it was hypothetically applied to the evaluation of technical and financial proposals submitted by bidders. The system was validated by professionals with extensive experience in public sector procurement. The results of the validation showed that the system, called NPS-eTender, has an average rating of 74% with respect to correct and accurate modelling of the existing manual tendering domain and an average rating of 67.6% with respect to its potential to enhance the proficiency of public sector tendering in Nigeria. Thus, based on the results of the validation, automating the evaluation process to support the tender evaluation committee is achievable and can lead to a more proficient public sector tendering system.
Keywords: e-tendering, e-procurement, group decision making, tender evaluation, tender evaluation committee, UML, object-oriented methodologies, system development
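A minimal sketch of fuzzy multi-attribute group scoring, not the exact NPS-eTender algorithm: each committee member rates each bidder on each criterion with a triangular fuzzy number, the fuzzy ratings are averaged over members, defuzzified, and weighted over criteria (all names and numbers below are hypothetical):

```python
import numpy as np

# each member rates each bidder per criterion as (low, mode, high)
ratings = {
    "bidder_A": [[(6, 7, 8), (7, 8, 9)],      # member 1: criteria 1, 2
                 [(5, 6, 7), (8, 9, 10)]],    # member 2: criteria 1, 2
    "bidder_B": [[(4, 5, 6), (6, 7, 8)],
                 [(5, 6, 7), (5, 6, 7)]],
}
weights = np.array([0.6, 0.4])                # criterion weights, sum to 1

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number."""
    return sum(tfn) / 3.0

for bidder, members in ratings.items():
    # average the fuzzy ratings over members, then weight over criteria
    crisp = np.array([[defuzzify(r) for r in member] for member in members])
    score = (crisp.mean(axis=0) * weights).sum()
    print(bidder, round(score, 2))
```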
Procedia PDF Downloads 261
31070 External Program Evaluation: Impacts and Changes on Government-Assisted Refugee Mothers
Authors: Akiko Ohta, Masahiro Minami, Yusra Qadir, Jennifer York
Abstract:
The Home Instruction for Parents of Preschool Youngsters (HIPPY) is a home instruction program for mothers of children 3 to 5 years old. Using role-play as a method of teaching, the participating mothers work with their home visitors and learn how to deliver the HIPPY curriculum to their children. Applying HIPPY, the Reviving Hope and Home for High-Risk Refugee Mothers Program (RHH) was created to provide more personalized peer support and to respond to ongoing settlement challenges for isolated and vulnerable Government-Assisted Refugee (GAR) mothers. GARs often have greater needs and vulnerabilities than other refugee groups. While support is available, they often face various challenges and barriers in starting their new lives in Canada, such as inadequate housing, low first-language literacy levels, low competency in English or French, and social isolation. The pilot project was operated by the Mothers Matter Centre (MMC) from January 2019 to March 2021 in partnership with the Immigrant Services Society of BC (ISSofBC). The formative evaluation was conducted by a research team at Simon Fraser University. In order to provide more suitable support for GAR mothers, RHH intended to offer more flexibility in HIPPY delivery, supported by a home visitor, to meet the needs of refugee mothers facing various conditions and challenges; to have a pool of financial resources to be used for the RHH families when needed during the program period; to have another designated staff member, called a community navigator, assigned to facilitate the support system for the RHH families in their settlement; to provide a portable device for each RHH mother to navigate settlement support resources; and to provide other variations of the HIPPY curriculum as options for the RHH mothers, including a curriculum targeting pre-HIPPY-age children. Reflections on each program component were collected from RHH mothers and from staff members of the MMC and ISSofBC, including frontline workers and management staff, through individual interviews and focus group discussions. Each program component was analyzed and evaluated by applying Moore's four domains framework to identify key information and generate new knowledge. To capture the RHH mothers' program experience more in depth, based on their own reflections, the photovoice method was used; some photos taken by the mothers will be shared to illustrate their RHH experience as part of their life stories. Over the period of the program, the evaluation observed how RHH mothers became more confident in various domains, such as communicating with others, taking public transportation alone, and teaching their own child(ren). One of the major factors behind this success was the home visitors' flexibility and creativity in creating a more meaningful and tailored approach for each mother, depending on her background and personal situation. The role of the community navigator was tested and improved during the program period. The community navigators took the key role of assessing the needs of the RHH families and connecting them with community resources. Both the home visitors and the community navigators were immigrant mothers themselves, and owing to their dedicated care for the RHH mothers, they were able to gain trust and work closely and efficiently with them.
Keywords: refugee mothers, settlement support, program evaluation, Canada
Procedia PDF Downloads 171
31069 Arithmetic Operations Based on Double Base Number Systems
Authors: K. Sanjayani, C. Saraswathy, S. Sreenivasan, S. Sudhahar, D. Suganya, K. S. Neelukumari, N. Vijayarangan
Abstract:
The Double Base Number System (DBNS) is an emerging system for representing a number using two bases, namely 2 and 3, which has applications in Elliptic Curve Cryptography (ECC) and the Digital Signature Algorithm (DSA). The previous binary representation method included only base 2. DBNS uses an approximation algorithm, namely a greedy algorithm. With this algorithm, the number of digits required to represent a large number is smaller than with the standard binary method based on base 2. Hence, computational speed is increased and time is reduced. The standard binary method uses the binary digits 0 and 1 to represent a number, whereas the DBNS method uses the digit 1 alone to represent any number (canonical form). The greedy algorithm can represent a number in two ways: one using only positive summands, and the other using both positive and negative summands. In this paper, these arithmetic operations are used for elliptic curve cryptography. The elliptic curve discrete logarithm problem is the foundation of most day-to-day elliptic curve cryptography, and it appears to be significantly harder than the discrete logarithm problem. In the elliptic curve digital signature algorithm, key generation requires 160 bits of data under the standard binary representation, whereas the number of bits required to generate the key can be reduced with the help of the double base number representation. In this paper, a new technique is proposed to generate the key during encryption and extract the key in decryption.
Keywords: cryptography, double base number system, elliptic curve cryptography, elliptic curve digital signature algorithm
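The positive-summands variant of the greedy expansion is straightforward: repeatedly subtract the largest number of the form 2^a * 3^b that does not exceed the remainder. A minimal sketch:

```python
def greedy_dbns(n):
    """Greedy double-base expansion: repeatedly subtract the largest
    2^a * 3^b not exceeding the remainder, so that n becomes a sum of
    2^a * 3^b terms (the positive-summands variant in the abstract).
    Returns the list of exponent pairs (a, b)."""
    terms = []
    while n > 0:
        best = (1, 0, 0)                     # (value, a, b); 2^0 * 3^0 = 1
        p2, a = 1, 0
        while p2 <= n:
            p3, b = p2, 0
            while p3 <= n:
                if p3 > best[0]:
                    best = (p3, a, b)
                p3 *= 3; b += 1
            p2 *= 2; a += 1
        terms.append((best[1], best[2]))
        n -= best[0]
    return terms

# 127 = 2^2*3^3 + 2^1*3^2 + 2^0*3^0 = 108 + 18 + 1: only three digits
print(greedy_dbns(127))
```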
Procedia PDF Downloads 396
31068 A Fuzzy-Rough Feature Selection Based on Binary Shuffled Frog Leaping Algorithm
Authors: Javad Rahimipour Anaraki, Saeed Samet, Mahdi Eftekhari, Chang Wook Ahn
Abstract:
Feature selection and attribute reduction are crucial problems and widely used techniques in the fields of machine learning, data mining, and pattern recognition, employed to overcome the well-known phenomenon of the curse of dimensionality. This paper presents a feature selection method that efficiently carries out attribute reduction, thereby selecting the most informative features of a dataset. It consists of two components: 1) a measure for feature subset evaluation, and 2) a search strategy. For the evaluation measure, we have employed the fuzzy-rough dependency degree (FRFDD) of the lower approximation-based fuzzy-rough feature selection (L-FRFS), due to its effectiveness in feature selection. As for the search strategy, a modified version of the binary shuffled frog leaping algorithm (B-SFLA) is proposed. The proposed feature selection method is obtained by hybridizing the B-SFLA with the FRFDD. Nine classifiers have been employed to compare the proposed approach with several existing methods over twenty-two datasets from the UCI repository, including nine high-dimensional and large ones. The experimental results demonstrate that the B-SFLA approach significantly outperforms other metaheuristic methods in terms of the number of selected features and the classification accuracy.
Keywords: binary shuffled frog leaping algorithm, feature selection, fuzzy-rough set, minimal reduct
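A skeleton of a binary shuffled frog leaping search over feature-subset masks; the sigmoid binarization of the best-to-worst step is one common variant and not necessarily the paper's exact update rule, and the toy fitness stands in for the fuzzy-rough dependency degree:

```python
import numpy as np

rng = np.random.default_rng(42)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def b_sfla(fitness, n_feats, n_frogs=20, n_memeplexes=4, iters=50):
    """Binary SFLA sketch: sort frogs (0/1 masks) by fitness, partition
    into memeplexes by shuffling, and let each memeplex's worst frog leap
    toward its best via a sigmoid-binarized step, with a random reset if
    the leap does not improve it."""
    frogs = rng.integers(0, 2, size=(n_frogs, n_feats))
    for _ in range(iters):
        scores = np.array([fitness(f) for f in frogs])
        frogs = frogs[np.argsort(-scores)]            # best first
        for m in range(n_memeplexes):
            idx = np.arange(m, n_frogs, n_memeplexes) # shuffled partition
            best, worst = idx[0], idx[-1]
            step = sigmoid(frogs[best] - frogs[worst]
                           + rng.normal(size=n_feats))
            new = (rng.random(n_feats) < step).astype(int)
            if fitness(new) > fitness(frogs[worst]):
                frogs[worst] = new                    # leap accepted
            else:
                frogs[worst] = rng.integers(0, 2, size=n_feats)
    scores = np.array([fitness(f) for f in frogs])
    return frogs[np.argmax(scores)]

# toy fitness: closeness to a hidden "true" reduct minus a size penalty
true_mask = np.array([1, 0, 1, 0, 0, 1, 0, 0])
fit = lambda f: -np.abs(f - true_mask).sum() - 0.01 * f.sum()
print(b_sfla(fit, n_feats=8))
```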
Procedia PDF Downloads 225
31067 Non-Invasive Data Extraction from Machine Display Units Using Video Analytics
Authors: Ravneet Kaur, Joydeep Acharya, Sudhanshu Gaur
Abstract:
Artificial Intelligence (AI) has the potential to transform manufacturing by improving shop floor processes such as production, maintenance, and quality. However, industrial datasets are notoriously difficult to extract in a real-time, streaming fashion, negating potential AI benefits. The main example is specialized industrial controllers operated by custom software, which complicates the process of connecting them to an Information Technology (IT) based data acquisition network. Security concerns may also limit direct physical access to these controllers for data acquisition. To connect the Operational Technology (OT) data stored in these controllers to an AI application in a secure, reliable, and available way, we propose a novel Industrial IoT (IIoT) solution in this paper. In this solution, we demonstrate how video cameras can be installed on a factory shop floor to continuously capture images of the controller HMIs. We propose image pre-processing to segment each HMI into regions of streaming data and regions of fixed meta-data. We then evaluate the performance of multiple Optical Character Recognition (OCR) technologies, such as Tesseract and Google Vision, in recognizing the streaming data, and test them on typical factory HMIs under realistic lighting conditions. Finally, we use the meta-data to match the OCR output with the temporal, domain-dependent context of the data to improve the accuracy of the output. Our IIoT solution enables reliable and efficient data extraction, which will improve the performance of subsequent AI applications.
Keywords: human machine interface, industrial internet of things, internet of things, optical character recognition, video analytics
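A minimal sketch of the crop-and-OCR step using Tesseract via pytesseract; the file name, region coordinates, and numeric whitelist are hypothetical and would come from the HMI segmentation stage described above:

```python
from PIL import Image
import pytesseract

frame = Image.open("hmi_frame.png")           # one captured camera frame

# hypothetical streaming-data regions found by the segmentation stage
regions = {
    "spindle_rpm": (120, 40, 260, 80),        # (left, top, right, bottom)
    "feed_rate":   (120, 90, 260, 130),
}

readings = {}
for name, box in regions.items():
    crop = frame.crop(box).convert("L")       # grayscale helps OCR
    text = pytesseract.image_to_string(
        crop, config="--psm 7 -c tessedit_char_whitelist=0123456789.")
    readings[name] = text.strip()

print(readings)
```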
Procedia PDF Downloads 109
31066 Proposal Evaluation of Critical Success Factors (CSF) in Lean Manufacturing Projects
Authors: Guilherme Gorgulho, Carlos Roberto Camello Lima
Abstract:
Critical success factors (CSF) are used to shape the practice of project management and can lead directly or indirectly to the success of a project. This management includes many elements that have to be synchronized in order to ensure on-time delivery, quality, and the lowest possible cost. The objective of this work is to develop a proposal for the evaluation of CSF in lean manufacturing projects and to apply the evaluation in a pilot project. The results show that the use of continuous improvement programs in organizations brings benefits such as process cost reduction and improved productivity.
Keywords: continuous improvement, critical success factors (CSF), lean thinking, project management
Procedia PDF Downloads 364
31065 Implementation and Comparative Analysis of PET and CT Image Fusion Algorithms
Authors: S. Guruprasad, M. Z. Kurian, H. N. Suma
Abstract:
Medical imaging modalities are becoming life-saving components. These modalities are essential to doctors for proper diagnosis, treatment planning, and follow-up. Some modalities provide anatomical information, such as Computed Tomography (CT), Magnetic Resonance Imaging (MRI), and X-rays, while some provide only functional information, such as Positron Emission Tomography (PET). Therefore, a single-modality image does not give complete information. This paper presents the fusion of the structural information in CT with the functional information present in the PET image. Such a fused image is essential for detecting the stages and locations of abnormalities, and it is particularly needed in oncology for improved diagnosis and treatment. We have implemented and compared image fusion techniques, including pyramid, wavelet, and principal-component fusion methods, along with a hybrid method of DWT and PCA. The performance of the algorithms is evaluated quantitatively and qualitatively. The system is implemented and tested using MATLAB software. Based on the MSE, PSNR, and entropy analysis, the PCA and DWT-PCA methods showed the best results over all experiments.
Keywords: image fusion, pyramid, wavelets, principal component analysis
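As a sketch of the classic PCA fusion step (here in Python for illustration, though the paper's system is in MATLAB), the weights come from the principal eigenvector of the joint covariance of the two co-registered images; the random "CT" and "PET" slices below are placeholders:

```python
import numpy as np

def pca_fusion(img_a, img_b):
    """Classic PCA image fusion: weight each co-registered source image by
    the components of the principal eigenvector of their joint covariance."""
    a, b = img_a.ravel().astype(float), img_b.ravel().astype(float)
    cov = np.cov(np.stack([a, b]))
    vals, vecs = np.linalg.eigh(cov)
    v = np.abs(vecs[:, np.argmax(vals)])      # principal eigenvector
    w = v / v.sum()                           # normalized fusion weights
    return w[0] * img_a + w[1] * img_b

# toy example with random "CT" and "PET" slices of the same size
rng = np.random.default_rng(1)
ct, pet = rng.random((64, 64)), rng.random((64, 64))
fused = pca_fusion(ct, pet)
print(fused.shape, round(fused.mean(), 3))
```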
Procedia PDF Downloads 283
31064 The Extraction of Sage Essential Oil and the Improvement of Sleeping Quality for Female Menopause by Sage Essential Oil
Authors: Bei Shan Lin, Tzu Yu Huang, Ya Ping Chen, Chun Mel Lu
Abstract:
This research is divided into two parts. The first part adopts supercritical carbon dioxide fluid extraction to extract sage (Salvia officinalis) essential oil and examines the differences when the procedure is run under different pressure conditions; it also probes the composition of the extracted sage essential oil. The second part addresses the effect of aromatherapy with the extracted sage essential oil on improving sleep quality for women in menopause. The extracted sage substance was tested by DPPH radical inhibition to identify its antioxidant capacity, and its composition was analyzed by gas chromatography-mass spectrometry. The extraction experiments under the two pressure conditions gave different results: at 3000 psi, the extract showed an IC50 of 180.94 mg/L, indicating a stronger antioxidant capacity than the extract obtained at 1800 psi (IC50 657.43 mg/L). At 3000 psi, the extraction yield was 1.05%, higher than the 0.68% obtained at 1800 psi. From the experimental data, it can also be concluded that the substance extracted at 3000 psi contains more compounds than the one extracted at 1800 psi. The main overlapping compounds are cyclic ethers, flavonoids, and terpenes. Cyclic ethers and flavonoids have soothing and calming functions; they can be applied to relieve cramps and to alleviate menopausal disorders. The second part of the research applies the extracted sage essential oil in aromatherapy for women in menopause and discusses the improvement in their sleep quality. The research adopts the Swedish upper-back massage approach, evaluates sleep quality with the Pittsburgh Sleep Quality Index, and detects changes with a heart rate variability apparatus. The experimental group received the extracted sage essential oil as part of the aromatherapy. The average heart rate measures detected by the apparatus showed better results in SDNN, low frequency, and high frequency, with performance better than that of the control group. According to the statistical analysis of the Pittsburgh Sleep Quality Index, the intervention achieved an improvement in sleep quality. This shows that the extracted sage essential oil has a significant effect on increasing the activity of the parasympathetic nerves and is able to improve sleep quality for women in menopause.
Keywords: supercritical carbon dioxide fluid extraction, Salvia officinalis, aromatherapy, Swedish massage, Pittsburgh sleep quality index, heart rate variability, parasympathetic nerves
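SDNN, the time-domain HRV index reported above, is simply the standard deviation of the RR (beat-to-beat) intervals; a minimal sketch with a hypothetical RR series:

```python
import numpy as np

def hrv_time_domain(rr_ms):
    """Two standard HRV time-domain indices from RR intervals (ms):
    SDNN, the standard deviation of all intervals, and RMSSD, the root
    mean square of successive differences. Higher values are commonly
    read as greater parasympathetic activity."""
    rr = np.asarray(rr_ms, float)
    sdnn = rr.std(ddof=1)
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))
    return sdnn, rmssd

# hypothetical 10-beat RR series (ms)
rr = [812, 845, 831, 870, 858, 822, 840, 865, 850, 838]
print(hrv_time_domain(rr))
```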
Procedia PDF Downloads 120
31063 Extraction of Osmolytes from the Halotolerant Fungus Aspergillus oryzae
Abstract:
Saline soils occupy about 7% of the land area; they are characterized by physical conditions unsuitable for the growth of living organisms. However, research has shown that some microorganisms, especially fungi, are able to grow and adapt to such extreme conditions, owing to their ability to develop different physiological mechanisms of adaptation. The aim of this study is to identify qualitatively the osmolytes that the biotechnologically important fungus A. oryzae accumulated and/or produced during its adaptation. They were detected by the thin-layer chromatography (TLC) technique using several solvent systems, from different media (wheat bran, MNM medium, and MM medium). The results showed that the moderately halotolerant fungus A. oryzae accumulates a mixture of molecules containing polyols and sugars, some amino acids, and some molecules that could not be identified. Wheat bran was the best medium for the extraction of these molecules, with a proportion of 85.71%, followed by the MNM medium at 64.28% and the minimal medium MM at 14.28%. The properties of osmolytes are becoming increasingly useful in molecular biology and in agricultural, pharmaceutical, medicinal, and biotechnological applications.
Keywords: salinity, Aspergillus oryzae, halotolerance, osmolytes, compatible solutes
Procedia PDF Downloads 415
31062 Progressive Multimedia Collection Structuring via Scene Linking
Authors: Aman Berhe, Camille Guinaudeau, Claude Barras
Abstract:
In order to facilitate information seeking in large collections of multimedia documents with long and progressive content (such as broadcast news or TV series), one can extract the semantic links that exist between semantically coherent parts of documents, i.e., scenes. The links can then create a coherent collection of scenes over which it is easier to perform content analysis, topic extraction, or information retrieval. In this paper, we focus on TV series structuring and propose two approaches for scene linking at different levels of granularity (episode and season): a fuzzy online clustering technique and a graph-based community detection algorithm. When evaluated on the first two seasons of the TV series Game of Thrones, we found that the fuzzy online clustering approach performed better than graph-based community detection at the episode level, while the graph-based approaches showed better performance at the season level.
Keywords: multimedia collection structuring, progressive content, scene linking, fuzzy clustering, community detection
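A sketch of the graph-based variant: nodes are scenes, weighted edges carry hypothetical semantic-similarity scores (e.g., from shared speakers or lexical overlap), and detected communities act as linked scene groups; the particular community algorithm below is a stand-in, not necessarily the one used in the paper:

```python
import networkx as nx
from networkx.algorithms import community

G = nx.Graph()
similarities = [                              # hypothetical scene pairs
    ("s01e01_sc03", "s01e02_sc11", 0.82),
    ("s01e01_sc03", "s01e05_sc02", 0.64),
    ("s01e02_sc11", "s01e05_sc02", 0.71),
    ("s01e03_sc07", "s01e04_sc01", 0.77),
    ("s01e03_sc07", "s01e06_sc09", 0.58),
]
G.add_weighted_edges_from(similarities)

groups = community.greedy_modularity_communities(G, weight="weight")
for i, g in enumerate(groups):
    print(f"scene group {i}: {sorted(g)}")
```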
Procedia PDF Downloads 100
31061 Effective Editable Emoticon Description Schema for Mobile Applications
Authors: Jiwon Lee, Si-hwan Jang, Sanghyun Joo
Abstract:
The popularity of emoticons has been on the rise since mobile messengers became widespread. At the same time, a few problems with emoticons have arisen from their innate characteristics. Too many emoticons make it difficult for people to select one well suited to their intention; conversely, users sometimes cannot find an emoticon that expresses their exact intention. Poor information delivery is another problem, since the majority of current emoticons focus on emotion delivery. In this situation, we propose a new concept of emoticon, the editable emoticon, to solve the drawbacks above. Users can edit the components inside the proposed editable emoticon and send it to express their exact intention. By doing so, the number of editable emoticons can be kept reasonable, and they can still express users' exact intentions. Further, editable emoticons can be used as information deliverers, depending on the user's intention and editing skills. In this paper, we propose the concept of editable emoticons and a schema-based editable emoticon description method. The proposed description method is 200 times superior to the compared screen-capturing method in terms of transmission bandwidth. Further, the description method is designed for compatibility, since it follows the MPEG-UD international standard. The proposed editable emoticons can be exploited not only in mobile applications but also in various fields such as education and medicine.
Keywords: description schema, editable emoticon, emoticon transmission, mobile applications
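To illustrate why a structured description beats screen capture on bandwidth, a hypothetical sketch of a component-wise emoticon description; the element names are illustrative inventions, not taken from the MPEG-UD standard:

```python
import json

# sending a small structured description (a few hundred bytes) replaces
# transmitting a captured image (tens of kilobytes)
emoticon = {
    "schema": "editable-emoticon/1.0",        # hypothetical schema id
    "base_shape": "round_face",
    "components": {
        "eyes":  {"type": "wink",  "editable": True},
        "mouth": {"type": "smile", "editable": True},
        "props": [{"type": "speech_bubble", "text": "on my way!",
                   "editable": True}],
    },
}

payload = json.dumps(emoticon, separators=(",", ":")).encode("utf-8")
print(len(payload), "bytes")                  # versus a full screenshot
```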
Procedia PDF Downloads 297