Search results for: food composition data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28932

24162 Brand Tips of Thai Halal Products

Authors: Pibool Waijittragum

Abstract:

The purpose of this research is to analyze the marketing strategies of Thai Halal products as they relate to the way of life of Thai Muslims; the expected benefit is a marketing strategy for the brand-building process for Halal products in Thailand. The research framework comprises the four elements of marketing strategy necessary for brand identity creation: attributes, benefits, values, and personality. The methodology combined qualitative and quantitative approaches: 19 marketing experts with dynamic roles in Thai consumer products were interviewed, and a field survey of 122 Thai Muslims selected from 175 Muslim communities in Bangkok was conducted. Data were analyzed across five categories of Thai Halal products: 1) meat; 2) vegetables and fruits; 3) instant foods and garnishing ingredients; 4) beverages, desserts, and snacks; and 5) hygienic daily products such as soap, shampoo, and body lotion. The results identify suitable representations for the marketing strategies of Thai Halal products: 1) Benefit, the characteristics of the product together with its benefit: consumers purchase the product because it is nutritious, free of toxic or chemical residues, and made from fresh, clean materials. 2) Attribute, the exterior images that attract consumers: consumers purchase the product because it carries a standards mark, a food and drug safety mark, and the Halal mark; packaging and its materials should draw attention, using attractive graphics and outstanding images of the product, materials, or ingredients. 3) Value, the value of the product as perceived by consumers: it is a healthy product that improves quality of life, a product of expertise built on research, and one that treats consumers as important and is sincere, honest, and reliable. 4) Personality, the reflection of consumers' self-image:
after consuming the product, consumers see themselves as health-conscious people, as well as rational, moral, just, and thoughtful people with progressive thinking.

Keywords: marketing strategies, product identity, branding, Thai Halal products

Procedia PDF Downloads 366
24161 A New Authenticable Steganographic Method via the Use of Numeric Data on Public Websites

Authors: Che-Wei Lee, Bay-Erl Lai

Abstract:

A new steganographic method that uses numeric data on public websites and provides self-authentication capability is proposed. The proposed technique transforms a secret message into partial shares with Shamir's (k, n)-threshold secret sharing scheme, with n = k + 1. The generated k + 1 partial shares are then embedded into selected numeric items on a website as if they were part of the website's numeric content. Afterward, a receiver links to the website, extracts each combination of k shares among the k + 1 from the stego numeric content, and computes k + 1 copies of the secret; the value consistency of the k + 1 computed copies is taken as evidence of whether the extracted message is authentic, attaining the goal of self-authentication of the extracted secret message. Experimental results and discussions are provided to show the feasibility and effectiveness of the proposed method.
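The self-authentication step hinges on all k + 1 reconstructions agreeing. As a rough sketch (the field prime, the numeric encoding of the message, and the embedding into website items are assumptions here, not the paper's actual parameters), share generation and the consistency check might look like:

```python
import random
from itertools import combinations

P = 2**31 - 1  # prime field modulus -- an assumption for illustration

def make_shares(secret, k):
    """Split `secret` into n = k + 1 points of a random degree-(k-1) polynomial."""
    coeffs = [secret] + [random.randrange(1, P) for _ in range(k - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):  # Horner's rule
            acc = (acc * x + c) % P
        return acc
    return [(x, f(x)) for x in range(1, k + 2)]  # n = k + 1 share points

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(123456789, k=3)  # 4 shares, hidden as numeric items
# receiver side: recompute the secret from every k-subset of the k + 1 shares
copies = [reconstruct(list(c)) for c in combinations(shares, 3)]
authentic = len(set(copies)) == 1     # value consistency => message is authentic
```

Tampering with any one embedded numeric item would make at least one of the k + 1 reconstructions disagree, which is the self-authentication signal the abstract describes.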

Keywords: steganography, data hiding, secret authentication, secret sharing

Procedia PDF Downloads 228
24160 A Novel Approach to Design of EDDR Architecture for High Speed Motion Estimation Testing Applications

Authors: T. Gangadhararao, K. Krishna Kishore

Abstract:

Motion estimation (ME) plays a critical role in a video coder, so testing such a module is of priority concern. Focusing on the testing of ME in a video coding system, this work presents an error detection and data recovery (EDDR) design, based on the residue-and-quotient (RQ) code, that can be embedded into an ME module for video coding testing applications. Errors in processing elements (PEs), the key components of an ME module, can be detected and recovered effectively using the proposed EDDR design, with an acceptable area overhead and timing penalty.
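The RQ code itself is simple to state: an operand X is accompanied by its quotient and residue with respect to a modulus m, and a mismatch between the recomputed and stored pair flags an error. A minimal sketch of the encode/check/recover idea (the modulus value is an assumption, and the paper's hardware pipeline is not modeled):

```python
M = 64  # modulus; a power of two keeps the arithmetic simple (assumption)

def rq_encode(x):
    """Represent x as a (quotient, residue) pair with respect to modulus M."""
    return x // M, x % M

def rq_check(x, q, r):
    """Detect a corrupted value: x must equal q*M + r with 0 <= r < M."""
    return x == q * M + r and 0 <= r < M

def rq_recover(q, r):
    """Recover the original value from an intact (q, r) pair."""
    return q * M + r

x = 1000
q, r = rq_encode(x)
ok_clean = rq_check(x, q, r)            # intact value passes the check
ok_corrupt = rq_check(x ^ 0b100, q, r)  # a single bit flip is detected
recovered = rq_recover(q, r)            # data recovery from the code pair
```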

Keywords: area overhead, data recovery, error detection, motion estimation, reliability, residue-and-quotient (RQ) code

Procedia PDF Downloads 413
24159 Feeding Effects of Increasing Levels of Yerba Mate on Lamb Meat Quality

Authors: Yuli Andrea P. Bermudez, Richard R. Lobo, Tamyres R. D. Amorim, Danny Alexander R. Moreno, Angelica Simone C. Pereira, Ives Claudio D. Bueno

Abstract:

The use of natural antioxidants in animal feed can positively modify the fatty acid (FA) profile of meat, owing to secondary metabolites, mainly phenolic and flavonoid compounds, which increase the polyunsaturated fatty acids (PUFA) associated with benefits to human health. The goal of this study was to evaluate the effect of the dietary inclusion level of yerba mate extract (Ilex paraguariensis St. Hilaire, YME) as a natural antioxidant on lamb meat quality. The animals were confined for 53 days, fed corn silage and concentrate in a 60:40 proportion, and divided into four homogeneous groups (n = 9 lambs/group): a control group without YME (0%) and three treatments with 1, 2, and 4% inclusion of YME on a DM basis. Samples of the Longissimus thoracis (LT) muscle were collected during deboning of the 36 lambs and analyzed for pH, color parameters (lightness L*, redness a*, and yellowness b*), fatty acid profile, total lipids, and sensory attributes. All data were statistically evaluated using the MIXED procedure of the statistical package SAS 9.4. The inclusion of YME modified the b* value (P = 0.0041), indicating a yellower meat color in the group supplemented with 4% YME. However, no differences were found between groups in final live weight, pH (P = 0.1923), or total lipid concentration (P = 0.0752). The FAs and health indexes were not altered by the inclusion of YME (P ≥ 0.1360); only branched-chain fatty acids (BCFA) exhibited a diet effect (P = 0.0092), in the group that received 4% of the extract. The hedonic-scale sensory test showed no differences between treatments (P ≥ 0.1251).
Nevertheless, in the just-about-right test, rated from note 1 to note 5 ('very strong, soft or moist'), softness differed between the evaluated treatments (P = 0.0088): the 2% YME group had better acceptance by the tasters (4.15 ± 0.08) than the control (3.89 ± 0.08). In conclusion, although the addition of YME showed positive results in sensory acceptance and in increasing the concentration of BCFA, fatty acids beneficial to human health, without changing the physical-chemical parameters of lamb meat, the absolute changes were quite small, probably because of the high efficiency of PUFA biohydrogenation in the rumen.

Keywords: composition, health, antioxidant, meat analysis

Procedia PDF Downloads 102
24158 An Effective Route to Control of the Safety of Accessing and Storing Data in a Cloud-Based Database

Authors: Omid Khodabakhshi, Amir Rozdel

Abstract:

Cloud computing security research presents a number of challenges, because the data center holds complex private information and constantly faces risks of information disclosure through hacker attacks or insider threats. Accordingly, the security of virtual machines in the cloud computing infrastructure layer is very important. Many software solutions exist to improve security in virtual machines, but software alone is not enough to solve security problems. The purpose of this article is to examine the challenges and security requirements for accessing and storing data in an insecure cloud environment. In other words, this article proposes a structure for executing highly isolated, security-sensitive code using secure computing hardware in virtual environments. It also allows remote validation of code together with its inputs and outputs. These security features are provided even in situations where the BIOS, the operating system, and even the hypervisor are infected. To achieve these goals, we use the hardware support provided by recent Intel and AMD processors, as well as the TPM security chip. In conclusion, the use of these technologies creates a dynamic root of trust and reduces the TCB to the security-sensitive code.

Keywords: code, cloud computing, security, virtual machines

Procedia PDF Downloads 178
24157 Identifying the Factors Affecting the Success of Energy Saving in the Municipality of Tehran

Authors: Rojin Bana Derakhshan, Abbas Toloie

Abstract:

For the purpose of optimizing and improving energy efficiency in buildings, the key elements of success in optimizing energy consumption must be identified before any action is taken. Principal component analysis, one of the most valuable results of linear algebra, offers a simple, non-parametric method for this purpose. An energy management system was implemented according to the international energy management standard ISO 50001:2011, and all energy parameters in the building were measured through an energy audit. In this essay, using data mining, the key elements influencing energy saving in buildings are determined. The approach is based on statistical data mining techniques, using a feature selection method and fuzzy logic to compress massive data and refine the selected features. In parallel, using the results of the energy audit, after measurement of all energy consumption parameters and identification of the variables, the share of each energy-consuming element in energy dissipation is quantified as a percentage. Accordingly, energy saving solutions are divided into three categories: low-, medium-, and high-expense solutions.

Keywords: energy saving, key elements of success, optimization of energy consumption, data mining

Procedia PDF Downloads 450
24156 Analyzing the Evolution of Adverse Events in Pharmacovigilance: A Data-Driven Approach

Authors: Kwaku Damoah

Abstract:

This study presents a comprehensive data-driven analysis to understand the evolution of adverse events (AEs) in pharmacovigilance. Utilizing data from the FDA Adverse Event Reporting System (FAERS), we employed three analytical methods: rank-based, frequency-based, and percentage change analyses. These methods assessed temporal trends and patterns in AE reporting, focusing on various drug-active ingredients and patient demographics. Our findings reveal significant trends in AE occurrences, with both increasing and decreasing patterns from 2000 to 2023. This research highlights the importance of continuous monitoring and advanced analysis in pharmacovigilance, offering valuable insights for healthcare professionals and policymakers to enhance drug safety.
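Of the three analytical methods, the percentage-change analysis is the most direct to illustrate. A toy sketch with made-up yearly report counts for a single hypothetical active ingredient (FAERS itself is not queried here):

```python
# made-up yearly AE report counts for one hypothetical active ingredient
counts = {2019: 120, 2020: 150, 2021: 180, 2022: 162, 2023: 200}

years = sorted(counts)
# percentage change between consecutive reporting years
pct_change = {
    y: 100.0 * (counts[y] - counts[prev]) / counts[prev]
    for prev, y in zip(years, years[1:])
}
# qualitative year-over-year trend derived from the sign of each change
trend = ["up" if pct_change[y] > 0 else "down" for y in years[1:]]
```

The same dictionary of yearly counts feeds the rank-based and frequency-based views: ranking ingredients by `counts` per year, or tracking raw frequencies over time.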

Keywords: event analysis, FDA adverse event reporting system, pharmacovigilance, temporal trend analysis

Procedia PDF Downloads 36
24155 Correlates of Income Generation of Small-Scale Fish Processors in Abeokuta Metropolis, Ogun State, Nigeria

Authors: Ayodeji Motunrayo Omoare

Abstract:

Economically, fish provides an important source of food and income for both men and women, especially for many households in the developing world, and fishing holds an important social and cultural position in riverine communities. However, fish is highly susceptible to deterioration. Consequently, this study examined the correlates of income generation among small-scale women fish processors in the Abeokuta metropolis, Ogun State, Nigeria. Eighty small-scale women fish processors were randomly selected from five communities as the sample for this study. Collected data were analyzed using both descriptive and inferential statistics. The results showed that the mean age of the respondents was 31.75 years, with an average household size of 4 people, while 47.5% of the respondents had primary education. Most (86.3%) of the respondents were married and had spent more than 11 years in fish processing. The respondents were predominantly of the Yoruba tribe (91.2%). The majority (71.3%) of the respondents used a traditional kiln to process their fish, while 23.7% used hot vegetable oil to fry their fish. The respondents sourced capital for fish processing activities from personal savings (48.8%), cooperatives (27.5%), friends and family (17.5%), and microfinance banks (6.2%). They generated an average daily income of ₦7,000.00 from roasted fish, ₦3,500.00 from dried fish, and ₦5,200.00 from fried fish. However, inadequate processing equipment (95.0%), unavailability of credit from microfinance banks (85.0%), poor electricity supply (77.5%), inadequate extension service support (70.0%), and fuel scarcity (68.7%) were the major constraints on fish processing in the study area. Chi-square analysis showed significant relationships between income generated and both personal characteristics (χ2 = 36.83, df = 9) and processing methods (χ2 = 15.88, df = 3) at the p < 0.05 level of significance.
It can be concluded that a significant relationship existed between processing methods and income generated. The study therefore recommends that modern processing equipment be made available to the respondents at a subsidized price by agro-allied companies.
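The chi-square statistics above come from contingency tables of categorical variables against income. A small sketch of the Pearson chi-square computation (the table values are hypothetical, not the study's data):

```python
def chi_square(table):
    """Pearson chi-square statistic and degrees of freedom for a contingency table."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = rows[i] * cols[j] / total   # expected count under independence
            stat += (obs - exp) ** 2 / exp
    df = (len(rows) - 1) * (len(cols) - 1)
    return stat, df

# hypothetical 2x2 counts: processing method (rows) vs income class (columns)
stat, df = chi_square([[30, 10], [15, 25]])
```

Comparing `stat` against the chi-square critical value for `df` degrees of freedom at p < 0.05 gives the significance test the abstract reports.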

Keywords: correlates, income, fish processors, women, small-scale

Procedia PDF Downloads 229
24154 Agglomerative Hierarchical Clustering Using the Tθ Family of Similarity Measures

Authors: Salima Kouici, Abdelkader Khelladi

Abstract:

In this work, we begin by presenting the Tθ family of common similarity measures for multidimensional binary data. Subsequently, some properties of these measures are established. Finally, we study the impact of using different inter-element measures on the results of agglomerative hierarchical clustering methods.
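As an illustration of how the choice of inter-element similarity drives the merge sequence, here is a naive single-linkage agglomerative clustering over binary vectors; the Jaccard measure stands in for a member of the Tθ family, whose exact definition is not given in this abstract:

```python
def jaccard(a, b):
    """Similarity of two binary vectors (a stand-in for a Tθ-family measure)."""
    inter = sum(x & y for x, y in zip(a, b))
    union = sum(x | y for x, y in zip(a, b))
    return inter / union if union else 1.0

def agglomerate(items, sim, k):
    """Naive single-linkage agglomerative clustering down to k clusters."""
    clusters = [[i] for i in range(len(items))]
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                # single linkage: similarity of the closest pair across clusters
                s = max(sim(items[a], items[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or s > best[0]:
                    best = (s, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)  # merge the most similar pair
    return clusters

data = [(1, 1, 0, 0), (1, 1, 1, 0), (0, 0, 1, 1), (0, 1, 1, 1)]
result = agglomerate(data, jaccard, k=2)
```

Swapping `jaccard` for another measure from the family can change which pairs merge first, which is exactly the effect the abstract studies.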

Keywords: binary data, similarity measure, Tθ measures, agglomerative hierarchical clustering

Procedia PDF Downloads 467
24153 A Perspective of Digital Formation in the Solar Community as a Prototype for Finding Sustainable Algorithmic Conditions on Earth

Authors: Kunihisa Kakumoto

Abstract:

“Purpose”: Global environmental issues are now being raised on a global scale. By predicting, with algorithms, sprawl phenomena that exceed the limits of nature, we can expect to keep our social life within those limits. The sustainable state of the planet consists in maintaining a balance between the capacities of nature and the demands of our social life. The amount of water on earth is finite, so sustainability depends strongly on water capacity. A certain amount of water is stored in forests through planting and green space, so water capacity can be considered in relation to green space; CO2 is likewise absorbed by green plants. “Possible measurements and methods”: The concept of the solar community has been introduced in technical papers at many international conferences; it is based on data collected from one solar model house. This algorithmic study simulates the amount of water stored by green vegetation. In addition, we calculated and compared the CO2 emissions of the solar community with the CO2 reduction from greening. Based on these trial calculations for solar communities, we simulate the sustainable state of the earth as an algorithmic trial calculation. We believe the composition of groups of solar communities should also be considered, using digital technology as a control technology. “Conclusion”: We consider the solar community as a prototype for finding sustainable conditions for the planet. The role of water is very important, as the supply capacity of water is limited, yet the circulation of social life is not constructed according to the mechanisms of nature. The simulation trial calculation is explained using the total water supply volume as an example.
In this process, the algorithmic calculation considers the total water supply capacity together with the population and habitable numbers of the area. Green vegetated land is very important for retaining enough water, and green vegetation is also very important for maintaining the CO2 balance. A simulation trial calculation is possible from the relationship between the CO2 emissions of the solar community and the CO2 reduction due to greening. To find this total balance and the sustainable conditions, the algorithmic simulation takes into account green vegetation and the total water supply. Research to find sustainable conditions is done by simulating an algorithmic model of the solar community as a prototype; in this one prototype example, a balance is achieved. The activities of our social life must take place within the permissible limits of natural mechanisms. Of course, we aim for a more ideal balance by utilizing auxiliary digital control technology such as AI.

Keywords: solar community, sustainability, prototype, algorithmic simulation

Procedia PDF Downloads 46
24152 High Resolution Sandstone Connectivity Modelling: Implications for Outcrop Geology and Its Analog Studies

Authors: Numair Ahmed Siddiqui, Abdul Hadi bin Abd Rahman, Chow Weng Sum, Wan Ismail Wan Yousif, Asif Zameer, Joel Ben-Awal

Abstract:

Advances in data capture from outcrop studies have made possible the acquisition of high-resolution digital data, offering improved and economical reservoir modelling methods. Terrestrial laser scanning using LiDAR (light detection and ranging) provides a new way to build outcrop-based reservoir models, yielding crucial information for understanding heterogeneities in sandstone facies through high-resolution images and data sets. This study presents the detailed application of an outcrop-based sandstone facies connectivity model, combining information gathered from traditional fieldwork with detailed digital point-cloud data from LiDAR, to develop an intermediate, small-scale reservoir sandstone facies model of the Miocene Sandakan Formation, Sabah, East Malaysia. The software RiScan Pro (v1.8.0) was used for digital data collection and post-processing, with an accuracy of 0.01 m and a point acquisition rate of up to 10,000 points per second. We provide an accurate and descriptive workflow for triangulating point clouds of different sets of sandstone facies with well-marked top and bottom boundaries, in conjunction with field sedimentology. This yields a highly accurate qualitative sandstone facies connectivity model, which is a challenge to obtain from subsurface datasets (i.e., seismic and well data). Finally, by applying this workflow, we can build an outcrop-based static connectivity model, which can serve as an analogue for subsurface reservoir studies.

Keywords: LiDAR, outcrop, high resolution, sandstone facies, connectivity model

Procedia PDF Downloads 205
24151 Recovery of Hydrogen Converter Efficiency Affected by Poisoning of Catalyst with Increasing of Temperature

Authors: Enayat Enayati, Reza Behtash

Abstract:

The purpose of the H2 removal system is to reduce the content of hydrogen and other combustibles in the CO2 feed, in order to avoid developing a possibly explosive condition in the synthesis. To reduce the possibility of forming an explosive gas mixture in the synthesis as much as possible, the hydrogen present in the fresh CO2 is removed in the hydrogen converter. The partly compressed CO2/air mixture is therefore led through the hydrogen converter (reactor), where the H2 present in the CO2 is reduced by catalytic combustion to less than 50 ppm (vol), according to the exothermic reaction 2H2 + O2 → 2H2O + heat. The catalyst in the hydrogen converter consists of platinum on an aluminum oxide carrier. Low catalyst activity may be due to catalyst poisoning, which results in an increase of the hydrogen content in the CO2 fed to the synthesis. It is advised to shut down the plant when the hydrogen level at the converter outlet rises above 100 ppm, to prevent an undesirable gas composition in the plant. Replacing the catalyst would be time-consuming and costly, so to avoid this we increase the inlet temperature of the hydrogen converter in accordance with the Arrhenius equation K = K0·exp(−Ea/(R·T)), where K is the rate constant of a chemical reaction, K0 is the pre-exponential factor, Ea is the activation energy, and R is the universal gas constant. Increasing the inlet temperature of the hydrogen converter raised the rate constant of the chemical reaction and thereby reduced the hydrogen content from 125 ppm to 70 ppm.
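Because K0 cancels in the ratio of rate constants at two temperatures, the speed-up from raising the inlet temperature can be estimated directly from the Arrhenius equation. A sketch with a hypothetical activation energy and temperature pair (the plant's actual values are not given in the abstract):

```python
import math

R = 8.314        # J/(mol*K), universal gas constant
Ea = 50_000.0    # J/mol, activation energy -- hypothetical value for illustration

def rate_ratio(T1, T2):
    """K(T2)/K(T1) from Arrhenius: the pre-exponential factor K0 cancels."""
    return math.exp(-Ea / R * (1.0 / T2 - 1.0 / T1))

ratio = rate_ratio(400.0, 420.0)  # hypothetical inlet rise of 400 K -> 420 K
faster = ratio > 1.0              # the combustion rate constant increases
```

Even a modest temperature increase roughly doubles the rate constant under these assumed values, which is consistent with the abstract's observed drop in outlet hydrogen.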

Keywords: catalyst, converter, poisoning, temperature

Procedia PDF Downloads 799
24150 Spatial-Temporal Clustering Characteristics of Dengue in the Northern Region of Sri Lanka, 2010-2013

Authors: Sumiko Anno, Keiji Imaoka, Takeo Tadono, Tamotsu Igarashi, Subramaniam Sivaganesh, Selvam Kannathasan, Vaithehi Kumaran, Sinnathamby Noble Surendran

Abstract:

Dengue outbreaks are affected by biological, ecological, socio-economic and demographic factors that vary over time and space. These factors have been examined separately and still require systematic clarification. The present study aimed to investigate the spatial-temporal clustering relationships between these factors and dengue outbreaks in the northern region of Sri Lanka. Remote sensing (RS) data gathered from several satellites were used to develop an index comprising rainfall, humidity and temperature data. RS data gathered by ALOS/AVNIR-2 were used to detect urbanization, and a digital land cover map was used to extract land cover information. Other data on relevant factors and dengue outbreaks were collected through institutions and extant databases. The analyzed RS data and databases were integrated into geographic information systems, enabling temporal analysis, spatial statistical analysis and space-time clustering analysis. Our results showed that increases in the number of combined ecological, socio-economic and demographic factors that are above average, or present, contribute to significantly high rates of space-time dengue clusters.
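The abstract does not name the clustering statistic used, but the idea of a space-time cluster can be illustrated with a Knox-style count of case pairs that are close in both space and time (the coordinates and day numbers below are invented):

```python
from itertools import combinations

# (x, y, day) of hypothetical dengue case reports
cases = [(0.0, 0.0, 1), (0.1, 0.1, 2), (5.0, 5.0, 30), (5.1, 5.0, 31), (0.0, 5.0, 60)]

def knox_statistic(cases, d_max, t_max):
    """Count case pairs close in BOTH space and time (the Knox test statistic)."""
    n = 0
    for (x1, y1, t1), (x2, y2, t2) in combinations(cases, 2):
        close_space = (x1 - x2) ** 2 + (y1 - y2) ** 2 <= d_max ** 2
        close_time = abs(t1 - t2) <= t_max
        if close_space and close_time:
            n += 1
    return n

X = knox_statistic(cases, d_max=1.0, t_max=7)  # observed close-pair count
```

In practice the observed count is compared against a permutation distribution (shuffling the case times) to judge whether space-time clustering is significant.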

Keywords: ALOS/AVNIR-2, dengue, space-time clustering analysis, Sri Lanka

Procedia PDF Downloads 463
24149 Statistical Inferences for the GQARCH-Itô-Jumps Model Based on the Realized Range Volatility

Authors: Fu Jinyu, Lin Jinguan

Abstract:

This paper introduces a novel approach that unifies two types of models: the continuous-time jump-diffusion used to model high-frequency data, and the discrete-time GQARCH employed to model low-frequency financial data, by embedding the discrete GQARCH structure with jumps in the instantaneous volatility process. This model is named the “GQARCH-Itô-Jumps model.” We adopt realized range-based threshold estimation for high-frequency financial data rather than realized return-based volatility estimators, which entail the loss of intra-day information about the price movement. Meanwhile, a quasi-likelihood function for the low-frequency GQARCH structure with jumps is developed for the parametric estimates. The asymptotic theory is established for the proposed estimators in the case of finite-activity jumps. Moreover, simulation studies are implemented to check the finite-sample performance of the proposed methodology. Specifically, it is demonstrated how our proposed approach can be practically applied to financial data.
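The intuition behind range-based estimation is that the intraday high-low range carries more information than close-to-close returns alone. As a simplified stand-in for the paper's realized range-based threshold estimator, Parkinson's classic range-based variance estimator can be sketched as follows (the price bars are made up):

```python
import math

# intraday (high, low) price pairs -- made-up values for illustration
bars = [(101.0, 99.5), (102.0, 100.2), (101.5, 100.0), (103.0, 101.1)]

# Parkinson's range-based variance estimator: (1 / (4 ln 2)) * sum(ln(H/L)^2)
rv_range = sum(math.log(h / l) ** 2 for h, l in bars) / (4.0 * math.log(2.0))
vol = math.sqrt(rv_range)  # realized range-based volatility over the bars
```

The threshold variant in the paper additionally truncates bars whose range exceeds a jump threshold, so that the estimator targets the continuous part of the volatility.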

Keywords: Itô process, GQARCH, leverage effects, threshold, realized range-based volatility estimator, quasi-maximum likelihood estimate

Procedia PDF Downloads 140
24148 Exchanging Radiology Reporting System with Electronic Health Record: Designing a Conceptual Model

Authors: Azadeh Bashiri

Abstract:

Introduction: For better design of the electronic health record system in Iran, health information systems must be integrated on the basis of a common language so that information can be interpreted and exchanged with this system. Background: This study provides a conceptual model of a radiology reporting system using the Unified Modeling Language (UML). The proposed model can solve the problem of integrating this information system with the electronic health record system; by using this model and designing its services accordingly, the radiology reporting system can easily connect to the electronic health record in Iran and facilitate the transfer of radiology report data. Methods: This cross-sectional study was conducted in 2013. The study population comprised 22 experts working at the imaging center of Imam Khomeini Hospital in Tehran, and the sample coincided with the population. The research tool was a questionnaire prepared by the researcher to determine the information requirements. Content validity and the test-retest method were used to measure the validity and reliability of the questionnaire, respectively. Data were analyzed with an average index using SPSS, and Visual Paradigm software was used to design the conceptual model. Results: Based on the requirements assessment of the experts and the related literature, administrative, demographic and clinical data, radiological examination results, and, if an anesthesia procedure is performed, anesthesia data are suggested as the minimum data set for the radiology report, and the class diagram was designed on this basis. The radiology reporting process was also identified and a use case diagram was drawn. Conclusion: Given the use of radiology reports in the electronic health record system for diagnosing and managing patients' clinical problems, the proposed conceptual model for the radiology reporting system, if used to design the system systematically, would eliminate the problem of data sharing between these systems and the electronic health record system.

Keywords: structured radiology report, information needs, minimum data set, electronic health record system in Iran

Procedia PDF Downloads 239
24147 The Potential of Potato and Maize Based Snacks as Fire Accelerants

Authors: E. Duffin, L. Brownlow

Abstract:

Arson is a crime that can pose exceptional problems for forensic specialists. Its destructive nature makes evidence much harder to find, especially when arson is used to cover up another crime, and there is a consistent threat of arsonists seeking new and easier ways to set fires. Existing research in this field primarily focuses on the use of accelerants such as petrol, with less attention to other more accessible and harder-to-detect materials, including the growing speculation that potato and maize-based snacks could be used as fire accelerants. It was hypothesized that all 'crisp-type' snacks in foil packaging had the potential to act as accelerants and would burn readily in the various experiments. To test this hypothesis, a series of small lab-based experiments was undertaken, igniting samples of the snacks. Factors such as ingredients, shape, packaging and calorific value were all taken into consideration, and the time (in seconds) each snack spent on fire was recorded. All of the snacks tested burnt for statistically similar amounts of time (p = 0.0157). This was followed by a large mock real-life scenario using burning packets of crisps and car seats, to investigate whether these snacks could be viable tools for an arsonist. Here, three full packets of crisps were selected on the basis of variations in burning observed during the lab experiments; each was lit with a lighter to initiate burning, then placed onto a car seat to be timed and observed with video cameras. In all three cases, the fire was significant and sustained by the 200-second mark. On the basis of this data, it was concluded that potato and maize-based snacks are viable fire accelerants: they remain an effective means of starting fires while being cheap, accessible, unsuspicious and hard to detect.
The results supported the hypothesis that all the 'crisp-type' snacks in foil packaging that had been tested had the potential to act as accelerants and would burn readily. This study serves to raise awareness and provide a basis for research into, and prevention of, arson involving maize and potato-based snacks as fire accelerants.

Keywords: arson, crisps, fires, food

Procedia PDF Downloads 111
24146 Ecorium: The Ecological Project in Montevideo Uruguay

Authors: Chettou Souhaila, Soufi Omar, Roumia Mohammed Ammar

Abstract:

Protecting the environment means preserving the survival and future of humanity. Indeed, the environment is our source of food and drinking water, the air is our source of oxygen, the climate allows our survival, and biodiversity is a potential reservoir of drugs. Preserving the environment is, therefore, a matter of survival. The objective of this project is to familiarize the general public with environmental problems, not only the theme of environmental protection but also the concept of biodiversity in different ecosystems. To this end, the aim of our project was to create the Ecorium, a place that preserves many species of plants from different ecosystems and brings together schools, malls, buildings, offices, ecological transport, gardens, and many family activities that contribute to ecosystem development, biodiversity strategy, and sustainable development.

Keywords: ecological system, ecorium, environment, sustainable development

Procedia PDF Downloads 316
24145 Posttranslational Modifications of Histone H3 in Tumor Tissue Isolated from Silver and Gold Nanoparticles Treated Mice

Authors: Lucyna Kapka-Skrzypczak, Barbara Sochanowicz, Magdalena Matysiak-Kucharek, Magdalena Czajka, Krzysztof Sawicki, Marcin Kruszewski

Abstract:

Due to their strong antimicrobial activity, silver nanoparticles (AgNPs) are widely used in various medical and general applications, among others in cosmetics, odour-resistant textiles, etc. The aim of this study was to compare the effects of AgNPs and gold NPs (AuNPs) on posttranslational modifications of histones, which are responsible for chromatin compaction and repackaging. In this study, BALB/c mice were inoculated with murine mammary carcinoma 4T1 cells and treated with AgNPs coated with citrate (AgNPs(cit)) or PEG (AgNPs(PEG)), or with AuNPs. Thereafter, histone H3 acetylation on Lys9 and H3 methylation on Lys4, Lys9 and Lys29 were investigated. All NPs tested decreased H3 methylation, while no effect was observed on H3 acetylation. The modification of H3 methylation depended on the type of NPs used, their coating, the site of methylation and the treatment applied. In conclusion, the epigenetic effects of nanomaterials depend on the nanomaterial's composition, its coating, and the route of application. This work was supported by National Science Centre grant No. 2014/15/B/NZ7/01036 (MK, LKS, MMK, MC, KS) and by statutory funding for INTC (BS).

Keywords: gold nanoparticles, histone, methylation, silver nanoparticles

Procedia PDF Downloads 180
24144 Performance Analysis of Geophysical Database Referenced Navigation: The Combination of Gravity Gradient and Terrain Using Extended Kalman Filter

Authors: Jisun Lee, Jay Hyoun Kwon

Abstract:

As an alternative way to compensate for INS (inertial navigation system) error in non-GNSS (Global Navigation Satellite System) environments, geophysical database referenced navigation is being studied. In this study, gravity gradient and terrain data were combined to complement the weakness of a single geophysical data source as well as to improve the stability of the positioning. The main process for compensating the INS error using a geophysical database was constructed on the basis of the EKF (Extended Kalman Filter). In detail, two types of combination methods, a centralized and a decentralized filter, were applied to check the pros and cons of each algorithm and to find more robust results. The performance of each navigation algorithm was evaluated in simulation, supposing that the aircraft flies with a precise geophysical DB and sensors along nine different trajectories. In particular, the results were compared to those from navigation referenced to a single geophysical database, to check the improvement due to the combination of heterogeneous geophysical databases. It was found that the overall navigation performance was improved, but not all trajectories generated better navigation results with the combination of gravity gradient and terrain data. It was also found that the centralized filter generally showed more stable results; this is because the weighting for the decentralized filter could not be optimized due to the local inconsistency of the geophysical data. In the future, switching between geophysical data sources or combining different navigation algorithms will be necessary to obtain more robust navigation results.
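The centralized/decentralized distinction can be illustrated with a scalar toy fusion of two position fixes, standing in for the gravity-gradient and terrain updates (the actual EKF state and measurement models are far richer, and all numbers here are invented):

```python
def fuse_centralized(prior, var_prior, measurements):
    """One update using all measurements at once (information-filter form)."""
    info = 1.0 / var_prior
    state = prior / var_prior
    for z, var_z in measurements:
        info += 1.0 / var_z
        state += z / var_z
    return state / info, 1.0 / info  # fused estimate and its variance

def fuse_decentralized(prior, var_prior, measurements, weights):
    """Update with each sensor separately, then combine with fixed weights."""
    local_estimates = []
    for z, var_z in measurements:
        k = var_prior / (var_prior + var_z)  # scalar Kalman gain per sensor
        local_estimates.append(prior + k * (z - prior))
    return sum(w * x for w, x in zip(weights, local_estimates))

meas = [(10.2, 1.0), (9.8, 1.0)]  # two noisy fixes of the same position
x_c, var_c = fuse_centralized(10.0, 4.0, meas)
x_d = fuse_decentralized(10.0, 4.0, meas, [0.5, 0.5])
```

The centralized form weights each measurement by its inverse variance automatically; the decentralized form depends on the chosen weights, which mirrors the abstract's point that poorly chosen weights (forced by locally inconsistent geophysical data) degrade the decentralized filter.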

Keywords: Extended Kalman Filter, geophysical database referenced navigation, gravity gradient, terrain

Procedia PDF Downloads 329
24143 An Application of Remote Sensing for Modeling Local Warming Trend

Authors: Khan R. Rahaman, Quazi K. Hassan

Abstract:

Global changes in climate, environment, economies, populations, governments, institutions, and cultures converge in localities. Changes at a local scale, in turn, contribute to global changes as well as being affected by them. Our hypothesis is built on the consideration that temperature varies at the local level (termed local warming) relative to predictions from regional and/or global models. To date, the bulk of the research relating local places to global climate change has been top-down, from the global toward the local, concentrating on impact-analysis methods that take as a starting point climate change scenarios derived from global models, even though these have little regional or local specificity. Thus, our focus is to understand such trends over southern Alberta, which will enable decision makers, scientists, the research community, and local people to adapt their policies to local temperature variations and to act accordingly. The specific objectives of this study are: (i) to understand the local warming (temperature in particular) trend relative to temperature normals during the period 1961-2010 at point locations using meteorological data; (ii) to validate the data using specific yearly data; and (iii) to delineate the spatial extent of the local warming trends and understand the influential factors so that local governments can adapt. Existing data provide evidence of such changes, and future research will emphasize validating this hypothesis using remotely sensed data (i.e., MODIS products from NASA).
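The point-location trend in objective (i) amounts to fitting a line to an annual temperature series. A hedged sketch with synthetic data (not the Alberta station records used in the study):

```python
import numpy as np

# Synthetic annual mean temperatures for 1961-2010: a 0.02 °C/yr warming
# signal plus noise. Real analyses would use station or MODIS-derived data.
rng = np.random.default_rng(0)
years = np.arange(1961, 2011)
temps = 3.5 + 0.02 * (years - 1961) + rng.normal(0.0, 0.3, years.size)

# Least-squares linear trend; the slope is the local warming rate.
slope, intercept = np.polyfit(years, temps, 1)
print(f"trend: {slope * 10:.2f} °C per decade")
```

Comparing such station-level slopes against the trend implied by regional or global model output is one simple way to quantify the "local warming" departure the hypothesis posits.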

Keywords: local warming, climate change, urban area, Alberta, Canada

Procedia PDF Downloads 321
24142 The Systems Biology Verification Endeavor: Harness the Power of the Crowd to Address Computational and Biological Challenges

Authors: Stephanie Boue, Nicolas Sierro, Julia Hoeng, Manuel C. Peitsch

Abstract:

Systems biology relies on large numbers of data points and sophisticated methods to extract biologically meaningful signal and mechanistic understanding. For example, analyses of transcriptomics and proteomics data enable researchers to gain insights into the molecular differences between tissues exposed to diverse stimuli or test items. Whereas the interpretation of endpoints that specifically measure a mechanism is relatively straightforward, the interpretation of big data is more complex and benefits from comparing results obtained with diverse analysis methods. The sbv IMPROVER project was created to implement solutions for verifying systems biology data, methods, and conclusions. Computational challenges leveraging the wisdom of the crowd allow the benchmarking of methods for specific tasks, such as signature extraction and/or sample classification. Four challenges have already been conducted successfully and confirmed that the aggregation of predictions often leads to better results than individual predictions, and that methods perform best in specific contexts. Whenever the scientific question of interest has no gold standard but would greatly benefit from the scientific community coming together to discuss approaches and results, datathons are set up. The inaugural sbv IMPROVER datathon was held in Singapore on 23-24 September 2016. It allowed bioinformaticians and data scientists to consolidate their ideas and work on the most promising methods as teams, after having initially reflected on the problem on their own. The outcome is a set of visualization and analysis methods that will be shared with the scientific community via the Garuda platform, an open connectivity platform that provides a framework to navigate through different applications, databases, and services in biology and medicine.
We will present the results obtained when analyzing the data with our network-based method, and introduce a datathon that will take place in Japan to encourage analysis of the same datasets with other methods, allowing conclusions to be consolidated.
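The observation that aggregated predictions beat individual ones can be illustrated with a toy simulation (synthetic binary predictions, not sbv IMPROVER challenge data): a majority vote over several independent, individually weak predictors is markedly more accurate than any one of them.

```python
import numpy as np

rng = np.random.default_rng(42)
truth = rng.integers(0, 2, size=1000)          # binary ground truth
n_predictors, accuracy = 11, 0.65              # weak but independent voters

# Each predictor answers correctly with probability `accuracy`.
correct = rng.random((n_predictors, truth.size)) < accuracy
preds = np.where(correct, truth, 1 - truth)

individual = correct.mean(axis=1)              # per-predictor accuracy ~0.65
majority = (preds.mean(axis=0) > 0.5).astype(int)
ensemble_acc = (majority == truth).mean()      # majority vote, much higher
print(individual.mean(), ensemble_acc)
```

The effect depends on the predictors' errors being at least partly independent, which is exactly what a crowd of diverse analysis methods provides.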

Keywords: big data interpretation, datathon, systems toxicology, verification

Procedia PDF Downloads 267
24141 Scalable Learning of Tree-Based Models on Sparsely Representable Data

Authors: Fares Hedayatit, Arnauld Joly, Panagiotis Papadimitriou

Abstract:

Many machine learning tasks, such as text annotation, usually require training over very big datasets, e.g., millions of web documents, that can be represented in a sparse input space. State-of-the-art tree-based ensemble algorithms cannot scale to such datasets, since they include operations whose running time is a function of the input space size rather than of the number of non-zero input elements. In this paper, we propose an efficient splitting algorithm that leverages input sparsity within decision tree methods. Our algorithm improves training time on sparse datasets by more than two orders of magnitude and has been incorporated into the current version of scikit-learn, the most popular open-source Python machine learning library.
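Because the splitting algorithm ships in scikit-learn, the use case it targets can be sketched directly: fitting a tree ensemble on a scipy sparse matrix without ever densifying it (synthetic data here, sized well below the paper's web-scale corpora):

```python
import numpy as np
from scipy.sparse import random as sparse_random
from sklearn.ensemble import RandomForestClassifier

# A 500 x 10,000 CSR matrix at 0.1% density: ~10 non-zeros per row,
# mimicking a bag-of-words text representation.
rng = np.random.RandomState(0)
X = sparse_random(500, 10_000, density=0.001, format="csr", random_state=rng)
y = rng.randint(0, 2, size=500)

# fit() accepts the sparse matrix directly; split search touches only
# non-zero entries rather than the full 5M-cell input space.
clf = RandomForestClassifier(n_estimators=10, random_state=0)
clf.fit(X, y)
print(clf.score(X, y))
```

With a dense representation the same matrix would occupy ~40 MB and every split would scan all 10,000 feature values per sample; the sparse path scales with the roughly 5,000 stored non-zeros instead.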

Keywords: big data, sparsely representable data, tree-based models, scalable learning

Procedia PDF Downloads 247
24140 Polyvinyl Alcohol Incorporated with Hibiscus Extract Microcapsules as Combined Active and Intelligent Composite Film for Meat Preservation: Antimicrobial, Antioxidant, and Physicochemical Investigations

Authors: Ahmed F. Ghanem, Marwa I. Wahba, Asmaa N. El-Dein, Mohamed A. EL-Raey, Ghada E. A. Awad

Abstract:

Numerous attempts are being made to formulate suitable packaging materials for meat products. However, to the best of our knowledge, the incorporation of free hibiscus extract or its microcapsules into a pure polyvinyl alcohol (PVA) matrix as a packaging material for meat is seldom reported. Therefore, this study aims at protecting the aqueous crude extract of hibiscus flowers using the spray-drying encapsulation technique. Results from Fourier transform infrared spectroscopy (FTIR), scanning electron microscopy (SEM), and particle size analysis confirmed, respectively, the successful formation of the assembled capsules via strong interactions, the spherical rough microparticles, and a particle size of ~235 nm. The obtained microcapsules also enjoy higher thermal stability than the free extract. The spray-dried particles were then incorporated into the casting solution of the pure PVA film at a concentration of 10 wt. %. The segregated free-standing composite films were investigated, compared to the neat matrix, with several characterization techniques such as FTIR, SEM, thermal gravimetric analysis (TGA), mechanical testing, contact angle, water vapor permeability, and oxygen transmission. The results demonstrated variations in the physicochemical properties of the PVA film after inclusion of the free extract and the extract microcapsules. Moreover, biological studies emphasized the biocidal potential of the hybrid films against the microorganisms contaminating meat. Specifically, the microcapsules imparted not only antimicrobial but also antioxidant activity to the PVA matrix. Application of the prepared films to real meat samples displayed low bacterial growth with only a slight increase in pH over a storage time of up to 10 days at 4 °C, further evidence of meat safety.
Moreover, the colors of the films did not change significantly until after 21 days, indicating spoilage of the meat samples. The dual functionality of the prepared composite films paves the way towards combined active and smart food packaging applications, which would play a vital role in food hygiene, including quality control and assurance.

Keywords: PVA, hibiscus, extraction, encapsulation, active packaging, smart and intelligent packaging, meat spoilage

Procedia PDF Downloads 77
24139 On Estimating the Low Income Proportion with Several Auxiliary Variables

Authors: Juan F. Muñoz-Rosas, Rosa M. García-Fernández, Encarnación Álvarez-Verdejo, Pablo J. Moya-Fernández

Abstract:

Poverty measurement is a very important topic in many studies in the social sciences. One of the most important indicators when measuring poverty is the low income proportion, which gives the proportion of a population classified as poor. This indicator is generally unknown, and for this reason it is estimated using survey data obtained from official surveys carried out by statistical agencies such as Eurostat. The main feature of such survey data is that they contain several variables. The variable used to estimate the low income proportion is called the variable of interest. The survey data may also contain several additional variables, known as auxiliary variables, related to the variable of interest, and in this situation they can be used to improve the estimation of the low income proportion. In this paper, we use Monte Carlo simulation studies to analyze numerically the performance of estimators based on several auxiliary variables. In this simulation study, we considered real datasets obtained from the 2011 European Union Statistics on Income and Living Conditions. Results derived from this study indicate that the estimators based on auxiliary variables are more accurate than the naive estimator.
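The kind of Monte Carlo comparison described can be sketched with synthetic incomes (not the EU-SILC data) and a single auxiliary variable: a difference (regression-type) estimator that exploits an auxiliary indicator with known population mean is pitted against the naive sample proportion.

```python
import numpy as np

rng = np.random.default_rng(1)
N, n, sims = 100_000, 500, 2000

# Synthetic population: lognormal incomes, relative poverty line at 60%
# of the median (a common convention; illustrative only).
income = rng.lognormal(mean=10.0, sigma=0.6, size=N)
line = 0.6 * np.median(income)
poor = (income < line).astype(float)        # variable of interest
aux = (income < 1.2 * line).astype(float)   # auxiliary indicator, mean known
P, X_bar = poor.mean(), aux.mean()

naive_err, aux_err = [], []
for _ in range(sims):
    idx = rng.choice(N, size=n, replace=False)   # simple random sample
    p, x = poor[idx].mean(), aux[idx].mean()
    b = np.cov(poor[idx], aux[idx])[0, 1] / np.var(aux[idx], ddof=1)
    naive_err.append((p - P) ** 2)
    aux_err.append((p + b * (X_bar - x) - P) ** 2)

# The auxiliary-based estimator should show the lower Monte Carlo MSE.
print(np.mean(naive_err), np.mean(aux_err))
```

The gain grows with the correlation between the auxiliary variable and the poverty indicator, which is why several well-chosen auxiliary variables can improve the estimate further.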

Keywords: inclusion probability, poverty, poverty line, survey sampling

Procedia PDF Downloads 438
24138 TessPy – Spatial Tessellation Made Easy

Authors: Jonas Hamann, Siavash Saki, Tobias Hagen

Abstract:

Discretization of urban areas is a crucial aspect of many spatial analyses. The process of discretizing space into subspaces without overlaps or gaps is called tessellation. It helps in understanding spatial structure and provides a framework for analyzing geospatial data. Tessellation methods can be divided into two groups: regular and irregular tessellations. While regular tessellation methods, such as square grids or hexagonal grids, are suitable for addressing purely geometric problems, they cannot take the unique characteristics of different subareas into account. Irregular tessellation methods, however, allow the borders between subareas to be defined more realistically based on urban features such as the road network or Points of Interest (POI). Even though Python is one of the most used programming languages for spatial analysis, there is currently no library that combines different tessellation methods to enable users and researchers to compare techniques. To close this gap, we propose TessPy, an open-source Python package that combines all the above-mentioned tessellation methods and makes them easily accessible to everyone. The core functions of TessPy implement five tessellation methods: squares, hexagons, adaptive squares, Voronoi polygons, and city blocks. With the regular methods, users can set the resolution of the tessellation, which defines the fineness of the discretization and the desired number of tiles. The irregular tessellation methods allow users to define which spatial data to consider (e.g., amenity, building, office) and how fine the tessellation should be. The spatial data used are open-source and provided by OpenStreetMap, and can easily be extracted and used for further analyses. Besides the methodology of the different techniques, the state of the art, including examples and future work, will be discussed.
All dependencies can be installed using conda or pip; conda is recommended.
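One of the irregular methods listed, Voronoi polygons, can be illustrated without TessPy itself (whose exact API the abstract does not show) using scipy directly; the seed points here are random stand-ins for the POI coordinates TessPy would pull from OpenStreetMap:

```python
import numpy as np
from scipy.spatial import Voronoi

# 50 random seed points in a 10 x 10 area; in TessPy's workflow these
# would be OSM features (e.g., amenities) inside the target city polygon.
rng = np.random.default_rng(7)
seeds = rng.uniform(0.0, 10.0, size=(50, 2))

# Each seed gets the region of space closer to it than to any other seed,
# yielding an irregular, gap-free, overlap-free tessellation.
vor = Voronoi(seeds)
print(len(vor.point_region))   # one Voronoi region per seed point
```

Denser POI clusters thus automatically produce smaller tiles, which is the adaptivity that regular square or hexagon grids cannot provide.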

Keywords: geospatial data science, geospatial data analysis, tessellations, urban studies

Procedia PDF Downloads 112
24137 A CFD Analysis of Hydraulic Characteristics of the Rod Bundles in the BREST-OD-300 Wire-Spaced Fuel Assemblies

Authors: Dmitry V. Fomichev, Vladimir V. Solonin

Abstract:

This paper presents the findings from a numerical simulation of the flow in 37-rod fuel assembly models spaced by double-wire trapezoidal wrapping, as applied to the BREST-OD-300 experimental nuclear reactor. Data on the static pressure distribution within the models and equations for determining the fuel bundle flow friction factors have been obtained. Recommendations are provided on using the turbulence closure models available in ANSYS Fluent. A comparative analysis has been performed against the existing empirical equations for determining the flow friction factors, and the agreement between the calculated and experimental data has been shown. An analysis of the experimental data and the results of the numerical simulation of the BREST-OD-300 fuel rod assembly hydrodynamic performance is presented.
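As a hedged illustration of the kind of comparison reported (the paper's bundle-specific equations are not reproduced here), a CFD-derived friction factor is typically checked against an empirical correlation over a range of Reynolds numbers; the classical smooth-channel Blasius form is used below purely as a familiar stand-in:

```python
def blasius(re: float) -> float:
    """Blasius smooth-channel correlation: f = 0.316 * Re^-0.25,
    valid roughly for turbulent flow with Re up to ~1e5."""
    return 0.316 * re ** -0.25

# Tabulate the empirical friction factor at a few Reynolds numbers;
# a CFD result would be compared point-by-point against such values.
for re in (1e4, 5e4, 1e5):
    print(f"Re = {re:8.0f}  f = {blasius(re):.4f}")
```

Wire-wrapped bundles require geometry-specific correlations rather than Blasius, which is precisely why deriving and validating bundle-specific friction factor equations, as done in the paper, is necessary.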

Keywords: BREST-OD-300, wire-spaced, fuel assembly, computational fluid dynamics

Procedia PDF Downloads 365
24136 A Simulation Study for Potential Natural Gas Liquids Recovery Processes under Various Upstream Conditions

Authors: Mesfin Getu Woldetensay

Abstract:

Representative and commercially viable natural gas liquids (NGL) recovery processes were studied under various feed conditions, classified as lean and rich. The conventional turbo-expander process scheme (ISS) was taken as the base case. The performance of this scheme was compared with the gas subcooled process (GSP), the cold residue recycle (CRR) and recycle split-vapor (RSV) processes, the enhanced NGL recovery process (IPSI-1), and the enhanced NGL recovery process with internal refrigeration (IPSI-2). The modifications in the GSP, CRR, and RSV schemes are at the top section of the demethanizer column, whereas the IPSI-1 and IPSI-2 improvements focus on the lower section. HYSYS process flowsheets were initially developed for all the processes, including the ISS, under common criteria so that performance could be compared. Accordingly, a number of simulation runs were made for the selected eight feed types. Results show that the reboiler duty requirement with rich feeds is quite high for GSP, CRR, and RSV compared to IPSI-1 and IPSI-2. The latter two show relatively lower duty due to a self-refrigeration system that allows the inlet feed itself to provide cooling, without the need for a propane refrigerant. The energy consumption for lean feeds is much lower than that for rich feeds in all process schemes.

Keywords: composition, lean, rich, duty

Procedia PDF Downloads 201
24135 The Influence of Organic Waste on Vegetable Nutritional Components and Healthy Livelihood, Minna, Niger State, Nigeria

Authors: A. Abdulkadir, A. A. Okhimamhe, Y. M. Bello, H. Ibrahim, D. H. Makun, M. T. Usman

Abstract:

Household waste forms a large proportion of the waste generated across the state; the accumulation of organic waste is an apparent problem, and the existing dump sites could become overstressed. Niger State has abundant arable land and water resources and thus should be one of the highest producers of agricultural crops in the country. However, a major challenge to the agricultural sector today is the loss of soil nutrients, coupled with the high cost of fertilizer. These factors have continued to increase the use of fertilizer and decomposed solid waste for enhancing agricultural yield, which has varying effects on the soil as well as posing a threat to human livelihoods. Consequently, vegetable yield samples from poultry droppings, decomposed household waste manure, NPK treatments, and a control from each replication were subjected to proximate analysis to determine nutritional and anti-nutritional components as well as heavy metal concentrations. The data collected were analyzed using SPSS software, and randomized complete block design means were compared. The results show that none of the treatments reduced the concentration of any nutritional component, while the anti-nutritional analysis showed that NPK produced a higher oxalate content than the control and the organic treatments. The concentrations of lead and cadmium are within safe permissible levels, while the mercury level exceeded the FAO/WHO maximum permissible limit for all treatments, indicating the need for urgent intervention to minimize mercury levels in soil and manure in order to mitigate its toxic effects. Thus, eco-agriculture should be widely accepted and promoted by stakeholders for soil amendment, higher yields, sustainable environmental protection, food security, poverty eradication, the attainment of sustainable development, and healthy livelihoods.

Keywords: anti-nutritional, healthy livelihood, nutritional waste, organic waste

Procedia PDF Downloads 367
24134 Analysis of Lead Time Delays in Supply Chain: A Case Study

Authors: Abdel-Aziz M. Mohamed, Nermeen Coutry

Abstract:

Lead time is an important measure of supply chain performance. It impacts both customer satisfaction and the total cost of inventory. This paper presents the results of a study analyzing customer order lead time for a multinational company. In the study, the lead time was divided into three stages: order entry, order fulfillment, and order delivery. A sample of 2,425 order lines from the company's records was considered for this study. The sample data include information regarding customer orders from the time of order entry until order delivery, as well as the lead time of each stage for different orders. Summary statistics on the lead time data reveal that about 30% of the orders were delivered after the scheduled due date. Multiple linear regression analysis revealed that component type, logistics parameters, order size, and customer type have a significant impact on lead time. Analysis of the lead time stages indicates that stage 2 consumes over 50% of the lead time. A Pareto analysis was performed to study the reasons for customer order delays in each of the three stages, and recommendations were given to resolve the problem.
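The Pareto step can be sketched compactly: rank delay causes by frequency and accumulate their shares until the vital few that explain ~80% of delays are identified. The cause categories and counts below are hypothetical, not the company's records.

```python
from collections import Counter

# Hypothetical tally of root causes for late order lines.
delays = Counter({
    "credit hold": 310,
    "stockout": 240,
    "data entry error": 95,
    "carrier delay": 70,
    "address issue": 35,
})
total = sum(delays.values())

# Walk causes in descending order, accumulating their share of delays,
# and stop once the "vital few" covering ~80% have been listed.
cum = 0.0
for cause, count in delays.most_common():
    cum += count / total
    print(f"{cause:18s} {count:4d}  cumulative {cum:5.1%}")
    if cum >= 0.8:
        break
```

Applied per stage, as in the study, this highlights where corrective actions (here, e.g., credit-hold and stockout handling) would remove the bulk of the delay.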

Keywords: lead time reduction, customer satisfaction, service quality, statistical analysis

Procedia PDF Downloads 710
24133 A Unified Approach for Digital Forensics Analysis

Authors: Ali Alshumrani, Nathan Clarke, Bogdan Ghite, Stavros Shiaeles

Abstract:

Digital forensics has become an essential tool in the investigation of cyber and computer-assisted crime. Arguably, given the prevalence of technology and the digital footprints that consequently exist, it could have a significant role across almost all crimes. However, the variety of technology platforms (such as computers, mobiles, Closed-Circuit Television (CCTV), Internet of Things (IoT) devices, databases, drones, and cloud computing services), the heterogeneity and volume of data, forensic tool capability, and the investigative cost make investigations both technically challenging and prohibitively expensive. Forensic tools also tend to be siloed into specific technologies, e.g., File System Forensic Analysis Tools (FS-FAT) and Network Forensic Analysis Tools (N-FAT), and many data sources have little to no specialist forensic tooling. Increasingly, it also becomes essential to compare and correlate evidence across data sources, and to do so in an efficient and effective manner that enables an investigator to answer high-level questions of the data in a timely manner without having to trawl through the data and perform the correlation manually. This paper proposes a Unified Forensic Analysis Tool (U-FAT), which aims to establish a common language for electronic information and permit multi-source forensic analysis. Core to this approach is the identification and development of forensic analyses that automate complex data correlations, enabling investigators to investigate cases more efficiently. The paper presents a systematic analysis of major crime categories and identifies which forensic analyses could be used.
For example, in a child abduction, an investigation team might have evidence from a range of sources, including computing devices (mobile phone, PC), CCTV (potentially in large numbers), ISP records, and mobile network cell tower data, in addition to third-party databases such as the National Sex Offender Registry and tax records, with the desire to auto-correlate across sources and visualize the results in a cognitively effective manner. U-FAT provides a holistic, flexible, and extensible approach to digital forensics in a technology-, application-, and data-agnostic manner, providing powerful and automated forensic analysis.
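The simplest correlation U-FAT would automate is temporal: normalizing timestamps from heterogeneous sources into one ordered timeline. A toy sketch with hypothetical records (the source names and events are invented for illustration, not drawn from the paper):

```python
from datetime import datetime

# Hypothetical per-source evidence: (timestamp string, description).
sources = {
    "phone": [("2016-05-01 14:02", "call to suspect")],
    "cctv":  [("2016-05-01 14:05", "vehicle leaves car park")],
    "isp":   [("2016-05-01 13:58", "login from home IP")],
}

# Parse every record into a (datetime, source, description) tuple and
# sort, producing a single cross-source investigative timeline.
events = sorted(
    (datetime.strptime(ts, "%Y-%m-%d %H:%M"), src, desc)
    for src, recs in sources.items()
    for ts, desc in recs
)
for ts, src, desc in events:
    print(ts, src, desc)
```

Real tooling must additionally reconcile time zones, clock drift, and entity identities (the same person appearing as a phone number, an IP address, and a face), which is where the proposed common language for electronic information comes in.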

Keywords: digital forensics, evidence correlation, heterogeneous data, forensics tool

Procedia PDF Downloads 177