Search results for: large amounts of data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 29260

29050 End to End Monitoring in Oracle Fusion Middleware for Data Verification

Authors: Syed Kashif Ali, Usman Javaid, Abdullah Chohan

Abstract:

In large enterprises, different departments use different sorts of information systems and databases according to their needs. These systems are independent and heterogeneous in nature, and sharing information/data between them is not an easy task. The use of middleware technologies has made data sharing between systems much easier. However, monitoring the exchange of data/information between target and source systems for verification purposes is often complex or impossible for the maintenance department due to security/access privileges on the target and source systems. In this paper, we present our experience with an end-to-end data monitoring approach implemented at the middleware level in Oracle BPEL for data verification, without the help of any monitoring tool.

Keywords: service level agreement, SOA, BPEL, oracle fusion middleware, web service monitoring

Procedia PDF Downloads 454
29049 Investigation of Parameters Affecting Copper Recovery from Brass Melting Dross

Authors: Sercan Basit, Muhlis N. Sarıdede

Abstract:

Copper in various copper-based waste compounds has been successfully recovered by hydrometallurgical treatment methods in the literature. The X-ray diffraction pattern of brass melting slag demonstrates that it contains a sufficient amount of recoverable copper. In this study, the recovery of copper from brass melting dross by sulfuric acid leaching, and the effects of temperature, acid concentration, and oxidant concentration on the recovery rate of copper, have been investigated. Experiments were performed in a temperature-controlled reactor in sulfuric acid solutions of different molarities, using a solid-liquid ratio of 100 g/L and a leaching time of 300 min. Temperature was varied between 25 °C and 80 °C and molarity between 0.5 and 3 M. The results obtained showed that temperature has an important positive effect on recovery, whereas recovery decreases with time. In addition, copper was recovered from brass dross in larger amounts in the presence of H2O2 as an oxidant than when no oxidant was used.

Keywords: brass dross, copper recovery, hydrogen peroxide, leaching

Procedia PDF Downloads 299
29048 Friend or Foe: Decoding the Legal Challenges Posed by Artificial Intelligence in the Era of Intellectual Property

Authors: Latika Choudhary

Abstract:

“The potential benefits of Artificial Intelligence are huge. So are the dangers.” - Dave Water. Artificial intelligence is one of the facets of the information technology domain which, despite several attempts, does not have a clear definition or ambit. However, it can be understood as technology that solves problems via automated decisions and predictions. Artificial intelligence is essentially an algorithm-based technology which analyses large amounts of data and then solves problems by detecting useful patterns. Owing to its automated features, it will not be wrong to say that humans and AI together have more utility than humans alone or computers alone. For many decades, AI experienced enthusiasm as well as setbacks, yet it has today become part and parcel of our everyday life, making it convenient or at times problematic. AI and related technologies intersect with intellectual property in multiple ways, the most important being AI technology for the management of intellectual property, IP for protecting AI, and IP as a hindrance to the transparency of AI systems. Thus, the relationship between the two is one of reciprocity, as IP influences AI and vice versa. While AI is a recent concept, the IP laws for protecting it, or even for dealing with its challenges, are relatively older, raising the need for revision to keep up with the pace of technological advancement. This paper will analyse the relationship between AI and IP to determine how beneficial or conflictual it is, address how old concepts of IP are being stretched to their maximum limits to accommodate the unwanted consequences of artificial intelligence, and propose ways to mitigate the situation so that AI remains the friend it is and does not turn into the potential foe it appears to be.

Keywords: intellectual property rights, information technology, algorithm, artificial intelligence

Procedia PDF Downloads 61
29047 Exploring Teachers’ Beliefs about Diagnostic Language Assessment Practices in a Large-Scale Assessment Program

Authors: Oluwaseun Ijiwade, Chris Davison, Kelvin Gregory

Abstract:

In Australia, as in other parts of the world, the debate on how to enhance teachers' use of assessment data to inform the teaching and learning of English as an Additional Language (EAL, Australia) or English as a Foreign Language (EFL, United States) has occupied the centre of academic scholarship. Traditionally, this approach was conceptualised as ‘Formative Assessment’ and, in recent times, as ‘Assessment for Learning (AfL)’. The central problem is that teacher-made tests are limited in providing data that can inform teaching and learning due to the variability of classroom assessments, which are hindered by teachers' characteristics and assessment literacy. To address this concern, scholars in language education and testing have proposed uniform large-scale computer-based assessment programs to meet the needs of teachers and promote AfL in language education. In Australia, for instance, the Victorian state government commissioned a large-scale project called 'Tools to Enhance Assessment Literacy (TEAL) for Teachers of English as an Additional Language'. As part of the TEAL project, a tool called 'Reading and Vocabulary assessment for English as an Additional Language (RVEAL)', a diagnostic language assessment (DLA), was developed by language experts at the University of New South Wales for teachers in Victorian schools to guide EAL pedagogy in the classroom. Therefore, this study aims to provide qualitative evidence for understanding beliefs about DLA among EAL teachers in primary and secondary schools in Victoria, Australia. To realize this goal, the study raises the following questions: (a) How do teachers use large-scale assessment data for diagnostic purposes? (b) What skills do language teachers think are necessary for using assessment data for instruction in the classroom? and (c) What factors, if any, contribute to teachers' beliefs about diagnostic assessment in a large-scale assessment?
A semi-structured interview method was used to collect data from at least 15 professional teachers, selected through purposeful sampling. The findings from the resulting thematic analysis provide an understanding of teachers' beliefs about DLA in a classroom context and identify how these beliefs are crystallised in language teachers. The discussion shows how the findings can be used to inform professional development processes for language teachers, as well as highlighting the important factor of teacher cognition in the pedagogic processes of language assessment. This, hopefully, will help test developers and testing organisations align the outcomes of this study with their test development processes to design assessments that can enhance AfL in language education.

Keywords: beliefs, diagnostic language assessment, English as an additional language, teacher cognition

Procedia PDF Downloads 172
29046 The Challenges of Teaching First Year Accounting with a Lecturer-Student Ratio of 1:1248

Authors: Hanli Joubert

Abstract:

In South Africa, teaching large classes is a reality that lecturers face in most higher education institutions. When discussing teaching in large classes, the literature normally refers to groups of about 50 to 500 students. At the University of the Free State, the first-year accounting group comprises around 1300 students. Apart from the extremely large class, the problem is exacerbated by the diversity of students' previous schooling in accounting as well as their socio-economic backgrounds. The university scenario is further complicated by a lack of venues, compressed timetables, and a lack of resources. This study investigates the challenges and effectiveness of teaching a large and diverse group of first-year accounting students by drawing on personal experience, a literature study, and interviews with other lecturers as well as students registered for first-year accounting. The results reveal that teaching first-year accounting students in a large group is not the ideal situation, but that it can be effective if managed correctly.

Keywords: diverse backgrounds, large groups, limited resources, first-year accounting students

Procedia PDF Downloads 26
29045 Big Data: Appearance and Disappearance

Authors: James Moir

Abstract:

The mainstay of Big Data is prediction, in that it allows practitioners, researchers, and policy analysts to predict trends based upon the analysis of large and varied sources of data. These trends can range from changing social and political opinions to patterns in crime and consumer behaviour. Big Data has therefore shifted the criterion of success in science from causal explanation to predictive modelling and simulation. Nineteenth-century science sought to capture phenomena and to show their appearance through causal mechanisms, while twentieth-century science attempted to save the appearance and relinquish causal explanations. Now twenty-first-century science, in the form of Big Data, is concerned with the prediction of appearances and nothing more. However, this pulls social science back in the direction of a more rule- or law-governed model of reality and away from a consideration of the internal nature of rules in relation to various practices. In effect, Big Data offers us no more than a world of surface appearance, and in doing so it makes any context-specific conceptual sensitivity disappear.

Keywords: big data, appearance, disappearance, surface, epistemology

Procedia PDF Downloads 386
29044 Learning Analytics in a HiFlex Learning Environment

Authors: Matthew Montebello

Abstract:

Student engagement within a virtual learning environment generates masses of data points that can significantly contribute to the learning analytics that lead to decision support. Ideally, similar data is collected during student interaction with a physical learning space, and as a consequence, data is present at a large scale even in relatively small classes. In this paper, we report on such an occurrence during classes held in a HiFlex modality as we investigate the advantages of adopting such a methodology. We plan to take full advantage of the learner-generated data in an attempt to further enhance the effectiveness of the adopted learning environment. This could shed crucial light on the operating modalities that higher education institutions around the world will switch to in a post-COVID era.

Keywords: HiFlex, big data in higher education, learning analytics, virtual learning environment

Procedia PDF Downloads 172
29043 Data, Digital Identity and Antitrust Law: An Exploratory Study of Facebook’s Novi Digital Wallet

Authors: Wanjiku Karanja

Abstract:

Facebook has monopoly power in the social networking market. It has grown and entrenched its monopoly power through the capture of its users’ data value chains. However, antitrust law’s consumer welfare roots have prevented it from effectively addressing the role of data capture in Facebook’s market dominance. These regulatory blind spots are augmented in Facebook’s proposed Diem cryptocurrency project and its Novi digital wallet. Novi, which is Diem’s digital identity component, shall enable Facebook to collect an unprecedented volume of consumer data. Consequently, Novi has seismic implications for internet identity, as the network effects of Facebook’s large user base could establish it as the de facto internet identity layer. Moreover, the large tracts of data Facebook shall collect through Novi shall further entrench Facebook's market power. As such, the attendant lock-in effects of this project shall be very difficult to reverse. Urgent regulatory action is therefore required to prevent this expansion of Facebook’s data resources and monopoly power. This research thus highlights the importance of data capture to competition and market health in the social networking industry. It utilizes interviews with key experts to empirically interrogate the impact of Facebook’s data capture, and its control of its users’ data value chains, on its market power. This inquiry is contextualized against Novi’s expansive effect on Facebook’s data value chains. It thus addresses the novel antitrust issues arising at the nexus of Facebook’s monopoly power and the privacy of its users’ data. It also explores the impact of platform design principles, specifically data interoperability and data portability, in mitigating Facebook’s anti-competitive practices. As such, this study finds that Facebook is a powerful monopoly that dominates the social media industry to the detriment of potential competitors.
Facebook derives its power from its size, its annexure of the consumer data value chain, and its control of its users’ social graphs. Additionally, the platform design principles of data interoperability and data portability are not a panacea for restoring competition in the social networking market. Their success depends on the establishment of robust technical standards and regulatory frameworks.

Keywords: antitrust law, data protection law, data portability, data interoperability, digital identity, Facebook

Procedia PDF Downloads 99
29042 Functionalized Carbon-Based Fluorescent Nanoparticles for Targeted Analysis of Emerging Contaminants

Authors: Alexander Rodríguez-Hernández, Arnulfo Rojas-Perez, Liz Diaz-Vazquez

Abstract:

The rise in consumerism over the past century has resulted in the creation of greater amounts of plasticizers, personal care products, and other chemical substances, which enter and accumulate in water systems. In Neotropical regions, large inputs of nutrients accompany these pollutants, resulting in eutrophication of the water, which consumes large quantities of oxygen and leads to high fish mortality. This dilemma has created a need for the development of targeted detection in complex matrices and for the remediation of emerging contaminants. We have synthesized carbon nanoparticles from a macroalga (Ulva fasciata) by oxidizing the graphitic carbon network under extremely acidic conditions. The resulting material was characterized by STEM, yielding spherical nanoparticles with a 12 nm average diameter, which can be fixed into a polysaccharide aerogel synthesized from the same macroalga. Spectrophotometric analyses show a pH-dependent fluorescent behavior varying from 450-620 nm in aqueous media. Heavily oxidized edges provide for easy functionalization with enzymes for a more targeted analysis and remediation technique. Given the optical properties of the carbon-based nanoparticles and the numerous possibilities for functionalization, we have developed a selective and robust targeted bio-detection and bioremediation technique for the treatment of emerging contaminants in complex matrices such as estuarine embayments.

Keywords: aerogels, carbon nanoparticles, fluorescent, targeted analysis

Procedia PDF Downloads 216
29041 Complete Enumeration Approach for Calculation of Residual Entropy for Diluted Spin Ice

Authors: Yuriy A. Shevchenko, Konstantin V. Nefedev

Abstract:

We consider antiferromagnetic systems of Ising spins located at the sites of hexagonal, triangular, and pyrochlore lattices. Such systems can be diluted to a certain concentration level by randomly replacing the magnetic spins with nonmagnetic ones. Quite recently, we calculated the density of states (DOS) of such systems by the Wang-Landau method. Based on the obtained data, we calculated the dependence of the residual entropy (the entropy at a temperature tending to zero) on the dilution concentration for quite large systems (more than 2000 spins). In the current study, we obtained the same data for small systems (fewer than 20 spins) by a complete enumeration of all possible magnetic configurations and compared the result with the result for large systems. The shape of the curve remains unchanged in both cases, but the specific values of the residual entropy differ because of the finite-size effect.
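The complete-enumeration idea can be illustrated with a toy example. The sketch below (not the authors' code) exhaustively enumerates all 2^N configurations of a tiny antiferromagnetic Ising cluster, finds the ground-state energy, counts its degeneracy W, and reports the residual entropy per spin S = ln(W)/N with k_B = 1; the frustrated triangle is a standard minimal case.

```python
from itertools import product
import math

def residual_entropy(n_spins, bonds):
    """Enumerate all 2^N Ising configurations, find the ground-state
    energy, and count its degeneracy W. Residual entropy per spin is
    S = ln(W) / N (k_B = 1)."""
    best_e, count = None, 0
    for spins in product((-1, 1), repeat=n_spins):
        # antiferromagnetic pair energy: +J * s_i * s_j with J = 1
        e = sum(spins[i] * spins[j] for i, j in bonds)
        if best_e is None or e < best_e:
            best_e, count = e, 1
        elif e == best_e:
            count += 1
    return count, math.log(count) / n_spins

# single antiferromagnetic triangle: 6 of the 8 states are ground states
w, s = residual_entropy(3, [(0, 1), (1, 2), (0, 2)])
```

For larger clusters the 2^N cost explodes quickly, which is exactly why the abstract restricts complete enumeration to fewer than 20 spins and uses Wang-Landau sampling beyond that.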

Keywords: entropy, pyrochlore, spin ice, Wang-Landau algorithm

Procedia PDF Downloads 235
29040 Evaluation of the Potential of Olive Pomace Compost for Using as a Soil Amendment

Authors: M. Černe, I. Palčić, D. Anđelini, D. Cvitan, N. Major, M. Lukić, S. Goreta Ban, D. Ban, T. Rijavec, A. Lapanje

Abstract:

Context: In the Mediterranean basin, large quantities of lignocellulosic by-products, such as olive pomace (OP), are generated during olive processing on an annual basis. Due to the phytotoxic nature of OP, composting is recommended for its stabilisation to produce an end-product safe for agricultural use. Research Aim: This study aims to evaluate the applicability of olive pomace compost (OPC) as a soil amendment by considering its physical and chemical characteristics and microbiological parameters. Methodology: The OPC samples were collected from the surface and depth layers of the compost pile after 8 months. The samples were analyzed for their C/N ratio, pH, EC, total phenolic content, residual oils, and elemental content, as well as for colloidal properties and microbial community structure. The specific analytical approaches used are detailed in the poster. Findings: The results showed that the pH of OPC ranged from 7.8 to 8.6, while the electrical conductivity ranged from 770 to 1608 mS/cm. The levels of nitrogen (N), phosphorus (P), and potassium (K) varied within the ranges of 1.5 to 27.2 g/kg d.w., 1.6 to 1.8 g/kg d.w., and 6.5 to 7.5 g/kg d.w., respectively. The contents of potentially toxic metals such as chromium (Cr), copper (Cu), nickel (Ni), lead (Pb), and zinc (Zn) were below the EU limits for soil improvers. The microbial structure follows the gradient from the outer to the innermost layer, with relatively low amounts of DNA. This gradient nature shows that better composting strategies, surpassing the conventional approach, need to be developed. However, the low amounts of total phenols and oil residues indicated efficient biodegradation during composting. The carbon-to-nitrogen (C/N) ratio, within the range of 13 to 16, suggested that OPC can be used as a soil amendment. Overall, the study suggests that composting can be a promising strategy for environmentally friendly OP recycling.
Theoretical Importance: This study contributes to the understanding of the use of OPC as a soil amendment and its potential benefits in resource recycling and in reducing environmental burdens. It also highlights the need for improved composting strategies to optimize the process. Data Collection and Analysis Procedures: The OPC samples were taken from the compost pile and characterised for selected chemical, physical, and microbial parameters. The specific analytical procedures utilized are described in detail in the poster. Question Addressed: This study addresses the question of whether composting can be optimized to improve the biodegradation of OP. Conclusion: The study concludes that OPC has the potential to be used as a soil amendment due to its favorable physical and chemical characteristics, low levels of potentially toxic metals, and efficient biodegradation during composting. However, the results also suggest the need for improved composting strategies to improve the quality of OPC.

Keywords: olive pomace compost, waste valorisation, agricultural use, soil amendment

Procedia PDF Downloads 34
29039 Internal and External Overpressure Calculation for Vented Gas Explosion by Using a Combined Computational Fluid Dynamics Approach

Authors: Jingde Li, Hong Hao

Abstract:

Recent oil and gas accidents have reminded us of the severe consequences of gas explosions in terms of structural damage and financial loss. In order to protect structures and personnel, engineers and researchers have been working on numerous explosion mitigation methods. Among these, venting is the most economical approach to mitigating gas explosion overpressure. In this paper, venting is used as the overpressure alleviation method. A theoretical method and a numerical technique are presented to predict the internal and external pressure from a vented gas explosion in a large enclosure. Under idealized conditions, a number of experiments are used to calibrate the accuracy of the theoretically calculated data, and good agreement between the theoretical results and experimental data is seen. However, for realistic scenarios, the theoretical method over-estimates internal pressures and is incapable of predicting external pressures. Therefore, a CFD simulation procedure is proposed in this study to estimate both the internal and external overpressure from a large-scale vented explosion. Satisfactory agreement between the CFD simulation results and experimental data is achieved.

Keywords: vented gas explosion, internal pressure, external pressure, CFD simulation, FLACS, ANSYS Fluent

Procedia PDF Downloads 136
29038 Robust Barcode Detection with Synthetic-to-Real Data Augmentation

Authors: Xiaoyan Dai, Hsieh Yisan

Abstract:

Barcode processing of captured images is a huge challenge, as different shooting conditions can result in different barcode appearances. This paper proposes a deep learning-based barcode detection using synthetic-to-real data augmentation. We first augment barcodes themselves; we then augment images containing the barcodes to generate a large variety of data that is close to the actual shooting environments. Comparisons with previous works and evaluations with our original data show that this approach achieves state-of-the-art performance in various real images. In addition, the system uses hybrid resolution for barcode “scan” and is applicable to real-time applications.

Keywords: barcode detection, data augmentation, deep learning, image-based processing

Procedia PDF Downloads 131
29037 Implementation of CNV-CH Algorithm Using Map-Reduce Approach

Authors: Aishik Deb, Rituparna Sinha

Abstract:

We have developed an algorithm to detect abnormal segments (structural variations) in the genome across a number of samples. We have worked on simulated as well as real data from BAM files and have designed a segmentation algorithm in which abnormal segments are detected. This algorithm aims to improve the accuracy and performance of the existing CNV-CH algorithm. The next-generation sequencing (NGS) approach is very fast and can generate large sequences in a reasonable time, so the huge volume of sequence information gives rise to the need for Big Data and parallel approaches to segmentation. Therefore, we have designed a map-reduce version of the existing CNV-CH algorithm in which a large amount of sequence data can be segmented and structural variations in the human genome can be detected. We have compared the efficiency of the traditional and map-reduce algorithms with respect to precision, sensitivity, and F-score. The advantages of our algorithm are that it is fast and has better accuracy. The algorithm can be applied to detect structural variations within a genome, which in turn can be used to detect various genetic disorders such as cancer. Such defects may be caused by new mutations or changes to the DNA and generally result in abnormally high or low base coverage and quantification values.
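To make the map-reduce formulation concrete, here is a minimal, self-contained sketch (not the authors' implementation) of the general pattern: the map phase emits a (bin, count) pair per aligned read, the reduce phase sums counts per genomic bin, and bins with abnormally high or low coverage, the raw signal behind copy-number variations, are flagged. The bin size, thresholds, and read positions are illustrative assumptions.

```python
from collections import defaultdict

BIN_SIZE = 100  # hypothetical bin width in base pairs

def map_phase(read_positions):
    """Map: each aligned read start emits a (bin_id, 1) pair."""
    return [(pos // BIN_SIZE, 1) for pos in read_positions]

def reduce_phase(pairs):
    """Reduce: sum the counts per bin to obtain read coverage."""
    coverage = defaultdict(int)
    for bin_id, c in pairs:
        coverage[bin_id] += c
    return dict(coverage)

def flag_abnormal(coverage, low=0.5, high=2.0):
    """Flag bins whose coverage deviates strongly from the mean,
    a crude stand-in for the CNV segmentation step."""
    mean = sum(coverage.values()) / len(coverage)
    return {b: c for b, c in coverage.items()
            if c < low * mean or c > high * mean}

# toy read start positions: bin 1 is over-covered, bin 2 under-covered
reads = [5, 17, 42] + list(range(100, 151, 5)) + [205, 250]
cov = reduce_phase(map_phase(reads))
```

In a real Hadoop or Spark deployment, the map and reduce phases would run in parallel over shards of the BAM data; the point here is only the shape of the computation.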

Keywords: cancer detection, convex hull segmentation, map reduce, next generation sequencing

Procedia PDF Downloads 103
29036 Prediction of Fire Growth of the Office by Real-Scale Fire Experiment

Authors: Kweon Oh-Sang, Kim Heung-Youl

Abstract:

Estimating the engineering properties of fires is important in order to be prepared for the complex and varied fire risks of large-scale structures such as super-tall buildings, large stadiums, and multi-purpose structures. In this study, a mock-up of a compartment measuring 2.4 (L) x 3.6 (W) x 2.4 (H) meters was fabricated at the 10 MW LSC (Large Scale Calorimeter), and combustible office supplies were placed in the compartment for a real-scale fire test. The maximum heat release rate was 4.1 MW, and the total energy release obtained through the application of the t2 fire growth rate was 6705.9 MJ.
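For context, the t2 design-fire model referred to above takes the heat release rate as Q(t) = alpha*t^2 during the growth phase, so the energy released up to time t is the integral alpha*t^3/3. The sketch below illustrates the arithmetic with the standard 'fast' growth coefficient (alpha of roughly 0.0469 kW/s^2, an assumption, not a value from this test) and the 4.1 MW peak reported here; it is an illustrative calculation of the growth phase only, not a reconstruction of the authors' 6705.9 MJ total, which also covers the steady and decay phases.

```python
import math

def t_squared_hrr(alpha, t):
    """Heat release rate Q(t) = alpha * t**2 during fire growth (kW)."""
    return alpha * t ** 2

def time_to_peak(alpha, q_peak):
    """Time for a t^2 fire to reach a given HRR (seconds)."""
    return math.sqrt(q_peak / alpha)

def growth_phase_energy(alpha, t_end):
    """Energy released during growth: integral of alpha*t^2 = alpha*t^3/3 (kJ)."""
    return alpha * t_end ** 3 / 3

# 'fast' t^2 fire growing to a 4.1 MW (4100 kW) peak
alpha = 0.0469
t_peak = time_to_peak(alpha, 4100.0)           # roughly 296 s
e_growth = growth_phase_energy(alpha, t_peak)  # kJ released before the peak
```

A useful check on the integral: the growth-phase energy equals Q_peak * t_peak / 3, one third of the rectangle a steady fire at peak HRR would trace out.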

Keywords: fire growth, fire experiment, t2 curve, large scale calorimeter

Procedia PDF Downloads 307
29035 Data Privacy: Stakeholders’ Conflicts in Medical Internet of Things

Authors: Benny Sand, Yotam Lurie, Shlomo Mark

Abstract:

Medical Internet of Things (MIoT), AI, and data privacy are linked forever in a Gordian knot. This paper explores the conflicts of interest between stakeholders regarding data privacy in the MIoT arena. When patients are hospitalized at home, MIoT can play a significant role in improving the health of large parts of the population by providing medical teams with tools for collecting data, monitoring patients’ health parameters, and even enabling remote treatment. While the amount of data handled by MIoT devices grows exponentially, different stakeholders have conflicting understandings of and concerns about these data. The findings of the research indicate that medical teams are not concerned by the violation of the data privacy rights of patients in in-home healthcare, while patients are more troubled and, in many cases, unaware that their data are being used without their consent. MIoT technology is in its early phases; hence, a mixed qualitative and quantitative research approach will be used, including case studies and questionnaires, in order to explore this issue and provide alternative solutions.

Keywords: MIoT, data privacy, stakeholders, home healthcare, information privacy, AI

Procedia PDF Downloads 76
29034 Design, Construction and Validation of a Simple, Low-Cost Phi Meter

Authors: Gabrielle Peck, Ryan Hayes

Abstract:

The use of a phi meter allows the equivalence ratio to be determined during a fire test. Previous phi meter designs used expensive catalysts and had restricted portability due to the large furnace and the requirement for pure oxygen. The new design of the phi meter does not require the use of a catalyst. The furnace design was based on the existing micro-scale combustion calorimetry (MCC) furnace, with operating conditions based on the secondary oxidizer furnace used in the steady state tube furnace (SSTF). Preliminary tests were conducted to study the effects of varying furnace temperatures on combustion efficiency. The SSTF was chosen to validate the phi meter measurements, as it can both pre-set and independently quantify the equivalence ratio during a test. The phi meter data were in agreement with the data obtained on the SSTF, and were further validated by comparing the CO2 yields obtained from the SSTF oxidizer with those obtained by the phi meter. The phi meter designed and constructed in this work was thus proven to work effectively at bench scale. It was then used to measure the equivalence ratio in a series of large-scale ISO 9705 tests for numerous fire conditions. The materials used were a range of non-homogeneous materials such as polyurethane. The measurements corresponded accurately to the data collected, showing that the novel design can be used from bench-scale to large-scale tests to measure the equivalence ratio. This cheaper, more portable, safer, and easier-to-use phi meter design will enable more widespread use and the ability to quantify the fire conditions of tests, allowing for a better understanding of flammability and smoke toxicity.
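For readers unfamiliar with the quantity being measured: the equivalence ratio phi is the actual fuel-to-oxidizer ratio divided by the stoichiometric one, so phi < 1 indicates well-ventilated (fuel-lean) burning and phi > 1 under-ventilated (fuel-rich) burning, where yields of toxic products such as CO rise sharply. A minimal sketch of the definition follows; the flow values and the methane stoichiometric ratio are illustrative assumptions, not data from this work.

```python
def equivalence_ratio(fuel_flow, air_flow, stoich_fuel_air):
    """phi = (fuel/air supplied) / (fuel/air at stoichiometry).
    phi < 1: fuel-lean (well-ventilated); phi = 1: stoichiometric;
    phi > 1: fuel-rich (under-ventilated)."""
    return (fuel_flow / air_flow) / stoich_fuel_air

# methane in air: stoichiometric fuel/air mass ratio of about 0.0583
phi = equivalence_ratio(fuel_flow=2.0, air_flow=20.0, stoich_fuel_air=0.0583)
```

With these made-up flows, the supplied fuel/air ratio (0.1) is well above stoichiometric, so phi comes out above 1, i.e. an under-ventilated fire condition.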

Keywords: phi meter, smoke toxicity, fire condition, ISO9705, novel equipment

Procedia PDF Downloads 78
29033 Emotion Mining and Attribute Selection for Actionable Recommendations to Improve Customer Satisfaction

Authors: Jaishree Ranganathan, Poonam Rajurkar, Angelina A. Tzacheva, Zbigniew W. Ras

Abstract:

In today’s world, business often depends on customer feedback and reviews. Sentiment analysis helps identify and extract information about the sentiment or emotion of a topic or document. Attribute selection is a challenging problem, especially with large datasets, in actionable pattern mining algorithms. Action rule mining is one of the methods for discovering actionable patterns in data. Action rules describe specific actions, in the form of conditions, that help achieve a desired outcome; they help to change from an undesirable or negative state to a more desirable or positive state. In this paper, we present a lexicon-based weighted-scheme approach to identify emotions from customer feedback data in the area of manufacturing business. We also use rough sets and explore attribute selection methods for large-scale datasets. We then apply actionable pattern mining to extract possible emotion-change recommendations. Such recommendations help business analysts improve their customer service, which leads to customer satisfaction and increased sales revenue.
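As an illustration of the lexicon-based weighted scheme, the sketch below (using a tiny hypothetical lexicon, not the one from the paper) sums per-emotion weights over the tokens of a piece of feedback and returns the dominant emotion:

```python
# hypothetical lexicon: word -> (emotion, weight)
LEXICON = {
    "angry": ("anger", 0.9), "delayed": ("anger", 0.4),
    "happy": ("joy", 0.8), "thanks": ("joy", 0.5),
    "worried": ("fear", 0.7),
}

def emotion_scores(text):
    """Sum the lexicon weights per emotion over the tokens of a review."""
    scores = {}
    for token in text.lower().split():
        if token in LEXICON:
            emo, w = LEXICON[token]
            scores[emo] = scores.get(emo, 0.0) + w
    return scores

def dominant_emotion(text):
    """Return the highest-scoring emotion, or 'neutral' if no hits."""
    s = emotion_scores(text)
    return max(s, key=s.get) if s else "neutral"

label = dominant_emotion("angry that my order was delayed again")
```

In the pipeline the abstract describes, labels of this kind would feed the rough-set attribute selection and the subsequent action rule mining step.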

Keywords: actionable pattern discovery, attribute selection, business data, data mining, emotion

Procedia PDF Downloads 170
29032 Distributional and Dynamic Impact of Energy Subsidy Reform

Authors: Ali Hojati Najafabadi, Mohamad Hosein Rahmati, Seyed Ali Madanizadeh

Abstract:

Governments execute energy subsidy reforms by either increasing energy prices or reducing energy price dispersion. These policies reduce the use of energy per plant (the intensive margin), vary the total number of firms (the extensive margin), promote technological progress (the technology channel), and free up additional resources to redistribute (the resource channel). We estimate a structural dynamic firm model with endogenous technology adaptation using data from manufacturing firms in Iran, the country with the second-largest energy subsidy plan as ranked by the IMF. The findings show significant dynamic and distributional effects of an energy reform plan. The price elasticity of energy consumption in the industrial sector is about -2.34, while it is -3.98 for large firms. The dispersion elasticity, defined as the change in energy consumption resulting from a one-percent reduction in the standard deviation of the energy price distribution, is about 1.43, suggesting significant room for a distributional policy. We show that the intensive margin is the main driver of the energy price elasticity, whereas the other channels mostly offset it. In contrast, the labor response operates mainly through the extensive margin. Total factor productivity improves slightly in light of the reduction in energy consumption if, at the same time, the redistribution policy boosts aggregate demand.
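To clarify how an elasticity such as the -2.34 above is read: it is the percentage change in energy consumption per one-percent change in price, i.e. d ln Q / d ln P. A minimal sketch of the arc (log-difference) version, with made-up quantities and prices rather than the paper's estimates:

```python
import math

def log_log_elasticity(q0, q1, p0, p1):
    """Arc elasticity via log differences: (ln q1 - ln q0) / (ln p1 - ln p0).
    A value of -2.34 means a 1% price rise cuts consumption by about 2.34%."""
    return (math.log(q1) - math.log(q0)) / (math.log(p1) - math.log(p0))

# hypothetical plant: a 10% price increase under an elasticity of -2.34
q_before = 100.0
q_after = q_before * 1.10 ** -2.34   # consumption implied by that elasticity
e = log_log_elasticity(q_before, q_after, 1.0, 1.10)
```

The structural model in the paper recovers such elasticities jointly with the entry/exit and technology channels; this snippet only pins down the definition being estimated.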

Keywords: energy reform, firm dynamics, structural estimation, subsidy policy

Procedia PDF Downloads 71
29031 Hyperspectral Imaging and Nonlinear Fukunaga-Koontz Transform Based Food Inspection

Authors: Hamidullah Binol, Abdullah Bal

Abstract:

Nowadays, food safety is a great public concern; therefore, robust and effective techniques are required for assessing the safety of goods. Hyperspectral imaging (HSI) is an attractive tool for researchers inspecting food quality and safety, with applications such as meat quality assessment, automated poultry carcass inspection, quality evaluation of fish, bruise detection in apples, quality analysis and grading of citrus fruits, bruise detection in strawberries, visualization of the sugar distribution of melons, measuring the ripening of tomatoes, defect detection in pickling cucumbers, and classification of wheat kernels. HSI can be used to concurrently collect large amounts of spatial and spectral data on the objects being observed. The technique yields exceptional detection capabilities, which otherwise cannot be achieved with either imaging or spectroscopy alone. This paper presents a nonlinear technique based on the kernel Fukunaga-Koontz transform (KFKT) for the detection of fat content in ground meat using HSI. The KFKT, the nonlinear version of the FKT, is one of the most effective techniques for solving problems of a two-pattern nature. The conventional FKT method has been improved with kernel machines to increase its nonlinear discrimination ability and to capture higher-order statistics of the data. The approach proposed in this paper aims to segment the fat content of ground meat by regarding the fat as the target class, which is to be separated from the remaining classes (the clutter). We have applied the KFKT to visible and near-infrared (VNIR) hyperspectral images of ground meat to determine the fat percentage. The experimental studies indicate that the proposed technique produces high detection performance for the fat ratio in ground meat.

Keywords: food (ground meat) inspection, Fukunaga-Koontz transform, hyperspectral imaging, kernel methods

Procedia PDF Downloads 403
29030 Information Visualization Methods Applied to Nanostructured Biosensors

Authors: Osvaldo N. Oliveira Jr.

Abstract:

The control of molecular architecture inherent in some experimental methods to produce nanostructured films has had great impact on devices of various types, including sensors and biosensors. The self-assembled monolayer (SAM) and the electrostatic layer-by-layer (LbL) techniques, for example, are now routinely used to produce tailored architectures for biosensing in which biomolecules are immobilized with long-lasting preserved activity. Enzymes, antigens, antibodies, peptides and many other molecules serve as the molecular recognition elements for detecting an equally wide variety of analytes. The principles of detection are also varied, including electrochemical methods, fluorescence spectroscopy and impedance spectroscopy. In this presentation an overview will be provided of biosensors made with nanostructured films to detect antibodies associated with tropical diseases and HIV, in addition to detection of analytes of medical interest such as cholesterol and triglycerides. Because large amounts of data are generated in the biosensing experiments, computational and statistical methods have been used to optimize performance. Multidimensional projection techniques such as Sammon's mapping have been shown to be more efficient than traditional multivariate statistical analysis in identifying small concentrations of anti-HIV antibodies and in distinguishing between blood serum samples of animals infected with two tropical diseases, namely Chagas disease and Leishmaniasis. Optimization of biosensing may combine another information visualization method, the parallel coordinates technique, with artificial intelligence methods in order to identify the most suitable frequencies for reaching higher sensitivity using impedance spectroscopy. Also discussed will be the possible convergence of technologies, through which machine learning and other computational methods may be used to treat data from biosensors within an expert system for clinical diagnosis.
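As an illustration of the multidimensional projection idea, here is a minimal gradient-descent sketch of Sammon's mapping. The adaptive step size is an assumption added for stability (Sammon's original algorithm uses a second-order update); NumPy and distinct input points are assumed:

```python
import numpy as np

def _pairdist(A):
    return np.linalg.norm(A[:, None, :] - A[None, :, :], axis=-1)

def sammon_stress(X, Y):
    iu = np.triu_indices(len(X), 1)
    D, d = _pairdist(X)[iu], _pairdist(Y)[iu]
    return ((D - d) ** 2 / D).sum() / D.sum()

def sammon_map(X, dims=2, iters=200, lr=1.0, seed=0):
    """Project X to `dims` dimensions by descending the Sammon stress.
    Assumes all input points are distinct (no zero pairwise distances)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    c = _pairdist(X)[np.triu_indices(n, 1)].sum()   # normalising constant
    D = _pairdist(X) + np.eye(n)                    # +I only to dodge /0 on diagonal
    Y = rng.normal(size=(n, dims))
    err = sammon_stress(X, Y)
    for _ in range(iters):
        d = _pairdist(Y) + np.eye(n)
        ratio = (D - d) / (d * D)
        np.fill_diagonal(ratio, 0.0)
        # gradient of the stress with respect to each low-dimensional point
        grad = (-2.0 / c) * np.einsum('ij,ijk->ik', ratio,
                                      Y[:, None, :] - Y[None, :, :])
        Y_new = Y - lr * grad
        err_new = sammon_stress(X, Y_new)
        if err_new < err:            # accept the step and speed up slightly
            Y, err, lr = Y_new, err_new, lr * 1.1
        else:                        # reject the step and shrink it
            lr *= 0.5
    return Y

rng = np.random.default_rng(1)
X = rng.normal(size=(20, 5))
Y = sammon_map(X, seed=0)
```

Because steps are only accepted when the stress decreases, the final 2-D layout is never worse than the random initialisation, which is the property the projection relies on for visualising sensor data.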

Keywords: clinical diagnosis, information visualization, nanostructured films, layer-by-layer technique

Procedia PDF Downloads 305
29029 A Novel Heuristic for Analysis of Large Datasets by Selecting Wrapper-Based Features

Authors: Bushra Zafar, Usman Qamar

Abstract:

Large data sample sizes and dimensions undermine the effectiveness of conventional data mining methodologies. Data mining techniques are important tools for extracting knowledge from a variety of databases; they provide supervised learning in the form of classification to design models that describe vital data classes, where the structure of the classifier is based on the class attribute. Classification efficiency and accuracy are often greatly influenced by noisy and undesirable features in real application data sets. The inherent nature of a data set greatly masks its quality analysis and leaves quite few practical approaches to use. To our knowledge, we present for the first time an approach for investigating the structure and quality of datasets by providing a targeted analysis of the localization of noisy and irrelevant features. Machine learning relies on feature selection as a pre-processing step, which allows a few features to be selected from a larger number as a subset, reducing the space according to a certain evaluation criterion. The primary objective of this study is to trim down the scope of the given data sample by searching for a small set of important features which may result in good classification performance. For this purpose, a heuristic for wrapper-based feature selection using a genetic algorithm is used, with an external classifier for discriminative feature selection. Features are selected based on their number of occurrences in the chosen chromosomes. Sample datasets have been used to demonstrate the proposed idea effectively. The proposed method achieves an average accuracy of about 95% across different datasets. Experimental results illustrate that the proposed algorithm increases the accuracy of prediction of different diseases.
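A toy sketch of the wrapper idea, hedged as an illustration rather than the authors' algorithm: a genetic algorithm evolves binary feature masks, a 1-NN classifier supplies the fitness, and feature importance is read off occurrence counts in the final chromosomes. Pure standard-library Python; the data are synthetic:

```python
import random

def knn_accuracy(train, test, mask):
    # 1-NN accuracy using only the features where mask[i] == 1 (wrapper fitness)
    def dist(a, b):
        return sum((x - y) ** 2 for x, y, m in zip(a, b, mask) if m)
    hits = sum(min(train, key=lambda p: dist(p[0], x))[1] == y for x, y in test)
    return hits / len(test)

def ga_select(train, test, n_feat, pop=20, gens=15, seed=1):
    rng = random.Random(seed)
    population = [[rng.randint(0, 1) for _ in range(n_feat)] for _ in range(pop)]
    for _ in range(gens):
        ranked = sorted(population, key=lambda m: knn_accuracy(train, test, m),
                        reverse=True)
        elite = ranked[: pop // 2]                 # keep the fittest half
        children = []
        while len(elite) + len(children) < pop:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n_feat)         # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                 # bit-flip mutation
                child[rng.randrange(n_feat)] ^= 1
            children.append(child)
        population = elite + children
    best = max(population, key=lambda m: knn_accuracy(train, test, m))
    # occurrence count of each feature across the final chromosomes
    counts = [sum(m[i] for m in population) for i in range(n_feat)]
    return best, counts

# synthetic data: feature 0 is informative, features 1 and 2 are noise
rng = random.Random(0)
data = []
for _ in range(60):
    y = rng.randint(0, 1)
    data.append(((y * 4 + rng.random(), rng.random(), rng.random()), y))
train, test = data[:40], data[40:]
best, counts = ga_select(train, test, 3)
```

On this toy problem the informative feature is retained in the best chromosome, which is the behaviour the heuristic relies on when trimming real datasets.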

Keywords: data mining, genetic algorithm, KNN algorithm, wrapper-based feature selection

Procedia PDF Downloads 295
29028 Composition and in Vitro Antimicrobial Activity of Three Eryngium L. Species

Authors: R. Mickiene, A. Friese, U. Rosler, A. Maruska, O. Ragazinskiene

Abstract:

This research focuses on the phytochemistry and antimicrobial activities of compounds isolated and identified from three species of Eryngium. The antimicrobial activity of extracts from Eryngium planum L., Eryngium maritimum L. and Eryngium campestre L. grown in Lithuania was tested by the serial dilution method against different bacterial species: Escherichia coli, Proteus vulgaris and Staphylococcus aureus with and without antibiotic resistance, originating from livestock. The antimicrobial activity of the extracts was described by determination of the minimal inhibitory concentration. Preliminary results show that the minimal inhibitory concentrations range between 8.0% and 17.0% for the different Eryngium extracts and bacterial species. The total amounts of phenolic compounds and flavonoids were determined in the methanolic extracts of the plants. Identification and evaluation of the phenolic compounds were performed by liquid chromatography. The essential oils were analyzed by gas chromatography-mass spectrometry.
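The MIC read-out from a serial dilution series reduces to a very small computation; the sketch below is illustrative (the tube data are hypothetical, not the study's measurements):

```python
def minimal_inhibitory_concentration(tubes):
    """MIC: the lowest extract concentration at which no growth is observed.

    tubes: list of (concentration_percent, growth_observed) pairs,
    one per dilution step. Returns None if every tube shows growth.
    """
    inhibiting = [c for c, growth in tubes if not growth]
    return min(inhibiting) if inhibiting else None

# hypothetical dilution series within the 8-17% range reported above
tubes = [(17.0, False), (12.0, False), (8.0, True), (4.0, True)]
mic = minimal_inhibitory_concentration(tubes)
```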

Keywords: antimicrobial activities, Eryngium L. species, essential oils, gas chromatography mass spectrometry

Procedia PDF Downloads 417
29027 The Economic Valuation of Public Support Ecosystem: A Contingent Valuation Study in Setiu Wetland, Terengganu Malaysia

Authors: Elmira Shamshity

Abstract:

This study explores an economic approach to valuing the Setiu wetland as a future protection strategy. A questionnaire survey based on the single-bounded dichotomous-choice contingent valuation method was used to elicit individuals' willingness to pay (WTP) for the conservation of the Setiu wetland. The study was located in Terengganu, Malaysia. The results of the random questionnaire survey showed that protection of the Setiu ecosystem is important to the indigenous community. The mean WTP for protection of the Setiu wetland ecosystem was 12.985 Ringgit per month per household for 10 years. There was significant variation in the stated WTP amounts based on the respondents' knowledge, household income, educational level, and the bid amounts. The findings of this study may help improve understanding of the WTP of indigenous people for the protection of wetlands and provide useful information for policy makers designing an effective program of ecosystem protection.
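From single-bounded yes/no responses, a distribution-free lower bound on mean WTP can be computed with a Turnbull-style estimator. The sketch below makes stated assumptions: a simple monotonising pass stands in for full pooling of adjacent violators, and the responses are hypothetical, not the survey data:

```python
from collections import defaultdict

def turnbull_mean_wtp(responses):
    """Lower-bound mean WTP from (bid, accepted) pairs collected in a
    single-bounded dichotomous-choice survey."""
    yes, tot = defaultdict(int), defaultdict(int)
    for bid, accepted in responses:
        tot[bid] += 1
        yes[bid] += int(accepted)
    bids = sorted(tot)
    surv = [yes[b] / tot[b] for b in bids]   # share willing to pay each bid
    for i in range(1, len(surv)):            # enforce non-increasing survival
        surv[i] = min(surv[i], surv[i - 1])
    # probability mass falling between two bids is valued at the lower bid
    mean, prev_s, prev_b = 0.0, 1.0, 0.0
    for b, s in zip(bids, surv):
        mean += prev_b * (prev_s - s)
        prev_s, prev_b = s, b
    return mean + prev_b * prev_s            # mass above the highest bid

# hypothetical responses: everyone accepts 5, half accept 10, nobody accepts 20
responses = ([(5, 1)] * 4 + [(10, 1)] * 2 + [(10, 0)] * 2 + [(20, 0)] * 4)
mean_wtp = turnbull_mean_wtp(responses)
```

Parametric (e.g. logit-based) mean WTP, as typically reported in such studies, would require fitting the acceptance probability to the bid; the nonparametric bound above needs no distributional assumption.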

Keywords: willingness to pay, ecosystem, Setiu wetland, Terengganu Malaysia

Procedia PDF Downloads 566
29026 Design, Construction, Validation and Use of a Novel Portable Fire Effluent Sampling Analyser

Authors: Gabrielle Peck, Ryan Hayes

Abstract:

Current large-scale fire tests focus on flammability and heat release measurements. Smoke toxicity is not considered, despite its being a leading cause of death and injury in unwanted fires. A key reason could be that the practical difficulties associated with quantifying individual toxic components present in a fire effluent often require specialist equipment and expertise. Fire effluent contains a mixture of unreactive and reactive gases, water, organic vapours and particulate matter, which interact with each other. This interferes with the operation of the analytical instrumentation and must be removed without changing the concentration of the target analyte. To mitigate the need for expensive equipment and time-consuming analysis, a portable gas analysis system was designed, constructed and tested for use in large-scale fire tests as a simpler and more robust alternative to online FTIR measurements. The novel equipment aimed to be easily portable and able to run on battery or mains electricity; be calibrated at the test site; be capable of quantifying CO, CO2, O2, HCN, HBr, HCl, NOx and SO2 accurately and reliably; be capable of independent data logging; be capable of automated switchover of 7 bubblers; be able to withstand fire effluents; be simple to operate; allow individual bubbler times to be pre-set; and be capable of being controlled remotely. To test the analyser's functionality, it was used alongside the ISO/TS 19700 Steady State Tube Furnace (SSTF). A series of tests were conducted to assess the validity of the box analyser measurements and the data logging abilities of the apparatus, using PMMA and PA 6.6. The data obtained from the bench-scale assessments showed excellent agreement. Following this, the portable analyser was used to monitor gas concentrations during large-scale testing using the ISO 9705 room corner test.
The analyser was set up, calibrated and set to record smoke toxicity measurements in the doorway of the test room. The analyser operated without manual interference and successfully recorded data for all 12 of the tests conducted in the ISO room tests. At the end of each test, the analyser created a data file (formatted as .csv) containing the measured gas concentrations throughout the test, which does not require specialist knowledge to interpret. This validated the portable analyser's ability to monitor fire effluent without operator intervention at both bench and large scale. The portable analyser is a validated and significantly more practical alternative to FTIR, proven to work in large-scale fire testing for the quantification of smoke toxicity. The analyser is a cheaper, more accessible option for assessing smoke toxicity, mitigating the need for expensive equipment and specialist operators.
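The plain .csv logging step can be mimicked with the standard library alone; the column names and layout below are illustrative assumptions, not the instrument's actual schema:

```python
import csv
import io

def log_concentrations(readings, gases):
    """Write one timestamped row of gas concentrations per sampling interval.

    readings: list of (time_s, [concentration per gas]) tuples.
    Returns CSV text readable without any specialist software.
    """
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["time_s"] + gases)          # header row
    for t, row in readings:
        writer.writerow([t] + [f"{v:.1f}" for v in row])
    return buf.getvalue()

# hypothetical two-interval log for two of the quantified gases
out = log_concentrations([(0, [0.1, 350.2]), (5, [0.2, 360.0])], ["CO", "CO2"])
```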

Keywords: smoke toxicity, large-scale tests, ISO 9705, analyser, novel equipment

Procedia PDF Downloads 46
29025 Study of Temperature and Precipitation Changes Based on the Scenarios (IPCC) in the Caspian Sea City: Case Study in Gillan Province

Authors: Leila Rashidian, Mina Rajabali

Abstract:

Industrialization has brought progress and comfort to human beings in many respects, but it is also a factor in the destruction and disruption of the Earth's climate. In this study, we used the LARS-WG model for statistical downscaling of the HadCM3 general circulation climate model, applied to daily precipitation amounts, minimum and maximum temperature and daily sunshine hours. These data were provided by the meteorological organization for Caspian Sea coastal stations such as Anzali, Manjil, Rasht, Lahijan and Astara, covering the period from their establishment in 1982 until 2010. Based on the IPCC scenarios A1B, A2 and B1, we simulated data from 2010 to 2040. The rainfall pattern has changed, resulting in an uneven rainfall distribution across the months.

Keywords: climate change, LARS-WG, HadCM3, Gillan province, climatic parameters, A2 scenario

Procedia PDF Downloads 239
29024 Impact of Chronic Pollution on the Taj Mahal, India

Authors: Kiran P. Chadayamuri, Saransh Bagdi, Sai Vinod Boddu

Abstract:

Pollution has been a major problem that has haunted India for years. Large amounts of industrial, automobile and domestic waste have resulted in heavy contamination of air, land and water. The Taj Mahal, one of the Seven Wonders of the World, has been and continues to be India's symbol of a rich history around the globe. Over the years, the beauty of the Taj Mahal has also suffered from increasing pollution. Its shiny white exterior has started to turn yellow because of air pollution and acid rain. Illegal factories and uncontrolled construction have played a major role in worsening its condition. Rapid population growth in the city (Agra) has meant greater water requirements, which has led to ground water deterioration under the historical monument, making its wooden foundations dry and weak. Despite various measures by the state and central governments, there has not been any satisfactory result. This paper aims to study the various causes affecting the Taj Mahal, their impacts, and methods that could slow its deterioration.

Keywords: pollution, Taj Mahal, India, management

Procedia PDF Downloads 359
29023 Institutional and Economic Determinants of Foreign Direct Investment: Comparative Analysis of Three Clusters of Countries

Authors: Ismatilla Mardanov

Abstract:

There are three types of countries, the first of which is willing to attract foreign direct investment (FDI) in enormous amounts and will do whatever it takes to make this happen; therefore, FDI pours into such countries. In the second cluster of countries, even if the country is suffering tremendously from a shortage of investment, the governments are hesitant to attract it because they are in the hands of local oligarchs/cartels; therefore, FDI inflows are moderate to low in such countries. The third type is countries whose companies prefer investing in the most efficient locations globally and are hesitant to invest in the homeland. Sorting countries into such clusters, the present study examines the essential institutional and economic factors that make these countries different. Past literature has discussed various determinants of FDI in all kinds of countries. However, it did not classify countries based on government motivation, institutional setup, and economic factors. A specific approach to each target country is vital for corporate foreign direct investment risk analysis and decisions. The research questions are: 1. What specific institutional and economic factors paint the pictures of the three clusters? 2. What specific institutional and economic factors are determinants of FDI? 3. Which of the determinants are endogenous and exogenous variables? 4. How can institutions and economic and political variables impact corporate investment decisions? Hypothesis 1: In the first type, country institutions and economic factors will be favorable for FDI. Hypothesis 2: In the second type, even if country economic factors favor FDI, institutions will not. Hypothesis 3: In the third type, even if country institutions favor FDI, economic factors will not favor domestic investments; therefore, FDI outflows occur in large amounts. Methods: Data come from open sources of the World Bank, the Fraser Institute, the Heritage Foundation, and other reliable sources.
The dependent variable is FDI inflows. The independent variables are institutions (economic and political freedom indices) and economic factors (natural, material, and labor resources, government consumption, infrastructure, minimum wage, education, unemployment, tax rates, consumer price index, inflation, and others), the endogeneity or exogeneity of which is tested in the instrumental variable estimation. Political rights and civil liberties are used as instrumental variables. Results indicate that in the first type, both country institutions and economic factors, specifically labor and logistics/infrastructure/energy intensity, are favorable for potential investors. In the second category of countries, the risk of loss of assets is very high due to governments hijacked by local oligarchs/cartels/special interest groups. In the third category of countries, the local economic factors are unfavorable for domestic investment even if the institutions are broadly acceptable. Cluster analysis and instrumental variable estimation were used to reveal cause-effect patterns in each of the clusters.
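The instrumental-variable step can be sketched as textbook two-stage least squares; this is an illustration on simulated data (NumPy assumed), not the authors' estimation. The simulation builds in endogeneity so the OLS bias and the IV correction are both visible:

```python
import numpy as np

def two_sls(y, X, Z):
    """Two-stage least squares: instrument the endogenous columns of X with Z."""
    # stage 1: fitted values of X from a regression on the instruments
    X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
    # stage 2: OLS of y on the fitted values
    return np.linalg.lstsq(X_hat, y, rcond=None)[0]

rng = np.random.default_rng(0)
n = 20000
z = rng.normal(size=n)                  # instrument (e.g. political rights)
u = rng.normal(size=n)                  # structural error
x = z + u + 0.5 * rng.normal(size=n)    # endogenous regressor: correlated with u
y = 1.0 + 2.0 * x + u                   # true slope is 2.0

ones = np.ones(n)
X = np.column_stack([ones, x])
Z = np.column_stack([ones, z])
beta_iv = two_sls(y, X, Z)
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
```

Because x is correlated with the error u, OLS overstates the slope, while the instrumented estimate recovers it; this is the logic behind using political rights and civil liberties as instruments above.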

Keywords: foreign direct investment, economy, institutions, instrumental variable estimation

Procedia PDF Downloads 137
29022 Closed Loop Large Bowel Obstruction Due to Appendiceal Signet Cell Carcinoma

Authors: Joshua Teo, Leo Phan

Abstract:

Signet cell carcinoma of the appendix is the rarest and most aggressive subtype of appendiceal malignancy, typically with non-specific presentations. We describe a case of a 62-year-old male with large bowel obstruction and CT demonstrating a dilated large bowel from the caecum to the proximal sigmoid colon with pneumoperitoneum. Intra-operatively, a closed-loop obstruction caused by dense adherence of the sigmoid colon to the caecum was noted, which had resulted in caecal perforation. Histopathology indicated a primary appendiceal malignancy of signet cell morphology with intra-peritoneal spread to the sigmoid colon. Large bowel obstruction from appendiceal malignancy has rarely been reported, and a similar presentation has not been described in the existing literature. When a left-sided large bowel obstruction is suspected to be caused by a malignant stricture, it is essential to consider transperitoneal spread of appendiceal malignancy as a potential aetiology, particularly in the elderly.

Keywords: appendiceal carcinoma, large bowel obstruction, signet ring cell cancer, caecal perforation

Procedia PDF Downloads 191
29021 Reexamining Contrarian Trades as a Proxy of Informed Trades: Evidence from China's Stock Market

Authors: Dongqi Sun, Juan Tao, Yingying Wu

Abstract:

This paper reexamines the appropriateness of contrarian trades as a proxy for informed trades, using high-frequency Chinese stock data. Employing this measure over 5-minute intervals, a U-shaped intraday pattern in the probability of informed trading (PIN) is found for the CSI300 stocks, which is consistent with previous findings for other markets. However, when dividing the trades by size, a reversed U-shaped PIN for large-sized trades, as opposed to the U-shaped pattern for small- and medium-sized trades, is observed. Drawing on the mixed evidence across trade sizes, the price impact of trades is further investigated. By examining the relationship between trade imbalances and unexpected returns, large-sized trades are found to have significant price impact. This implies that in those intervals with large trades, it is non-contrarian trades that are more likely to be informed. Taking account of the price impact of large-sized trades, non-contrarian trades are used to proxy for informed trading in intervals with large trades, while contrarian trades are still used to measure informed trading in other intervals. This modification demonstrates a stronger U-shaped PIN. Auto-correlation and information advantage tests for robustness also support the modified informed trading measure.
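One common operationalisation of the paper's two ingredients, offered as a sketch rather than the authors' exact definitions: a trade is contrarian if its direction opposes the preceding price move, and interval trade imbalance is signed volume over total volume:

```python
def classify_trades(prices, signs):
    # a trade is "contrarian" if its direction opposes the preceding price move:
    # a buy (+1) after a price drop, or a sell (-1) after a price rise
    labels = []
    for i in range(1, len(prices)):
        move = prices[i] - prices[i - 1]
        contrarian = (signs[i] > 0 and move < 0) or (signs[i] < 0 and move > 0)
        labels.append("contrarian" if contrarian else "non-contrarian")
    return labels

def trade_imbalance(signs, volumes):
    # signed volume imbalance over an interval: (buys - sells) / total volume
    num = sum(s * v for s, v in zip(signs, volumes))
    den = sum(volumes)
    return num / den if den else 0.0

prices = [10.0, 9.0, 9.5]     # trade prices within one 5-minute interval
signs = [1, 1, -1]            # +1 buy, -1 sell
labels = classify_trades(prices, signs)
```

In the paper's modified measure, intervals containing large trades would count the non-contrarian trades as informed instead, with the imbalance regressed on unexpected returns to gauge price impact.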

Keywords: contrarian trades, informed trading, price impact, trade imbalance

Procedia PDF Downloads 140