Search results for: minority forms of information processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15711

14961 Using Textual Pre-Processing and Text Mining to Create Semantic Links

Authors: Ricardo Avila, Gabriel Lopes, Vania Vidal, Jose Macedo

Abstract:

This article offers an approach to the automatic discovery of semantic concepts and links in the domain of Oil Exploration and Production (E&P). Machine learning methods combined with textual pre-processing techniques were used to detect local patterns in texts and, thus, generate new concepts and new semantic links. Even using more specific vocabularies within the oil domain, our approach has achieved satisfactory results, suggesting that the proposal can be applied in other domains and languages, requiring only minor adjustments.
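As a rough illustration of the general idea - pre-process the text, then mine term co-occurrence patterns to propose candidate semantic links - a minimal sketch is given below. The corpus, term list, and threshold are hypothetical placeholders, not the authors' E&P vocabulary or pipeline.

```python
# Illustrative sketch only: a generic pre-processing + co-occurrence pipeline for
# proposing candidate semantic links between domain terms. The corpus and the
# term list below are invented placeholders, not the authors' E&P vocabulary.
import itertools
import re
from collections import Counter

corpus = [
    "The drilling rig reached the reservoir after casing the wellbore.",
    "Reservoir pressure data guided the completion of the production well.",
    "Seismic surveys located the reservoir before the drilling campaign.",
]
terms = {"drilling", "reservoir", "wellbore", "casing", "completion", "seismic"}

def preprocess(text):
    """Lowercase, strip punctuation, and tokenize."""
    return re.findall(r"[a-z]+", text.lower())

# Count how often pairs of domain terms co-occur within the same sentence.
pair_counts = Counter()
for sentence in corpus:
    present = sorted(set(preprocess(sentence)) & terms)
    for a, b in itertools.combinations(present, 2):
        pair_counts[(a, b)] += 1

# Pairs that co-occur repeatedly become candidate "related" links
# (e.g., skos:related in a SKOS vocabulary).
candidate_links = [pair for pair, n in pair_counts.items() if n >= 2]
print(candidate_links)
```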

Keywords: semantic links, data mining, linked data, SKOS

Procedia PDF Downloads 155
14960 Handling, Exporting and Archiving Automated Mineralogy Data Using TESCAN TIMA

Authors: Marek Dosbaba

Abstract:

Within the mining sector, SEM-based Automated Mineralogy (AM) has been the standard application for quickly and efficiently handling mineral processing tasks. Over the last decade, the trend has been to analyze larger numbers of samples, often with a higher level of detail. This has necessitated a shift from interactive sample analysis performed by an operator using a SEM, to an increased reliance on offline processing to analyze and report the data. In response to this trend, the TESCAN TIMA Mineral Analyzer is designed to quickly create a virtual copy of the studied samples, thereby preserving all the necessary information. Depending on the selected data acquisition mode, TESCAN TIMA can perform hyperspectral mapping and save an X-ray spectrum for each pixel or segment, respectively. This approach allows the user to browse through elemental distribution maps of all elements detectable by means of energy dispersive spectroscopy. Re-evaluation of the existing data for the presence of previously unconsidered elements is possible without the need to repeat the analysis. Additional tiers of data, such as secondary electron or cathodoluminescence images, can also be recorded. To take full advantage of these information-rich datasets, TIMA utilizes a new archiving tool introduced by TESCAN. The dataset size can be reduced for long-term storage, and all information can be recovered on demand in case of renewed interest. TESCAN TIMA is optimized for network storage of its datasets because of the larger data storage capacity of servers compared to local drives, which also allows multiple users to access the data remotely. This goes hand in hand with the support of remote control for the entire data acquisition process. TESCAN also brings a newly extended open-source data format that allows other applications to extract, process and report AM data. This offers the ability to link TIMA data to large databases feeding plant performance dashboards or geometallurgical models. The traditional tabular particle-by-particle or grain-by-grain export process is preserved and can be customized with scripts to include user-defined particle/grain properties.

Keywords: Tescan, electron microscopy, mineralogy, SEM, automated mineralogy, database, TESCAN TIMA, open format, archiving, big data

Procedia PDF Downloads 94
14959 Syntactic, Semantic, and Pragmatic Rationalization of Modal Auxiliary Verbs in Akan

Authors: Joana Portia Sakyi

Abstract:

The uniqueness of auxiliary verbs and their contribution to grammar as constituents, which act as preverbs to supply additional grammatical or functional meanings to clauses, are well established. Functionally, they relate clauses to tense, aspect, mood, voice, emphasis, and modality, along with the main verbs conveying the appropriate lexical content. There has been an issue in Akan grammar vis-à-vis the status of auxiliary verbs, in terms of whether Akan has auxiliaries at all and, if so, which forms are to be regarded as auxiliaries. We investigate the syntactic, semantic, and pragmatic components of expressions and claim that Akan has auxiliary verbs that contribute the functional or grammatical meaning of modality, tense/aspect, etc., to the clauses in which they occur. Essentially, we use self-created corpus data to consider the affix bέ- ‘may’, ‘must’, ‘should’; the form tùmí ‘can’, ‘be able to’; mà ‘to let’, ‘to allow’, ‘to permit’, ‘to make’, or ‘to cause’ someone to do something; the multi-word forms ὲsὲ sέ ‘must’, ‘should’ or ‘have to’ and ètwà sέ ‘must’, ‘should’ or ‘have to’, and assert that they are legitimate modal auxiliaries conveying epistemic, deontic, and dynamic modalities, as well as other meanings in the language.

Keywords: Akan, modality, modal auxiliaries, semantics

Procedia PDF Downloads 53
14958 3D Printing of Dual Tablets: Modified Multiple Release Profiles for Personalized Medicine

Authors: Veronika Lesáková, Silvia Slezáková, František Štěpánek

Abstract:

Additive manufacturing technologies producing drug dosage forms aimed at personalized medicine applications are promising strategies with several advantages over the conventional production methods. One of the emerging technologies is 3D printing, which reduces manufacturing steps and thus allows a significant drop in expenses. A decrease in material consumption is also a highly impactful benefit, as the tested drugs are frequently expensive substances. In addition, 3D printed dosage forms enable increased patient compliance and prevent misdosing, as the dosage forms are carefully designed according to the patient’s needs. The incorporation of multiple drugs into a single dosage form further increases the degree of personalization. Our research focuses on the development of 3D printed tablets incorporating multiple drugs (candesartan, losartan) and thermoplastic polymers (e.g., Klucel™ HPC EF). The filaments, an essential feed material for 3D printing, were produced via hot-melt extrusion. Subsequently, the extruded filaments of various formulations were 3D printed into tablets using an FDM 3D printer. We then assessed the influence of the internal structure of the 3D printed tablets and of the formulation on dissolution behaviour by obtaining the dissolution profiles of the drugs present in the 3D printed tablets. In conclusion, we have developed tablets containing multiple drugs providing modified release profiles. The 3D printing experiments demonstrate the high tunability of 3D printing, as each tablet compartment is constructed with a different formulation. Overall, the results suggest that 3D printing technology is a promising manufacturing approach to dual tablet preparation for personalized medicine.
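A minimal sketch of one common way to summarize such release curves - fitting a Weibull dissolution model - is shown below; the time points and release percentages are invented for illustration and are not the study's data.

```python
# Illustrative sketch: fitting a Weibull release model to a hypothetical
# dissolution profile, one common way to summarize the release curves
# measured for 3D printed tablets. The data points below are invented.
import numpy as np
from scipy.optimize import curve_fit

t = np.array([0.5, 1, 2, 4, 6, 8, 12])              # time, h
released = np.array([12, 22, 38, 61, 74, 83, 93])   # % drug released

def weibull(t, a, b):
    """Cumulative % released: 100 * (1 - exp(-(t/a)**b))."""
    return 100.0 * (1.0 - np.exp(-(t / a) ** b))

(a_fit, b_fit), _ = curve_fit(weibull, t, released, p0=[3.0, 1.0])
print(f"scale a = {a_fit:.2f} h, shape b = {b_fit:.2f}")
```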

Keywords: 3D printing, drug delivery, hot-melt extrusion, dissolution kinetics

Procedia PDF Downloads 154
14957 Strategies of Risk Management for Smallholder Farmers in South Africa: A Case Study on Pigeonpea (Cajanus cajan) Production

Authors: Sanari Chalin Moriri, Kwabena Kingsley Ayisi, Alina Mofokeng

Abstract:

Dryland smallholder farmers in South Africa are vulnerable to many kinds of risk, which negatively affect crop productivity and profit. Pigeonpea is a leguminous, multipurpose crop that provides food, fodder, and wood for smallholder farmers. The majority of these farmers still grow pigeonpea from traditional unimproved seeds, which comprise a mixture of genotypes. The objectives of the study were to identify the key risk factors that affect pigeonpea productivity and to develop management strategies to alleviate these risk factors in pigeonpea production. The study was conducted in six municipalities across two provinces of South Africa (Limpopo and Mpumalanga) during the 2020/2021 growing season. A non-probability sampling method using purposive and snowball sampling techniques was used to collect data from the farmers through a structured questionnaire. A total of 114 pigeonpea producers were interviewed individually. Key stakeholders in each municipality were also identified, invited, and interviewed to verify the information given by the farmers. The data collected were analysed using SPSS statistical software, version 25. The findings of the study were that the majority of farmers affected by risk factors were women, subsistence farmers, and elderly farmers, resulting in low food production. Drought, unavailability of improved pigeonpea seeds for planting, limited access to information, and lack of processing equipment were found to be the main risk factors contributing to low crop productivity in farmers’ fields. Over 80% of farmers lacked knowledge of crop improvement and of the processing techniques needed to secure high prices during the crop off-season. Market availability, pricing, and the incidence of pests and diseases were found to be minor risk factors, triggered by the major risk factors; the minor risk factors can be corrected only if the major risk factors are first given the necessary attention. About 10% of the farmers were found to use the crop as a mulch to reduce soil temperatures and to improve soil fertility, but most were unaware of its utilisation as fodder, mulch, medicine, for nitrogen fixation, and more. The risk of frequent drought in dry areas of South Africa, where farmers depend solely on rainfall, poses a serious threat to crop productivity. The majority of these risk factors are driven by climate change, as erratic, low rainfall combined with extreme temperatures threatens food security, water, and the environment. The use of drought-tolerant, multipurpose legume crops such as pigeonpea, access to new information, provision of processing equipment, and support from all stakeholders will help address food security for smallholder farmers. Policies should be revisited to address the prevailing risk factors faced by farmers and to involve them in addressing these factors. Awareness should be prioritized in promoting the crop to improve its production and commercialization in the dryland farming system of South Africa.

Keywords: management strategies, pigeonpea, risk factors, smallholder farmers

Procedia PDF Downloads 195
14956 Relations of Progression in Cognitive Decline with Initial EEG Resting-State Functional Network in Mild Cognitive Impairment

Authors: Chia-Feng Lu, Yuh-Jen Wang, Yu-Te Wu, Sui-Hing Yan

Abstract:

This study aimed at investigating whether the functional brain networks constructed using the initial EEG (obtained when patients first visited the hospital) can be correlated with the progression of cognitive decline, calculated as the change in mini-mental state examination (MMSE) scores between the latest and initial examinations. We integrated the time–frequency cross mutual information (TFCMI) method, to estimate the EEG functional connectivity between cortical regions, with network analysis based on graph theory, to investigate the organization of functional networks in aMCI. Our findings suggested that a more highly integrated functional network with sufficient connection strengths, dense connections between local regions, and high network efficiency in processing information at the initial stage may result in a better prognosis of subsequent cognitive function in aMCI. In conclusion, functional connectivity can be a useful biomarker to assist in the prediction of cognitive decline in aMCI.
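A minimal sketch of the graph-theoretic side of this analysis is given below; plain correlation stands in for the TFCMI connectivity measure, and the EEG data are random, so the numbers are purely illustrative.

```python
# Illustrative sketch: building a functional connectivity graph from multi-channel
# EEG and computing graph-theoretic summaries of network integration.
# Correlation is used as a simple stand-in for TFCMI; the data are synthetic.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
eeg = rng.standard_normal((19, 5000))     # 19 channels x 5000 samples (synthetic)

conn = np.abs(np.corrcoef(eeg))           # channel-by-channel connectivity
np.fill_diagonal(conn, 0.0)

threshold = 0.02                          # keep only the stronger connections
adjacency = (conn > threshold).astype(int)
G = nx.from_numpy_array(adjacency)

print("global efficiency:", nx.global_efficiency(G))
print("mean clustering:  ", nx.average_clustering(G))
```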

Keywords: cognitive decline, functional connectivity, MCI, MMSE

Procedia PDF Downloads 367
14955 A Pattern Practice for Awareness Education on Information Security: Information Security Project

Authors: Fatih Apaydin

Abstract:

Education technology is an area that constantly changes and creates innovations. As an inevitable part of these changing circumstances, societies with a tendency toward improvement keep up with these innovations by using the methods and strategies designed for education technology. At this point, education technology has taken on the responsibility of helping individuals improve themselves and of teaching effective teaching methods by filling the gaps between theoretical information, information security, and practice. Technology has moved to the core of our lives, its importance rising day by day, and it has reinforced its position in computer-based environments. As a result, readiness for technological innovation and the improvement of computer-based knowledge, skills, abilities, and attitudes have to be taught. However, it is now quite difficult to secure and reinforce this information. Information obtained illegally harms society in every respect, especially education. This study covers how, and to what extent, innovative appliances such as computers should be used in computer-based education, and the information security of these appliances. As the use of computers becomes increasingly prevalent in our country, neither education nor computing will become outdated, so how computer-based education affects our lives and how information security applies to this type of education are important topics.

Keywords: computer, information security, education, technology, development

Procedia PDF Downloads 579
14954 Design and Implementation of an Effective Machine Learning Approach to Crime Prediction and Prevention

Authors: Ashish Kumar, Kaptan Singh, Amit Saxena

Abstract:

Today, it is believed that crime has the greatest impact on a person's ability to progress financially and personally. Identifying places where individuals should not go is crucial for preventing crime and is one of the key considerations. As society and technology have advanced significantly, so have crimes and the harm they wreak. When there is a concentration of people in one place and changes happen quickly, crime is even harder to prevent. Because of this, many crime prevention strategies have been embraced as a component of the development of smart cities in numerous cities. However, crimes can occur anywhere; all that is required is to identify the pattern of their occurrence, which will help to lower the crime rate. In this paper, a crime-related analysis has been carried out; information on crimes was collected from all over India and can be accessed from anywhere. The purpose of this paper is to investigate the relationship between several factors and India's crime rate. The review covers information on every state of India and its associated regions for the period 2001-2014. However, various classes of violations show slightly different patterns over the years.
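A minimal sketch comparing the classifiers named in the keywords on a synthetic stand-in dataset is given below; the real crime records and features are not reproduced here.

```python
# Illustrative sketch: comparing the classifiers named in the keywords
# (k-nearest neighbour, decision tree, random forest) on a stand-in dataset.
# Real crime records would replace the synthetic features used here.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=2000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

scaler = StandardScaler().fit(X_train)      # pre-processing step
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

models = {
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "decision tree": DecisionTreeClassifier(random_state=42),
    "random forest": RandomForestClassifier(n_estimators=200, random_state=42),
}
for name, model in models.items():
    print(name, model.fit(X_train, y_train).score(X_test, y_test))
```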

Keywords: K-nearest neighbor, random forest, decision tree, pre-processing

Procedia PDF Downloads 71
14953 How the Writer Tells the Story Should Be the Primary Concern rather than Who Can Write about Whom: The Limits of Cultural Appropriation Vis-à-Vis The Ethics of Narrative Empathy

Authors: Alexandra Cheira

Abstract:

Cultural appropriation has been theorised as a form of colonialism in which members of a dominant culture reduce cultural elements that are deeply meaningful to a minority culture to the category of the “exotic other” since they do not experience the oppression and discriminations faced by members of the minority culture. Yet, in the particular case of literature, writers such as Lionel Shriver and Bernardine Evaristo have argued that authors from a cultural majority have a right to write in the voice of someone from a cultural minority, hence attacking the idea that this is a form of cultural appropriation. By definition, Shriver and Evaristo claim, writers are supposed to write beyond their own culture, gender, class, and/or race. In this light, this paper discusses the limits of cultural appropriation vis-à-vis the ethics of narrative empathy by addressing the mixed critical reception of Kathryn Stockett’s The Help (2009) and Jeanine Cummins’s American Dirt (2020). In fact, both novels were acclaimed as global eye-openers regarding the struggles of South American migrants and African American maids, respectively. At the same time, both novelists have been accused of cultural appropriation by telling a story that is not theirs to tell, given the fact that they are white women telling these stories in what critics have argued is really an American voice telling a story to American readers. These claims will be investigated within the framework of Edward Said’s foundational examination of Orientalism in the field of postcolonial studies as a Western style for authoritatively restructuring the Orient. This means that Orientalist stereotypes regarding Eastern cultures have implicitly validated colonial and imperial pursuits, in the specific context of literary representations of African American and Mexican cultures by white writers. At the same time, the conflicted reception of American Dirt and The Help will be examined within the critical framework of narrative empathy as theorised by Suzanne Keen. Hence, there will be a particular focus on the way a reader’s heated perception that the author’s perspective is purely dishonest can result from friction between an author’s intention and a reader’s experience of narrative empathy, while a shared sense of empathy between authors and readers can be a rousing momentum to move beyond literary response to social action. Finally, in order to assess that “the key question should not be who can write about whom, but how the writer tells the story”, the recent controversy surrounding Dutch author Marieke Lucas Rijneveld’s decision to withdraw from translating American poet Amanda Gorman’s work into Dutch will be duly investigated. In fact, Rijneveld stepped down after journalist and activist Janice Deul criticised Dutch publisher Meulenhoff for choosing a translator who was not also Black, despite the fact that 22-year-old Gorman had selected the 29-year-old Rijneveld herself, as a fellow young writer who had likewise come to fame early on in life. In this light, the critical argument that the controversial reception of The Help reveals as much about US race relations in the early twenty-first century as about the complex literary transactions between individual readers and the novel itself will also be discussed in the extended context of American Dirt and white author Marieke Rijneveld’s withdrawal from the projected translation of Black poet Amanda Gorman.

Keywords: cultural appropriation, cultural stereotypes, narrative empathy, race relations

Procedia PDF Downloads 50
14952 Timescape-Based Panoramic View for Historic Landmarks

Authors: H. Ali, A. Whitehead

Abstract:

Providing a panoramic view of famous landmarks around the world offers artistic and historic value for historians, tourists, and researchers. Exploring the history of famous landmarks by presenting a comprehensive view of a temporal panorama merged with geographical and historical information presents a unique challenge of dealing with images that span a long period, from the 1800s up to the present. This work presents the concept of a temporal panorama through a timeline display of aligned historic and modern images for many famous landmarks. Utilization of this panorama requires a collection of hundreds of thousands of landmark images from the Internet, comprising historic images and modern images of the digital age. These images have to be classified for subset selection to keep the more suitable images that chronologically document a landmark’s history. Processing of historic images captured using older analog technology under various capturing conditions represents a big challenge when they have to be used with modern digital images. Successful processing of historic images to prepare them for the subsequent steps of temporal panorama creation represents an active contribution to cultural heritage preservation through the fulfillment of one of UNESCO's goals in preserving and displaying famous worldwide landmarks.
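A minimal sketch of one standard registration step for such a temporal panorama - matching ORB features between a historic and a modern photograph and estimating a RANSAC homography - is shown below; the file names are placeholders.

```python
# Illustrative sketch: aligning a historic photograph to a modern image of the
# same landmark with ORB features and a RANSAC homography, one standard
# registration step for a temporal panorama. File names are placeholders.
import cv2
import numpy as np

historic = cv2.imread("landmark_1890.jpg", cv2.IMREAD_GRAYSCALE)
modern = cv2.imread("landmark_2020.jpg", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=4000)
kp1, des1 = orb.detectAndCompute(historic, None)
kp2, des2 = orb.detectAndCompute(modern, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]

src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

# Warp the historic image into the modern image's frame for timeline display.
aligned = cv2.warpPerspective(historic, H, (modern.shape[1], modern.shape[0]))
cv2.imwrite("landmark_1890_aligned.jpg", aligned)
```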

Keywords: cultural heritage, image registration, image subset selection, registered image similarity, temporal panorama, timescapes

Procedia PDF Downloads 150
14951 Methods and Algorithms of Ensuring Data Privacy in AI-Based Healthcare Systems and Technologies

Authors: Omar Farshad Jeelani, Makaire Njie, Viktoriia M. Korzhuk

Abstract:

The application of AI-powered algorithms in healthcare continues to flourish. In particular, access to healthcare information, including patient health history, diagnostic data, and PII (Personally Identifiable Information), is paramount in the delivery of efficient patient outcomes. However, as the exchange of healthcare information between patients and healthcare providers through AI-powered solutions increases, protecting a person’s information and privacy has become even more important. Arguably, the increased adoption of healthcare AI has resulted in a significant focus on the risks to, and protection measures for, the security and privacy of healthcare data, leading to escalated analysis and enforcement. Since these challenges arise from the use of AI-based healthcare solutions to manage healthcare data, AI-based data protection measures are used to resolve the underlying problems. Consequently, this project proposes AI-powered safeguards and policies/laws to protect the privacy of healthcare data. The project presents the best-in-class techniques used to preserve the data privacy of AI-powered healthcare applications. Popular privacy-protecting methods such as federated learning, cryptographic techniques, differential privacy methods, and hybrid methods are discussed together with potential cyber threats, data security concerns, and prospects. The project also discusses some of the relevant data security acts/laws that govern the collection, storage, and processing of healthcare data to guarantee that owners’ privacy is preserved. This inquiry discusses various gaps and uncertainties associated with healthcare AI data collection procedures and identifies potential correction/mitigation measures.
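As a small, concrete example of one of the methods mentioned above, the sketch below applies the Laplace mechanism of differential privacy to a count query over a few invented patient records; the epsilon values and records are assumptions for illustration.

```python
# Illustrative sketch: the Laplace mechanism, one of the differential privacy
# techniques mentioned above, applied to a count query over hypothetical
# patient records. Epsilon values and records are invented for illustration.
import numpy as np

rng = np.random.default_rng()

def laplace_count(true_count, epsilon, sensitivity=1.0):
    """Return a differentially private count: add Laplace(sensitivity/epsilon) noise."""
    return true_count + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

records = [{"age": 54, "diabetic": True}, {"age": 61, "diabetic": False},
           {"age": 47, "diabetic": True}]
true_count = sum(r["diabetic"] for r in records)

# Smaller epsilon = stronger privacy guarantee = noisier answer.
print("private count (eps=0.5):", laplace_count(true_count, epsilon=0.5))
print("private count (eps=5.0):", laplace_count(true_count, epsilon=5.0))
```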

Keywords: data privacy, artificial intelligence (AI), healthcare AI, data sharing, healthcare organizations (HCOs)

Procedia PDF Downloads 62
14950 Material Characterization of Medical Grade Woven Bio-Fabric for Use in ABAQUS *FABRIC Material Model

Authors: Lewis Wallace, William Dempster, David Nash, Alexandros Boukis, Craig Maclean

Abstract:

This paper, through traditional test methods and close adherence to international standards, presents a characterization study of a woven polyethylene terephthalate (PET) fabric. Testing is performed in the axial, shear, and out-of-plane (bending) directions, and the results are fitted to the *FABRIC material model within ABAQUS FEA. The non-linear behaviors of the fabric in the axial and shear directions, and its behaviors on the macro scale, are explored at the meso-scale level. The medical-grade bio-fabric is tested in untreated and heat-treated forms, and deviations are closely analyzed at the micro, meso, and macro scales to determine the effects of the process. The heat-treatment process was found to increase the stiffness of the fabric in axial and bending stiffness testing but had a negligible effect on the shear response. The ability of *FABRIC to capture behaviors unique to fabric deformation is discussed, whereby its phenomenological input can accurately represent the experimentally derived inputs.

Keywords: experimental techniques, FEA modelling, materials characterization, post-processing techniques

Procedia PDF Downloads 81
14949 Cultural Competence and Healthcare Challenges of Migrants in South Wales United Kingdom

Authors: Qirat Naz, Abasiokpon Udoakah

Abstract:

In developed countries, global migration is diversifying. The minority ethnic population includes refugees and asylum seekers who fled their home countries due to war, terrorism, oppression, or natural disasters and for whom returning home is dangerous. They need sanctuary and a peaceful environment in host countries. They begin the process of acculturation, in which a person adopts the social mores and behavioral patterns of the dominant culture, yet they still have unique multicultural needs that the dominant society fails to address. The aim of this research is to provide a holistic understanding of the lived experiences of a minority population, particularly migrants, including asylum seekers and refugees, in the health and social care system of South Wales. The purpose of this study is to investigate three research objectives: the multicultural health care needs of minorities; the barriers they face in seeking health and social care; and the implications and impact, for this population, of the Welsh policies that promote cultural competence in the health and social care sectors. The study will be conducted using qualitative research methods, tools, and techniques, following an inductive approach aimed at developing grounded theory. The sample will be divided into two groups, migrants and professionals providing any kind of service to migrants, with each group containing 30 participants. Interpretive phenomenological analysis will be utilized during the process of coding and developing the main themes of the research. The positionality of the researcher will be minimized through unloaded and open-ended questions, the researcher's prior research experience, continuous evaluation of her positionality, daily reflection on fieldwork, and the help of male and female gatekeepers. The research findings will be based on an emic perspective, and by documenting the emic perspective of minorities, this research will contribute knowledge through appropriate channels, including organizations, academics, and policymakers, to discover possible solutions and coping mechanisms for dealing with the challenges and meeting the multicultural demands of minorities. This research will provide a more in-depth understanding of minorities and will help to promote the diversity of health and social care in South Wales.

Keywords: migration, migrants, cultural competence, cultural barriers, healthcare challenges

Procedia PDF Downloads 48
14948 Monomial Form Approach to Rectangular Surface Modeling

Authors: Taweechai Nuntawisuttiwong, Natasha Dejdumrong

Abstract:

Geometric modeling plays an important role in the construction and manufacturing of curves, surfaces and solids. Its algorithms are critically important not only in the automobile, ship and aircraft manufacturing business, but are also absolutely necessary in a wide variety of modern applications, e.g., robotics, optimization, computer vision, data analytics and visualization. The calculation and display of geometric objects can be accomplished by these six techniques: polynomial basis, recursive, iterative, coefficient matrix, polar form approach and pyramidal algorithms. In this research, the coefficient matrix (simply called the monomial form approach) will be used to model polynomial rectangular patches, i.e., Said-Ball, Wang-Ball, DP, Dejdumrong and NB1 surfaces. Some examples of the monomial forms for these surface models are illustrated in many aspects, e.g., construction, derivatives, model transformation, degree elevation and degree reduction.
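A minimal sketch of evaluating a rectangular patch in monomial (coefficient-matrix) form, together with a partial derivative, is given below; the 4x4 coefficient matrix is arbitrary and chosen only to show the mechanics.

```python
# Illustrative sketch: evaluating a rectangular polynomial patch written in
# monomial (coefficient-matrix) form, S(u, v) = sum_ij A[i, j] * u**i * v**j.
# The 4x4 coefficient matrix below is arbitrary, not one of the cited surfaces.
import numpy as np

A = np.array([[0.0, 0.0, 1.0, 0.0],     # monomial coefficients a_ij
              [1.0, 2.0, 0.0, 0.0],
              [0.0, 0.0, 0.5, 0.0],
              [0.0, 1.0, 0.0, 0.2]])

def eval_patch(A, u, v):
    """S(u, v) = U A V^T with U = [1, u, u^2, ...] and V = [1, v, v^2, ...]."""
    n, m = A.shape
    U = np.array([u ** i for i in range(n)])
    V = np.array([v ** j for j in range(m)])
    return U @ A @ V

def eval_patch_du(A, u, v):
    """Partial derivative in u, obtained by differentiating the monomial basis."""
    n, m = A.shape
    dU = np.array([i * u ** max(i - 1, 0) for i in range(n)])
    V = np.array([v ** j for j in range(m)])
    return dU @ A @ V

print(eval_patch(A, 0.3, 0.7), eval_patch_du(A, 0.3, 0.7))
```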

Keywords: monomial forms, rectangular surfaces, CAGD curves, monomial matrix applications

Procedia PDF Downloads 136
14947 Information Overload, Information Literacy and Use of Technology by Students

Authors: Elena Krelja Kurelović, Jasminka Tomljanović, Vlatka Davidović

Abstract:

The development of web technologies and mobile devices makes creating, accessing, using and sharing information, and communicating with each other, simpler every day. However, while the amount of information is constantly increasing, it is becoming harder to effectively organize and find quality information despite the availability of web search engines, filtering and indexing tools. Although digital technologies have an overall positive impact on students’ lives, frequent use of these technologies and of digital media enriched with dynamic hypertext and hypermedia content, as well as multitasking and distractions caused by notifications, calls or messages, can decrease the attention span and make thinking, memorizing and learning more difficult, which can lead to stress and mental exhaustion. This is referred to as “information overload”, “information glut” or “information anxiety”. The objective of this study is to determine whether students show signs of information overload and to identify possible predictors. Research was conducted using a questionnaire developed for the purpose of this study. The results show that students frequently use technology (computers, gadgets and digital media), while they show a moderate level of information literacy, and they have sometimes experienced symptoms of information overload. According to the statistical analysis, a higher frequency of technology use and a lower level of information literacy are correlated with greater information overload. Multiple regression analysis confirmed that the combination of these two independent variables has statistically significant predictive capacity for information overload. Therefore, information science teachers should pay attention to improving the level of students’ information literacy and educate them about the risks of excessive technology use.
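A minimal sketch of a two-predictor regression of this kind is shown below; all scores are synthetic, so the coefficients are illustrative only and are not the survey results.

```python
# Illustrative sketch: a two-predictor linear regression of the kind reported in
# the abstract (technology-use frequency and information literacy predicting
# information overload). All scores below are synthetic, not the survey data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 120
tech_use = rng.uniform(1, 5, n)         # frequency of technology use
info_literacy = rng.uniform(1, 5, n)    # information literacy score
overload = 1.0 + 0.6 * tech_use - 0.4 * info_literacy + rng.normal(0, 0.5, n)

X = sm.add_constant(np.column_stack([tech_use, info_literacy]))
model = sm.OLS(overload, X).fit()
print(model.summary())                  # coefficients, R^2, p-values
```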

Keywords: information overload, computers, mobile devices, digital media, information literacy, students

Procedia PDF Downloads 260
14946 Rapid Processing Techniques Applied to Sintered Nickel Battery Technologies for Utility Scale Applications

Authors: J. D. Marinaccio, I. Mabbett, C. Glover, D. Worsley

Abstract:

Through the use of novel rapid processing techniques such as screen printing and near-infrared (NIR) radiative curing, the sintering process time for sintered nickel plaques, applicable to alkaline nickel battery chemistries, has been drastically reduced from in excess of 200 minutes with conventional convection methods to below 2 minutes using NIR curing methods. Steps have also been taken to remove the need for forming gas as a reducing agent by implementing carbon as an in-situ reducing agent within the ink formulation.

Keywords: batteries, energy, iron, nickel, storage

Procedia PDF Downloads 427
14945 A Review of Information Systems Development in Developing Countries

Authors: B. N. Asare, O. A. Ajigini

Abstract:

Information systems (IS) are highly important in the operation of private and public organisations in both developing and developed countries. Developing countries are saddled with many project failures during the implementation of information systems, yet successful information systems are greatly needed in developing countries in order to enhance their economies. This paper is important in view of the high failure rate of information systems in developing countries, which needs to be reduced to minimally acceptable levels by means of recommended interventions. The paper centres on a review of IS development in developing countries, presents evidence of IS successes and failures in these countries, and posits a model to address the failures. The proposed model can then be utilised by developing countries to reduce their IS project implementation failure rate. A comparison is drawn between IS development in developing countries and developed countries. The paper provides valuable information to assist in reducing IS failure and in developing IS models and theories on IS development for developing countries.

Keywords: developing countries, information systems, IS development, information systems failure, information systems success, information systems success model

Procedia PDF Downloads 352
14944 Numerical Implementation and Testing of Fractioning Estimator Method for the Box-Counting Dimension of Fractal Objects

Authors: Abraham Terán Salcedo, Didier Samayoa Ochoa

Abstract:

This work presents a numerical implementation of a method, named the fractioning estimator, for estimating the box-counting dimension of self-avoiding curves on a planar space, i.e., fractal objects captured in digital images. Classical methods of digital image processing, such as noise filtering, contrast manipulation, and thresholding, among others, are used in order to obtain binary images that are suitable for performing the necessary computations of the fractioning estimator. A user interface is developed for performing the image processing operations and testing the fractioning estimator on different captured images of real-life fractal objects. To analyze the results, the estimates obtained through the fractioning estimator are compared to the results obtained through other methods already implemented in different available software for computing and estimating the box-counting dimension.
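For comparison, a plain box-counting estimate of the kind the fractioning estimator is benchmarked against can be computed as in the sketch below; this is not the authors' fractioning method itself.

```python
# Illustrative sketch: a plain box-counting estimate of the fractal dimension of
# a binary image, the quantity the fractioning estimator is compared against.
import numpy as np

def box_counting_dimension(binary_img, box_sizes=(2, 4, 8, 16, 32, 64)):
    counts = []
    for s in box_sizes:
        h, w = binary_img.shape
        # Trim so the image tiles exactly into s x s boxes, then count occupied boxes.
        trimmed = binary_img[: h - h % s, : w - w % s]
        blocks = trimmed.reshape(trimmed.shape[0] // s, s, trimmed.shape[1] // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())
    # Slope of log N(s) versus log(1/s) estimates the box-counting dimension.
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope

# Quick self-check on a filled square (expected dimension close to 2).
img = np.zeros((256, 256), dtype=bool)
img[64:192, 64:192] = True
print(box_counting_dimension(img))
```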

Keywords: box-counting, digital image processing, fractal dimension, numerical method

Procedia PDF Downloads 68
14943 Bird-Adapted Filter for Avian Species and Individual Identification Systems Improvement

Authors: Ladislav Ptacek, Jan Vanek, Jan Eisner, Alexandra Pruchova, Pavel Linhart, Ludek Muller, Dana Jirotkova

Abstract:

One of the essential steps in avian song processing is signal filtering. Currently, the standard methods of filtering are the Mel bank filter or a linear filter distribution. In this article, a new type of filter bank, called the Bird-Adapted Filter, is introduced, whereby the signal filtering is modifiable, based upon a new mathematical description of audiograms for a particular bird species or order, which was named the Avian Audiogram Unified Equation. With this method, filters may be deliberately distributed by frequency: the filters are more concentrated in bands of higher sensitivity, where more information is expected to be transmitted, and vice versa. Further, a comparison of various filters for automatic individual recognition of the chiffchaff (Phylloscopus collybita) is demonstrated. The average Equal Error Rate (EER) was 16.23% for the linear bank filter and 18.71% for the Mel bank filter, while the Bird-Adapted Filter gave 14.29%, and the Bird-Adapted Filter with 1/3 modification gave 12.95%. This approach would be useful in practical automatic systems for avian species and individual identification. Since Bird-Adapted Filter filtration is based on the measured audiograms of particular species or orders, selecting the distribution according to the avian vocalization provides the most precise filter distribution to date.
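A minimal sketch of the underlying mechanism - a triangular filter bank whose centre frequencies follow an arbitrary, sensitivity-driven warping - is given below; the warping used is a simple placeholder, not the Avian Audiogram Unified Equation.

```python
# Illustrative sketch: a triangular filter bank whose centre frequencies follow an
# arbitrary warping function, the general mechanism behind an audiogram-driven
# (species-adapted) filter distribution. The warping here is a placeholder only.
import numpy as np

def triangular_filterbank(centres_hz, n_fft=1024, sample_rate=44100):
    """Return an array of triangular filters, shape (n_filters, n_fft//2 + 1)."""
    freqs = np.linspace(0, sample_rate / 2, n_fft // 2 + 1)
    pts = np.concatenate(([0.0], centres_hz, [sample_rate / 2]))
    bank = np.zeros((len(centres_hz), freqs.size))
    for i in range(1, len(pts) - 1):
        left, centre, right = pts[i - 1], pts[i], pts[i + 1]
        up = (freqs - left) / (centre - left)
        down = (right - freqs) / (right - centre)
        bank[i - 1] = np.clip(np.minimum(up, down), 0.0, None)
    return bank

# Placeholder warping: concentrate filters around 2-6 kHz, where many songbirds
# are most sensitive, instead of spacing them linearly or on the mel scale.
centres = 2000.0 + 4000.0 * np.linspace(0, 1, 24) ** 1.5
fb = triangular_filterbank(centres)
print(fb.shape)   # (24, 513)
```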

Keywords: avian audiogram, bird individual identification, bird song processing, bird species recognition, filter bank

Procedia PDF Downloads 375
14942 Predictive Analytics in Oil and Gas Industry

Authors: Suchitra Chnadrashekhar

Abstract:

Earlier regarded as a support function in an organization, information technology has now become a critical utility for managing daily operations. Organizations are processing huge amounts of data, which was unimaginable a few decades ago. This has opened the opportunity for the IT sector to help industries across domains handle data in the most intelligent manner. The presence of IT has given the Oil & Gas industry the leverage to store, manage and process data in the most efficient way possible, thus deriving economic value in day-to-day operations. Proper synchronization between operational data systems and information technology systems is the need of the hour. Predictive analytics supports oil and gas companies by addressing the challenges of critical equipment performance, life cycle, integrity, and security, and by increasing equipment utilization. Predictive analytics goes beyond early warning by providing insights into the roots of problems. To reach their full potential, oil and gas companies need to take a holistic, or systems, approach towards asset optimization and thus have the functional information at all levels of the organization in order to make the right decisions. This paper discusses how the use of predictive analytics in the oil and gas industry is redefining the dynamics of this sector. The paper is supported by real-time data and an evaluation of the data for a given oil production asset using an application tool, SAS. The reason for using SAS for our analysis is that SAS provides an analytics-based framework to improve the uptime, performance and availability of crucial assets while reducing the amount of unscheduled maintenance, thus minimizing maintenance-related costs and operational disruptions. With state-of-the-art analytics and reporting, we can predict maintenance problems before they happen and determine root causes in order to update processes for future prevention.

Keywords: hydrocarbon, information technology, SAS, predictive analytics

Procedia PDF Downloads 337
14941 River Stage-Discharge Forecasting Based on Multiple-Gauge Strategy Using EEMD-DWT-LSSVM Approach

Authors: Farhad Alizadeh, Alireza Faregh Gharamaleki, Mojtaba Jalilzadeh, Houshang Gholami, Ali Akhoundzadeh

Abstract:

This study presents a hybrid pre-processing approach along with a conceptual model to enhance the accuracy of river discharge prediction. To achieve this goal, the Ensemble Empirical Mode Decomposition (EEMD) algorithm, the Discrete Wavelet Transform (DWT) and Mutual Information (MI) were employed as a hybrid pre-processing approach conjugated to a Least Squares Support Vector Machine (LSSVM). A conceptual strategy, namely a multi-station model, was developed to forecast the Souris River discharge more accurately. The strategy used herein was capable of covering the uncertainties and complexities of river discharge modeling. DWT and EEMD were coupled, and feature selection was performed on the decomposed sub-series using MI, to be employed in the multi-station model. In the proposed feature selection method, some uninformative sub-series were omitted to achieve better performance. The results confirmed the efficiency of the proposed DWT-EEMD-MI approach in improving the accuracy of multi-station modeling strategies.
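A minimal sketch of this kind of pre-processing chain is given below: wavelet decomposition of an upstream series, mutual-information screening of the sub-series, and a kernel regression fit. SVR with an RBF kernel stands in for LSSVM, the EEMD step is omitted, and the series is synthetic.

```python
# Illustrative sketch of the pre-processing chain described above: DWT of an
# upstream discharge series, mutual-information screening of the sub-series,
# and a kernel regression fit. SVR stands in for LSSVM; the data are synthetic.
import numpy as np
import pywt
from sklearn.feature_selection import mutual_info_regression
from sklearn.svm import SVR

rng = np.random.default_rng(7)
n = 512
upstream = np.sin(np.linspace(0, 20 * np.pi, n)) + 0.3 * rng.standard_normal(n)
downstream = np.roll(upstream, 5) + 0.2 * rng.standard_normal(n)   # lagged target

# Multi-level DWT: each coefficient set is reconstructed back to signal length
# so it can be used directly as a candidate predictor sub-series.
coeffs = pywt.wavedec(upstream, "db4", level=3)
subseries = []
for i in range(len(coeffs)):
    keep = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
    subseries.append(pywt.waverec(keep, "db4")[:n])
X = np.column_stack(subseries)

# Mutual information screens out sub-series carrying little information about the target.
mi = mutual_info_regression(X, downstream, random_state=7)
top = np.argsort(mi)[-2:]          # keep the two most informative sub-series
X_selected = X[:, top]

model = SVR(kernel="rbf", C=10.0).fit(X_selected[:400], downstream[:400])
print("test R^2:", model.score(X_selected[400:], downstream[400:]))
```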

Keywords: river stage-discharge process, LSSVM, discrete wavelet transform, Ensemble Empirical Mode Decomposition, multi-station modeling

Procedia PDF Downloads 160
14940 Mechanical Analysis and Characterization of Friction Stir Processed Aluminium Alloy

Authors: Jaswinder Kumar, Kulbir Singh Sandhu

Abstract:

Friction stir processing (FSP) is a solid-state surface processing technique. Single-pass FSP was performed on an aluminium alloy at different tool rotational speeds using a cylindrical threaded-pin profiled tool. The effect of these parameters on tribological properties was studied. The wear resistance was found to increase from the base metal to the single-pass FSP sample. The results revealed that the wear rate increases with an increase in tool rotational speed: high heat generation causes matrix softening, which results in an increased wear rate; on the other hand, high heat generation leads to coarse grains, which also affects tribological properties. Furthermore, microstructure results showed that the FSPed alloy has a more refined grain structure compared to the base material, which may have resulted in the enhancement of hardness and wear resistance after FSP.

Keywords: friction stir processing, aluminium alloy, microhardness, microstructure

Procedia PDF Downloads 90
14939 Enterprise Security Architecture: Approaches and a Framework

Authors: Amir Mohtarami, Hadi Kandjani

Abstract:

The amount of business-critical information in enterprises is growing at an extraordinary rate, and the ability to catalog that information and properly protect it using traditional security mechanisms is not keeping pace. Alongside information technology (IT), information security needs a holistic view within the enterprise. In other words, a comprehensive architectural approach is required, focusing on the information itself: understanding what the data are, who owns them, and which business and regulatory policies should be applied to the information. Enterprise architecture frameworks provide useful tools to grasp the different dimensions of IT in organizations. Usually this is done through layered views of the IT architecture, but security has not received the requisite attention in these frameworks. In this paper, after a brief look at enterprise architecture (EA), we discuss the issue of security in the overall enterprise IT architecture. Due to the increasing importance of security, a rigorous EA program in an enterprise should be able to consider security architecture as an integral part of its processes and give a visible roadmap and blueprint for this aim.

Keywords: enterprise architecture, architecture framework, security architecture, information systems

Procedia PDF Downloads 688
14938 Standardized Description and Modeling Methods of Semiconductor IP Interfaces

Authors: Seongsoo Lee

Abstract:

IP reuse is an effective design methodology in modern SoC design for reducing effort and time. However, the description and modeling methods for IP interfaces differ from one IP designer to another. In this paper, standardized description and modeling methods for IP interfaces are proposed. The description consists of 11 items: IP information, model provision, data type, description level, interface information, port information, signal information, protocol information, modeling level, modeling information, and source file. The proposed description and modeling methods enable easy understanding, simulation, verification, and modification in IP reuse.
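One possible machine-readable rendering of the 11 items as a data structure is sketched below; the field names and example values are assumptions for illustration, not the proposed standard itself.

```python
# Illustrative sketch: one possible rendering of the 11 description items as a
# Python data structure. Field names and the example values are assumptions,
# not the authors' standard.
from dataclasses import dataclass, field
from typing import List

@dataclass
class PortInfo:
    name: str
    direction: str          # "in", "out", "inout"
    width: int
    data_type: str

@dataclass
class IPInterfaceDescription:
    ip_information: str
    model_provision: str
    data_type: str
    description_level: str
    interface_information: str
    ports: List[PortInfo] = field(default_factory=list)
    signal_information: str = ""
    protocol_information: str = ""
    modeling_level: str = ""
    modeling_information: str = ""
    source_file: str = ""

uart_if = IPInterfaceDescription(
    ip_information="example-uart-v1",
    model_provision="RTL + behavioral",
    data_type="logic",
    description_level="signal",
    interface_information="APB slave",
    ports=[PortInfo("pclk", "in", 1, "logic"), PortInfo("prdata", "out", 32, "logic")],
    protocol_information="AMBA APB",
    source_file="uart_apb.sv",
)
print(uart_if.interface_information, len(uart_if.ports))
```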

Keywords: interface, standardization, description, modeling, semiconductor IP

Procedia PDF Downloads 484
14937 Improved Processing Speed for Text Watermarking Algorithm in Color Images

Authors: Hamza A. Al-Sewadi, Akram N. A. Aldakari

Abstract:

Copyright protection and ownership proof of digital multimedia are achieved nowadays by digital watermarking techniques. A text watermarking algorithm for protecting the property rights and ownership judgment of color images is proposed in this paper. Embedding is achieved by inserting text elements randomly into the color image as noise. The YIQ image processing model is found to be faster than other image processing methods, and hence it is adopted for the embedding process. An optional choice of encrypting the text watermark before embedding is also suggested (in case required by some applications), whereby the text can be encrypted using any enciphering technique, adding more difficulty for hackers. Experiments resulted in an embedding speed of more than double that of the other considered systems (such as the least significant bit method and separate color code methods), and a fairly acceptable level of peak signal-to-noise ratio (PSNR) with low mean square error values for watermarking purposes.
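A simplified sketch of the YIQ step is given below: the image is converted to YIQ and watermark bits are scattered into the luminance channel at key-derived pseudo-random positions. It is a stand-in to show the colour-space conversion, not the paper's exact embedding scheme.

```python
# Illustrative sketch: convert an RGB image to YIQ and scatter watermark bits into
# the luminance channel at pseudo-random positions derived from a key. This is a
# simplified stand-in for the YIQ step, not the paper's exact embedding scheme.
import numpy as np

RGB_TO_YIQ = np.array([[0.299, 0.587, 0.114],
                       [0.596, -0.274, -0.322],
                       [0.211, -0.523, 0.312]])

def embed(rgb, text, key=1234, strength=2.0):
    yiq = rgb.astype(float) @ RGB_TO_YIQ.T              # per-pixel colour transform
    h, w, _ = yiq.shape
    rng = np.random.default_rng(key)                    # the key drives the positions
    positions = rng.choice(h * w, size=len(text) * 8, replace=False)
    bits = np.unpackbits(np.frombuffer(text.encode("ascii"), dtype=np.uint8))
    flat_y = yiq[:, :, 0].reshape(-1)
    flat_y[positions] += strength * (2.0 * bits - 1.0)  # +/- perturbation as "noise"
    yiq[:, :, 0] = flat_y.reshape(h, w)
    return yiq @ np.linalg.inv(RGB_TO_YIQ).T            # back to RGB

rgb = np.random.default_rng(0).integers(0, 256, size=(256, 256, 3))
watermarked = embed(rgb, "owner: Jane Doe")
print(np.abs(watermarked - rgb).mean())
```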

Keywords: steganography, watermarking, time complexity measurements, private keys

Procedia PDF Downloads 127
14936 Large-Scale Simulations of Turbulence Using Discontinuous Spectral Element Method

Authors: A. Peyvan, D. Li, J. Komperda, F. Mashayek

Abstract:

Turbulence can be observed in a variety of fluid motions in nature and industrial applications. Recent investment in high-speed aircraft and propulsion systems has revitalized fundamental research on turbulent flows. In these systems, capturing chaotic fluid structures with different length and time scales is accomplished through the Direct Numerical Simulation (DNS) approach, since it accurately simulates flows down to the smallest dissipative scales, i.e., Kolmogorov’s scales. The discontinuous spectral element method (DSEM) is a high-order technique that uses spectral functions for approximating the solution. The DSEM code has been developed by our research group over the course of more than two decades. Recently, the code has been improved to run large cases on the order of billions of solution points. Running big simulations requires a considerable amount of RAM. Therefore, the DSEM code must be highly parallelized and able to start on multiple computational nodes of an HPC cluster with distributed memory. However, some pre-processing procedures, such as determining global element information, creating a global face list, and assigning global partitioning and element connection information of the domain for communication, must be done sequentially with a single processing core. A separate code has been written to perform the pre-processing procedures on a local machine. It stores the minimum amount of information required for the DSEM code to start in parallel, extracted from the mesh file, into text files (pre-files). It packs integer-type information in a stream binary format into pre-files that are portable between machines. The files are generated to ensure fast read performance on different file systems, such as Lustre and the General Parallel File System (GPFS). A new subroutine has been added to the DSEM code to read the startup files using parallel MPI I/O, for Lustre, in such a way that each MPI rank acquires its information from the file in parallel. In the case of GPFS, on each computational node a single MPI rank reads data from the file, which is specifically generated for that computational node, and sends them to the other ranks on the node using point-to-point non-blocking MPI communication. This way, communication takes place locally on each node and signals do not cross the switches of the cluster. The read subroutine has been tested on Argonne National Laboratory’s Mira (GPFS), the National Center for Supercomputing Applications’ Blue Waters (Lustre), the San Diego Supercomputer Center’s Comet (Lustre), and UIC’s Extreme (Lustre). The tests showed that one file per node is suited to GPFS, while parallel MPI I/O is the best choice for the Lustre file system. The DSEM code relies on heavily optimized linear algebra operations, such as matrix-matrix and matrix-vector products, for the calculation of the solution at every time step. For this, the code can make use of either its own matrix math library, BLAS, Intel MKL, or ATLAS. This fact and the discontinuous nature of the method make the DSEM code run efficiently in parallel. The results of weak scaling tests performed on Blue Waters showed scalable and efficient performance of the code in parallel computing.
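A minimal sketch of the Lustre-style start-up read, using mpi4py rather than the Fortran/C of the actual code, is given below; offsets, record sizes, and the file name are placeholders.

```python
# Illustrative sketch of the parallel start-up read described above, using mpi4py:
# every rank reads its own slice of a shared binary pre-file with MPI I/O.
# Offsets, record sizes, and the file name are placeholders, not the DSEM layout.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

ints_per_rank = 1024                        # assumed fixed-size integer record per rank
bytes_per_rank = ints_per_rank * 4          # 32-bit integers

fh = MPI.File.Open(comm, "startup.pre", MPI.MODE_RDONLY)
buf = np.empty(ints_per_rank, dtype=np.int32)
fh.Read_at_all(rank * bytes_per_rank, buf)  # collective read: each rank gets its slice
fh.Close()

# Each rank now holds its own connectivity block; print a checksum as a sanity check.
print(f"rank {rank}/{size}: checksum {int(buf.sum())}")
```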

Keywords: computational fluid dynamics, direct numerical simulation, spectral element, turbulent flow

Procedia PDF Downloads 121
14935 Major Constraints to Adoption of Improved Post-harvest Technologies among Smallholder Farmers in Developing Countries: A Systematic Review

Authors: Muganyizi Jonas Bisheko, G. Rejikumar

Abstract:

Reducing post-harvest losses could be a sustainable solution to enhance the food and income security of smallholder farmers in developing countries. While various research institutions have come up with a number of innovative post-harvest technologies for reducing post-harvest losses, most of them have not been extensively adopted by smallholder farmers. Despite this, synthesized information about the major constraints on post-harvest technology adoption is scarce. This study was conducted to fill this gap and to show the implications of the findings for future post-harvest research. The search strategy developed retrieved 2201 studies; after excluding duplicates and completing title, abstract and full-article screening, a total of 41 documents were identified. The major findings are: (i) there is an outstanding deficiency of systematic evidence on the effect of climate change, off-farm income and sources of post-harvest information on the adoption of improved post-harvest technologies; (ii) there is very limited information on adoption constraints pertaining to matters of policy, rules and regulations; (iii) the literature on behavioral constraints associated with limited adoption of improved post-harvest technologies is very thin; (iv) most of the studies focused on post-harvest storage technologies (47%), followed by overall post-harvest management practices (25%), processing technologies (19%) and packaging technologies (3%), and much of the information concerned cereals (58%), especially maize (44%); (v) geographically, Sub-Saharan Africa accounted for 79% of the reviewed interventions, while South Asia accounted for only 21%. The findings of this review are intended to guide post-harvest technologists and decision-makers in addressing the challenge of huge post-harvest losses.

Keywords: constraints, post-harvest loss, post-harvest technology, smallholder farmer

Procedia PDF Downloads 212
14934 Bacteriophage Lysis of Physiologically Stressed Listeria monocytogenes in a Simulated Seafood Processing Environment

Authors: Geevika J. Ganegama Arachchi, Steve H. Flint, Lynn McIntyre, Cristina D. Cruz, Beatrice M. Dias-Wanigasekera, Craig Billington, J. Andrew Hudson, Anthony N. Mutukumira

Abstract:

In seafood processing plants, Listeria monocytogenes (L. monocytogenes) likely exists in a metabolically stressed state due to the nutrient-deficient environment; processing treatments such as heating, curing, drying, and freezing; and exposure to detergents and disinfectants. Stressed L. monocytogenes cells have been shown to be as pathogenic as unstressed cells. This study investigated the lytic efficacy of three phages (LiMN4L, LiMN4p, and LiMN17), previously characterized as virulent, against physiologically stressed cells of three seafood-borne L. monocytogenes strains (19CO9, 19DO3, and 19EO3). Physiologically compromised cells of the L. monocytogenes strains were prepared by ageing cultures in Trypticase Soy Broth at 15±1°C for 72 h; heat-injuring cultures at 54±1 - 55±1°C for 40 - 60 min; starving cultures in Milli-Q water at 25±1°C in darkness for three weeks; and salt-stressing cultures in 9% (w/v) NaCl at 15±1°C for 72 h. Low concentrations of the physiologically compromised cells of the three L. monocytogenes strains were challenged in vitro with high titres of the three phages in separate experiments using Fish Broth medium (aqueous fish extract) at 15°C in order to mimic the environment of a seafood processing plant. Each phage, when present at ≈9 log10 PFU/ml, reduced late exponential phase cells of L. monocytogenes suspended in fish protein broth at ≈2-3 log10 CFU/ml to a non-detectable level (< 10 CFU/ml). Each phage, when present at ≈8.5 log10 PFU/ml, reduced both heat-injured cells present at 2.5-3.6 log10 CFU/ml and starved cells, which showed a coccoid shape, present at ≈2-3 log10 CFU/ml, to < 10 CFU/ml after 30 min. The phages also reduced salt-stressed cells present at ≈3 log10 CFU/ml by > 2 log10. L. monocytogenes (≈8 log10 CFU/ml) were reduced to below the detection limit (1 CFU/ml) by three successive phage infections over 16 h, indicating that the emergence of spontaneous phage resistance was infrequent. The three virulent phages showed high decontamination potential for physiologically stressed L. monocytogenes strains in seafood processing environments.

Keywords: physiologically stressed L. monocytogenes, heat injured, seafood processing environment, virulent phage

Procedia PDF Downloads 118
14933 Principles and Practice of Therapeutic Architecture

Authors: Umedov Mekhroz, Griaznova Svetlana

Abstract:

The quality of life and well-being of patients, staff and visitors are central to the delivery of health care, and architecture and design are becoming an integral part of the healing and recovery approach. The most significant point that can be implemented in hospital buildings is the therapeutic value of the artificial environment, including the design and integration of plants to bring the natural world into the healthcare environment. The hospital environment should feel as comfortable as home. The techniques that therapeutic architecture uses are very cheap but provide real benefit to patients, staff and visitors, demonstrating that the difference lies not in cost but in design quality. The best environment is not necessarily more expensive - it is about the deliberate use of light and color, the rational use of materials and the flexibility of premises. All this forms innovative concepts in modern hospital architecture, whether in new construction, renovation or expansion projects. The aim of the study is to identify the methods and principles of therapeutic architecture. The research methodology consists of studying and summarizing international experience from scientific research, literature, standards, methodological manuals and project materials on the research topic. The result of the research is the development of graphic-analytical tables based on a systematic analysis of the processed information, and 3D visualization of hospital interiors based on that information.

Keywords: therapeutic architecture, healthcare interiors, sustainable design, materials, color scheme, lighting, environment.

Procedia PDF Downloads 104
14932 Limbic Involvement in Visual Processing

Authors: Deborah Zelinsky

Abstract:

The retina filters millions of incoming signals into a smaller number of exiting optic nerve fibers that travel to different portions of the brain. Most of the signals are for eyesight (called "image-forming" signals). However, there are other, faster signals that travel "elsewhere" and are not directly involved with eyesight (called "non-image-forming" signals). This article centers on the neurons of the optic nerve connecting to parts of the limbic system. Eye care providers currently look at parvocellular and magnocellular processing pathways without realizing that those are part of an enormous "galaxy" of all the body systems. Lenses modify both non-image-forming and image-forming pathways, taking A.M. Skeffington's seminal work one step further. Almost 100 years ago, he described the Where am I (orientation), Where is It (localization), and What is It (identification) pathways. Now, among others, there is a How am I (animation) and a Who am I (inclination, motivation, imagination) pathway. Classic eye testing considers pupils and often assesses posture and motion awareness, but classical prescriptions often overlook limbic involvement in visual processing. The limbic system is composed of the hippocampus, amygdala, hypothalamus, and anterior nuclei of the thalamus. The optic nerve's limbic connections arise from the intrinsically photosensitive retinal ganglion cells (ipRGC) through the "retinohypothalamic tract" (RHT). There are two main hypothalamic nuclei with direct photic inputs: the suprachiasmatic nucleus and the paraventricular nucleus. Other hypothalamic nuclei connected with retinal function, including mood regulation, appetite, and glucose regulation, are the supraoptic nucleus and the arcuate nucleus. The retinohypothalamic tract is often overlooked when we prescribe eyeglasses. Each person is different, but the lenses we choose influence this fast processing, which affects each patient's aiming and focusing abilities. These signals arise from the ipRGC cells, which were only discovered 20+ years ago, and standard assessments do not address the campana retinal interneurons, which were only discovered 2 years ago. As eyecare providers, we are unknowingly altering such factors as lymph flow, glucose metabolism, appetite, and sleep cycles in our patients. It is important to know what we are prescribing as visual processing evaluations expand past 20/20 central eyesight.

Keywords: neuromodulation, retinal processing, retinohypothalamic tract, limbic system, visual processing

Procedia PDF Downloads 68