Search results for: decision processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7467

5487 The Development of an Agent-Based Model to Support a Science-Based Evacuation and Shelter-in-Place Planning Process within the United States

Authors: Kyle Burke Pfeiffer, Carmella Burdi, Karen Marsh

Abstract:

The evacuation and shelter-in-place planning process employed by most jurisdictions within the United States is not informed by a scientifically-derived framework that is inclusive of the behavioral and policy-related indicators of public compliance with evacuation orders. While a significant body of work exists to define these indicators, the research findings have not been well-integrated nor translated into useable planning factors for public safety officials. Additionally, refinement of the planning factors alone is insufficient to support science-based evacuation planning as the behavioral elements of evacuees—even with consideration of policy-related indicators—must be examined in the context of specific regional transportation and shelter networks. To address this problem, the Federal Emergency Management Agency and Argonne National Laboratory developed an agent-based model to support regional analysis of zone-based evacuation in southeastern Georgia. In particular, this model allows public safety officials to analyze the consequences that a range of hazards may have upon a community, assess evacuation and shelter-in-place decisions in the context of specified evacuation and response plans, and predict outcomes based on community compliance with orders and the capacity of the regional (to include extra-jurisdictional) transportation and shelter networks. The intention is to use this model to aid evacuation planning and decision-making. Applications for the model include developing a science-driven risk communication strategy and, ultimately, in the case of evacuation, the shortest possible travel distance and clearance times for evacuees within the regional boundary conditions.

Keywords: agent-based modeling for evacuation, decision-support for evacuation planning, evacuation planning, human behavior in evacuation

Procedia PDF Downloads 235
5486 Challenges of Implementing Participatory Irrigation Management for Food Security in Semi-Arid Areas of Tanzania

Authors: Pilly Joseph Kagosi

Abstract:

The study aims at assessing the challenges observed during the implementation of the participatory irrigation management (PIM) approach for food security in semi-arid areas of Tanzania. Data were collected through a questionnaire, PRA tools, key informant discussions, focus group discussions (FGDs), participant observation, and literature review. Questionnaire data were analysed using SPSS, while PRA data were analysed with the help of local communities during the PRA exercise. Data from the other methods were analysed using content analysis. The study revealed that the PIM approach contributed to improved food security at the household level due to the involvement of communities in water management activities and decision making, which enhanced the availability of water for irrigation and increased crop production. However, several challenges were observed during the implementation of the approach, including: minimal participation of beneficiaries in decision-making during the planning and designing stages, meaning inadequate devolution of power among scheme owners; inadequate or absent transparency on income and expenditure in Water Utilization Associations (WUAs); water conflicts among WUA members; conflicts between farmers and livestock keepers; conflicts between WUA leaders and village governments regarding training opportunities and status; WUA rules and regulations not being legally recognized by the national court; and few farmers being involved in planting trees around water sources. Nevertheless, it was found that some of the mentioned challenges were rectified by the farmers themselves, facilitated by government officials. The study recommends that the identified challenges be rectified for farmers to realize the importance of the PIM approach, as has been realized in other Asian countries.

Keywords: challenges, participatory approach, irrigation management, food security, semi-arid areas

Procedia PDF Downloads 324
5485 Maori Primary Industries Responses to Climate Change and Freshwater Policy Reforms in Aotearoa New Zealand

Authors: Tanira Kingi, Oscar Montes Oca, Reina Tamepo

Abstract:

The introduction of the Climate Change Response (Zero Carbon) Amendment Act (2019) and the National Policy Statement for Freshwater Management (2020) both contain underpinning statements that refer to the principles of the Treaty of Waitangi and cultural concepts of stewardship and environmental protection. Maori interests in New Zealand’s agricultural, forestry, fishing and horticultural sectors are significant. The organizations that manage these investments do so on behalf of extended family groups that hold inherited interests based on genealogical connections (whakapapa) to particular tribal units (iwi and hapu) and areas of land (whenua) and freshwater bodies (wai). This paper draws on the findings of current research programmes funded by the New Zealand Agricultural Greenhouse Gas Research Centre (NZAGRC) and the Our Land & Water National Science Challenge (OLW NSC) to understand the impact of cultural knowledge and imperatives on agricultural GHG and freshwater mitigation and land-use change decisions. In particular, the research outlines mitigation and land-use change scenario decision support frameworks that model changes in emissions profiles (reductions in biogenic methane, nitrous oxide and nutrient emissions to freshwater) of agricultural and forestry production systems along with impacts on key economic indicators and socio-cultural factors. The paper also assesses the effectiveness of newly introduced partnership arrangements between Maori groups/organizations and key government agencies on policy co-design and implementation, and in particular, decisions to adopt mitigation practices and to diversify land use.

Keywords: co-design and implementation of environmental policy, indigenous environmental knowledge, Māori land tenure and agribusiness, mitigation and land use change decision support frameworks

Procedia PDF Downloads 215
5484 An Analytical Systematic Design Approach to Evaluate Ballistic Performance of Armour Grade AA7075 Aluminium Alloy Using Friction Stir Processing

Authors: Lahari Ramya P., Sudhakar I., Madhu V., Madhusudhan Reddy G., Srinivasa Rao E.

Abstract:

Selection of suitable armour materials for defence applications is crucial with respect to increasing the mobility of systems as well as maintaining safety. Therefore, armour design studies require determining the material with the lowest possible areal density that successfully resists the predefined threat. A number of light metals and alloys have come to the forefront, especially as substitutes for armour-grade steels. AA5083 aluminium alloy, which fits the military standards imposed by the US Army, is the foremost nonferrous alloy considered as a possible replacement for steel to increase the mobility of armoured vehicles and enhance fuel economy. The growing need for AA5083 aluminium alloy paves the way to develop supplementary aluminium alloys that maintain the military standards. AA2xxx, AA6xxx and AA7xxx aluminium alloys have been identified as potential materials to supplement AA5083. Among these series, the heat-treatable AA7xxx aluminium alloys possess high strength and can compete with armour-grade steels. Earlier investigations revealed that layering of AA7xxx aluminium alloy can prevent spalling of the rear portion of the armour during ballistic impacts. Hence, the present investigation deals with the fabrication of a hard layer (made of boron carbide) on AA7075 aluminium alloy using friction stir processing, with the intention of blunting the projectile in the initial impact while the tough backing (AA7xxx aluminium alloy) dissipates the residual kinetic energy. An analytical approach has been adopted to unfold the ballistic performance against the projectile. Penetration of the projectile inside the armour has been resolved using a strain energy model analysis. The perforation shearing area, i.e., the interface of projectile and armour, is taken into account for the evaluation of penetration inside the armour. The fabricated surface composites (targets) were tested as per the military standard (JIS.0108.01) in a ballistic testing tunnel at the Defence Metallurgical Research Laboratory (DMRL), Hyderabad, under standardized testing conditions. The analytical results were well validated with the experimentally obtained ones.

Keywords: AA7075 aluminium alloy, friction stir processing, boron carbide, ballistic performance, target

Procedia PDF Downloads 330
5483 Load Forecasting in Microgrid Systems with R and Cortana Intelligence Suite

Authors: F. Lazzeri, I. Reiter

Abstract:

Energy production optimization has been traditionally very important for utilities in order to improve resource consumption. However, load forecasting is a challenging task, as there are a large number of relevant variables that must be considered, and several strategies have been used to deal with this complex problem. This is especially true also in microgrids where many elements have to adjust their performance depending on the future generation and consumption conditions. The goal of this paper is to present a solution for short-term load forecasting in microgrids, based on three machine learning experiments developed in R and web services built and deployed with different components of Cortana Intelligence Suite: Azure Machine Learning, a fully managed cloud service that enables to easily build, deploy, and share predictive analytics solutions; SQL database, a Microsoft database service for app developers; and PowerBI, a suite of business analytics tools to analyze data and share insights. Our results show that Boosted Decision Tree and Fast Forest Quantile regression methods can be very useful to predict hourly short-term consumption in microgrids; moreover, we found that for these types of forecasting models, weather data (temperature, wind, humidity and dew point) can play a crucial role in improving the accuracy of the forecasting solution. Data cleaning and feature engineering methods performed in R and different types of machine learning algorithms (Boosted Decision Tree, Fast Forest Quantile and ARIMA) will be presented, and results and performance metrics discussed.
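To make the modelling step concrete, a minimal sketch of this kind of short-term load model is shown below, using scikit-learn's GradientBoostingRegressor as an open-source stand-in for the Azure ML Boosted Decision Tree and (via the quantile loss) Fast Forest Quantile modules mentioned in the abstract; the file name, column names and hyperparameters are illustrative assumptions, not the authors' configuration.

```python
# Sketch of an hourly load forecast with calendar and weather features.
# The CSV file and column names below are hypothetical placeholders.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_percentage_error

df = pd.read_csv("microgrid_load.csv", parse_dates=["timestamp"])

# Feature engineering: calendar features plus the weather variables
# (temperature, wind, humidity, dew point) highlighted in the abstract.
df["hour"] = df["timestamp"].dt.hour
df["dayofweek"] = df["timestamp"].dt.dayofweek
features = ["hour", "dayofweek", "temperature", "wind", "humidity", "dew_point"]

train, test = df.iloc[:-168], df.iloc[-168:]        # hold out the last week

point_model = GradientBoostingRegressor(n_estimators=300, max_depth=4)
point_model.fit(train[features], train["load_kw"])

q90_model = GradientBoostingRegressor(loss="quantile", alpha=0.9, n_estimators=300)
q90_model.fit(train[features], train["load_kw"])    # 90th-percentile forecast band

pred = point_model.predict(test[features])
print("MAPE:", mean_absolute_percentage_error(test["load_kw"], pred))
print("upper band sample:", q90_model.predict(test[features])[:5])
```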

Keywords: time-series, features engineering methods for forecasting, energy demand forecasting, Azure Machine Learning

Procedia PDF Downloads 298
5482 The Role of Metaphor in Communication

Authors: Fleura Shkëmbi, Valbona Treska

Abstract:

In elementary school, we are taught that a metaphor is a decorative linguistic device reserved for poets. We now know, however, that it is also a crucial tactic that individuals employ to understand the universe, from fundamental ideas like time and causation to the most pressing societal challenges of today. Metaphor is the use of language to refer to something other than what it was originally intended for, or what it "literally" means, in order to suggest a similarity or establish a connection between the two. According to a study on metaphor and its effect on decision-making, people do not identify metaphors as relevant in their decisions; instead, they refer to more "substantive" (typically numerical) facts as the basis for their problem-solving decisions. Every day, metaphors saturate our lives via language, cognition, and action. Proponents of this view argue that our conceptions shape our views and interactions with others and that concepts define our reality. Metaphor is thus a highly helpful tool both for describing our experiences to others and for forming notions for ourselves. In therapeutic contexts, its use appears to serve a twofold goal. The cognitivist approach to metaphor regards it as one of the fundamental foundations of human communication. The benefits and disadvantages of utilizing a metaphor differ depending on the target domain that the metaphor portrays. The challenge of creating messages and surroundings that affect customers' notions of abstract ideas in a variety of industries, including health, hospitality, romance, and money, has been studied for decades in marketing and consumer psychology. The aim of this study is to examine, through a systematic literature review, the role of metaphor in communication and in advertising. This study offers a selective analysis of this literature, concentrating on research on customer attitudes and product appraisal. The analysis of the data identifies potential research questions. With theoretical and applied implications for marketing, design, and persuasion, this study sheds light on how, when, and for whom metaphoric communications are powerful.

Keywords: metaphor, communication, advertising, cognition, action

Procedia PDF Downloads 99
5481 Balanced Scorecard (BSC) Project: A Methodological Proposal for Decision Support in a Corporate Scenario

Authors: David de Oliveira Costa, Miguel Ângelo Lellis Moreira, Carlos Francisco Simões Gomes, Daniel Augusto de Moura Pereira, Marcos dos Santos

Abstract:

Strategic management is a fundamental process for global companies that intend to remain competitive in an increasingly dynamic and complex market. To do so, it is necessary to maintain alignment with their principles and values. The Balanced Scorecard (BSC) proposes to ensure that the overall business performance is based on different perspectives (financial, customer, internal processes, and learning and growth). However, relying solely on the BSC may not be enough to ensure the success of strategic management. It is essential that companies also evaluate and prioritize strategic projects that need to be implemented to ensure they are aligned with the business vision and contribute to achieving established goals and objectives. In this context, the proposition involves the incorporation of the SAPEVO-M multicriteria method to indicate the degree of relevance between different perspectives. Thus, the strategic objectives linked to these perspectives have greater weight in the classification of structural projects. Additionally, it is proposed to apply the concept of the Impact & Probability Matrix (I&PM) to structure and ensure that strategic projects are evaluated according to their relevance and impact on the business. By structuring the business's strategic management in this way, alignment and prioritization of projects and actions related to strategic planning are ensured. This ensures that resources are directed towards the most relevant and impactful initiatives. Therefore, the objective of this article is to present the proposal for integrating the BSC methodology, the SAPEVO-M multicriteria method, and the prioritization matrix to establish a concrete weighting of strategic planning and obtain coherence in defining strategic projects aligned with the business vision. This ensures a robust decision-making support process.

Keywords: MCDA process, prioritization problematic, corporate strategy, multicriteria method

Procedia PDF Downloads 81
5480 Multiple Version of Roman Domination in Graphs

Authors: J. C. Valenzuela-Tripodoro, P. Álvarez-Ruíz, M. A. Mateos-Camacho, M. Cera

Abstract:

The concept of Roman domination in graphs was introduced in 2004. It was initially inspired by, and related to, the defensive strategy of the Roman Empire. An undefended place is a city on which no legions are established, whereas a strong place is a city in which two legions are deployed. This situation may be modeled by labeling the vertices of a finite simple graph with labels {0, 1, 2}, satisfying the condition that any 0-vertex must be adjacent to at least one 2-vertex. Roman domination in graphs is a variant of classic domination. Clearly, the main aim is to obtain such a labeling of the vertices of the graph with minimum cost, that is to say, with minimum weight (the sum of all vertex labels). Formally, a function f: V(G) → {0, 1, 2} is a Roman dominating function (RDF) in the graph G = (V, E) if f(u) = 0 implies that f(v) = 2 for at least one vertex v adjacent to u. The weight of an RDF is the positive integer w(f) = Σ_{v∈V} f(v). The Roman domination number, γ_R(G), is the minimum weight among all Roman dominating functions. Obviously, the set of vertices with a positive label under an RDF f is a dominating set in the graph, and hence γ(G) ≤ γ_R(G). In this work, we begin the study of a generalization of RDFs in which we consider that any undefended place should be defended from a sudden attack by at least k legions. These legions can be deployed in the city or in any of its neighbours. A function f: V → {0, 1, . . . , k + 1} such that f(N[u]) ≥ k + |AN(u)| for every vertex u with f(u) < k, where AN(u) represents the set of active neighbours (i.e., those with a positive label) of vertex u, is called a [k]-multiple Roman dominating function and is denoted by [k]-MRDF. The minimum weight of a [k]-MRDF in the graph G is the [k]-multiple Roman domination number ([k]-MRDN) of G, denoted by γ_[kR](G). First, we prove that the [k]-multiple Roman domination decision problem is NP-complete even when restricted to bipartite and chordal graphs, a question that had been resolved for other variants and that we wished to generalize. Given the difficulty of calculating the exact value of the [k]-MRD number, even for particular families of graphs, we present several upper and lower bounds for the [k]-MRD number that permit us to estimate it with as much precision as possible. Finally, some graphs for which the exact value of this parameter is known are characterized.
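For readability, the defining conditions used above can be restated compactly as follows (this is only a reformulation of the definitions in the abstract, not an additional result):

```latex
% Weight of a Roman dominating function and the Roman domination number:
w(f) \;=\; \sum_{v \in V} f(v), \qquad
\gamma_R(G) \;=\; \min \{\, w(f) : f \ \text{is an RDF of } G \,\};
% the [k]-multiple Roman dominating function ([k]-MRDF) condition:
f \colon V \to \{0, 1, \dots, k+1\}, \qquad
f(N[u]) \;\ge\; k + |AN(u)| \quad \text{for every } u \ \text{with } f(u) < k .
```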

Keywords: multiple Roman domination function, NP-complete decision problem, bounds, exact values

Procedia PDF Downloads 108
5479 Sustainable Radiation Curable Palm Oil-Based Products for Advanced Materials Applications

Authors: R. Tajau, R. Rohani, M. S. Alias, N. H. Mudri, K. A. Abdul Halim, M. H. Harun, N. Mat Isa, R. Che Ismail, S. Muhammad Faisal, M. Talib, M. R. Mohamed Zin

Abstract:

Bio-based polymeric materials are increasingly used for a variety of applications, including surface coatings, drug delivery systems, and tissue engineering. These polymeric materials are ideal for the aforementioned applications because they are derived from natural resources, non-toxic, low-cost, biocompatible, and biodegradable, and have promising thermal and mechanical properties. The nature of their hydrocarbon chains, carbon double bonds, and ester bonds allows various sources of edible oil, such as soy, sunflower, olive, and oil palm, to have their particular structures fine-tuned in the development of innovative materials. Palm oil can be the most eminent raw material used for manufacturing new and advanced natural polymeric materials involving radiation techniques, such as coating resins, nanoparticles, scaffolds, nanotubes, nanocomposites, and lithography, for different branches of industry in countries where oil palm is abundant. The radiation technique is among the most versatile, cost-effective, simple, and effective methods. Crosslinking, reversible addition-fragmentation chain transfer (RAFT) polymerisation, grafting, and degradation are among the radiation mechanisms. These mechanisms rely on exposure to gamma, electron beam (EB), UV, or laser irradiation, which are commonly used in the development of polymeric materials. Therefore, this review focuses on current radiation processing technologies for the development of various radiation-curable bio-based polymeric materials with a promising future in biomedical and industrial applications. The key focus of this review is on radiation-curable palm oil-based products, which have been reported frequently in recent studies.

Keywords: palm oil, radiation processing, surface coatings, VOC

Procedia PDF Downloads 183
5478 Optimization of Extraction Conditions and Characteristics of Scale Collagen from Sardine: Sardina pilchardus

Authors: F. Bellali, M. Kharroubi, M. Loutfi, N. Bourhim

Abstract:

In Morocco, the fish processing industry is an important source of income and generates a large amount of byproducts, including skins, bones, heads, guts, and scales. These underutilized resources, particularly scales, contain a large amount of protein and calcium. Scales from Sardina pilchardus resulting from the transformation operation have the potential to be used as raw material for collagen production. Taking into account this strong expectation of the regional fish industry, the upgrading of sardine scales is well justified. In addition, political and societal demands for sustainability and environmentally friendly industrial production systems, coupled with the depletion of fish resources, drive this trend forward. Fish scales used as a potential source from which to isolate collagen therefore have a wide range of applications in the food, cosmetic, and biomedical industries. The main aim of this study is to isolate and characterize acid-solubilized collagen from sardine fish scales (Sardina pilchardus). Experimental design methodology was adopted in the collagen processing for extraction optimization. The first stage of this work investigates the optimization conditions of sardine scale deproteinization using response surface methodology (RSM). The second part focuses on demineralization with an HCl solution or EDTA. The last part establishes the optimum conditions for the isolation of collagen from fish scales by solvent extraction. The basic principle of RSM is to determine model equations that describe the interrelations between the independent variables and the dependent variables.
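For reference, the model equations referred to in the last sentence are, in standard RSM practice, second-order polynomials of the general form below; this is the generic textbook form of the RSM model, not the specific equation fitted in this study.

```latex
Y \;=\; \beta_0 \;+\; \sum_{i=1}^{n} \beta_i x_i \;+\; \sum_{i=1}^{n} \beta_{ii} x_i^{2} \;+\; \sum_{i<j} \beta_{ij} x_i x_j \;+\; \varepsilon
```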

Keywords: Sardina pilchardus, scales, valorization, collagen extraction, response surface methodology

Procedia PDF Downloads 417
5477 Microbial Dynamics and Sensory Traits of Spanish- and Greek-Style Table Olives (Olea europaea L. cv. Ascolana tenera) Fermented with Sea Fennel (Crithmum maritimum L.)

Authors: Antonietta Maoloni, Federica Cardinali, Vesna Milanović, Andrea Osimani, Ilario Ferrocino, Maria Rita Corvaglia, Luca Cocolin, Lucia Aquilanti

Abstract:

Table olives (Olea europaea L.) are among the most important fermented vegetables all over the world, while sea fennel (Crithmum maritimum L.) is an emerging food crop with interesting nutritional and sensory traits. Both are characterized by the presence of several bioactive compounds with potential beneficial health effects, thus representing two valuable substrates for the manufacture of innovative vegetable-based preserves. Given these premises, the present study was aimed at exploring the co-fermentation of table olives and sea fennel to produce new high-value preserves. Spanish-style and Greek-style processing methods and the use of a multiple-strain starter were explored. The preserves were evaluated for their microbial dynamics and key sensory traits. During the fermentation, a progressive pH reduction was observed. Mesophilic lactobacilli, mesophilic lactococci, and yeasts were the main microbial groups at the end of the fermentation, whereas Enterobacteriaceae decreased during fermentation. An evolution of the microbiota was revealed by metataxonomic analysis, with Lactiplantibacillus plantarum dominating in the late stage of fermentation, irrespective of the processing method and the use of the starter. Greek-style preserves were crunchier and less fibrous than Spanish-style ones and were preferred by the trained panelists.

Keywords: lactic acid bacteria, Lactiplantibacillus plantarum, metataxonomy, panel test, rock samphire

Procedia PDF Downloads 129
5476 Coupled Effect of Pulsed Current and Stress State on Fracture Behavior of Ultrathin Superalloy Sheet

Authors: Shuangxin Wu

Abstract:

Superalloy ultra-thin-walled components occupy a considerable proportion of aero engines and play an increasingly important role in structural weight reduction and performance improvement. To address problems such as high deformation resistance and poor formability at room temperature, pulse current can be introduced during processing to improve the plasticity of metal materials. However, the mechanism by which pulse current influences the forming limit of superalloy ultra-thin sheet is not clear, and clarifying it is of great significance for determining the material processing window and improving the micro-forming process. The effect of pulse current on the microstructure evolution of superalloy thin plates was observed by optical microscopy (OM) and X-ray diffraction topography (XRT), applying pulse current to 0.2 mm thick GH3039 under plane strain and uniaxial tensile states. Compared with specimens tested without pulse current at the same temperature, the internal void volume fraction is significantly reduced, reflecting the non-thermal effect of pulse current on the growth of micro-pores. ED (electrically deformed) specimens have larger and deeper dimples, but the elongation is not significantly improved because the pulse current promotes the void coalescence process, resulting in material fracture. The electro-plastic phenomenon is more obvious in the plane strain state, which is closely related to the effect of stress triaxiality on void evolution under pulsed current.

Keywords: pulse current, superalloy, ductile fracture, void damage

Procedia PDF Downloads 72
5475 Image Recognition Performance Benchmarking for Edge Computing Using Small Visual Processing Unit

Authors: Kasidis Chomrat, Nopasit Chakpitak, Anukul Tamprasirt, Annop Thananchana

Abstract:

Internet of Things (IoT) devices and edge computing have become some of the most significant and widely discussed innovations, with the potential to improve and disrupt traditional business and industry alike. New, unforeseen challenges such as the COVID-19 pandemic have posed a danger to the workforce and to established business processes. Together with the drastically changed business landscape left in the aftermath of the global pandemic, the looming threats of a global energy crisis, global warming, and increasingly heated global politics that risk escalating into a new Cold War, these conditions make emerging technologies such as edge computing and specially designed visual processing units great opportunities for business. The literature is reviewed on how the Internet of Things and this disruptive wave will affect business, explaining how these events impact current operations and how businesses will need to adapt to changes in the market and in the world. An example benchmarking test of newer consumer-marketed devices, namely Internet of Things devices equipped with edge computing hardware, shows how they can increase efficiency and reduce the risks posed by current and looming crises. Throughout the paper, we explain the technologies that lead to the present state of the art and why they will be innovations that change traditional practice, through brief introductions to cloud computing, edge computing, and the Internet of Things, and how they will lead into the future.

Keywords: internet of things, edge computing, machine learning, pattern recognition, image classification

Procedia PDF Downloads 156
5474 Urban Growth and Its Impact on Natural Environment: A Geospatial Analysis of North Part of the UAE

Authors: Mohamed Bualhamam

Abstract:

Due to the complex nature of the tourism resources of the northern part of the United Arab Emirates (UAE), the potential of Geographical Information Systems (GIS) and Remote Sensing (RS) was used to help resolve the issues they face. The study was an attempt to use existing GIS data layers to identify sensitive natural environment and archaeological heritage resources that may be threatened by increased urban growth, and to give specific recommendations to protect the area. By identifying sensitive natural environment and archaeological heritage resources, public agencies and citizens are in a better position to successfully protect important natural lands and direct growth away from environmentally sensitive areas. The study area is one of the fastest growing regions in the country. The increase in population across the region, as well as the rapid growth of towns, has increased the threat to natural resources and archaeological sites. Satellite remote sensing data have proven useful in assessing natural resources and in monitoring changes. The results of the GIS analyses show that the northern part of the UAE has a variety of tourism resources that can be used for future tourism development. Rapid urban development, in the form of small towns and different economic activities, is apparent in several places in the study area. Urban development has extended beyond the old towns and has negatively affected sensitive tourism resources in some areas. The tourism resources of the northern part of the UAE are highly complex and thus require tools that aid effective decision making to come to terms with the competing economic, social, and environmental demands of sustainable development. The UAE government should prepare tourism databases and a GIS system so that planners can access archaeological heritage information as part of development planning processes. Applications of GIS and RS in urban planning and in tourism and recreation planning illustrate that GIS is a strong and effective tool that can aid tourism planning and decision-making. The power of GIS lies not only in the ability to visualize spatial relationships but also, beyond space, in a holistic view of the world with its many interconnected components and complex relationships. The worst of the damage could have been avoided by recognizing suitable limits, and adhering to some simple environmental guidelines and standards will allow tourism to be developed successfully in a sustainable manner.

Keywords: GIS, natural environment, UAE, urban growth

Procedia PDF Downloads 262
5473 Effect of Fermentation Time on Some Functional Properties of Moringa (Moringa oleifera) Seed Flour

Authors: Ocheme B. Ocheme, Omobolanle O. Oloyede, S. James, Eleojo V. Akpa

Abstract:

The effect of fermentation time on some functional properties of Moringa (Moringa oleifera) seed flour was examined. Fermentation, an effective processing method used to improve the nutritional quality of plant foods, tends to affect the characteristics of food components and their behaviour in food systems, just like other processing methods; hence the need for this study. Moringa seeds were fermented naturally by soaking them in potable water and allowing them to stand for 12, 24, 48 and 72 hours. At the end of fermentation, the seeds were oven-dried at 60 °C for 12 hours and then milled into flour. Flour obtained from unfermented seeds served as the control, giving a total of five flour samples. The functional properties were analyzed using standard methods. Fermentation significantly (p<0.05) increased the water holding capacity of Moringa seed flour from 0.86 g/g to 2.31 g/g; the highest value was observed after 48 hours of fermentation. The same trend was observed for oil absorption capacity, with values between 0.87 and 1.91 g/g. Flour from unfermented Moringa seeds had a bulk density of 0.60 g/cm3, which was significantly (p<0.05) higher than the bulk densities of flours from seeds fermented for 12, 24 and 48 hours. Fermentation significantly (p<0.05) decreased the dispersibility of Moringa seed flours from 36% to 21, 24, 29 and 20% after 12, 24, 48 and 72 hours of fermentation, respectively. The flours' emulsifying capacities increased significantly (p<0.05) with increasing fermentation time, with values between 50 and 68%. The flour obtained from seeds fermented for 12 hours had a significantly (p<0.05) higher foaming capacity of 16%, while the flours obtained from seeds fermented for 0, 24 and 72 hours had the lowest foaming capacities of 9%. Flours from seeds fermented for 12 and 48 hours had better functional properties than flours from seeds fermented for 24 and 72 hours.

Keywords: fermentation, flour, functional properties, Moringa

Procedia PDF Downloads 688
5472 Agent-Based Modelling to Improve Dairy-origin Beef Production: Model Description and Evaluation

Authors: Addisu H. Addis, Hugh T. Blair, Paul R. Kenyon, Stephen T. Morris, Nicola M. Schreurs, Dorian J. Garrick

Abstract:

Agent-based modeling (ABM) enables an in silico representation of complex systems and captures agent behavior resulting from interaction with other agents and their environment. This study developed an ABM to represent pasture-based beef cattle finishing systems in New Zealand (NZ) using attributes of the rearer, finisher, and processor, as well as specific attributes of dairy-origin beef cattle. The model was parameterized using values representing 1% of NZ dairy-origin cattle, and 10% of rearers and finishers in NZ. The cattle agent consisted of 32% Holstein-Friesian, 50% Holstein-Friesian–Jersey crossbred, and 8% Jersey, with the remainder being other breeds. Rearers and finishers repetitively and simultaneously interacted to determine the type and number of cattle populating the finishing system. Rearers brought in four-day-old spring-born calves and reared them until 60 calves (representing a full truck load) on average had a live weight of 100 kg before selling them on to finishers. Finishers mainly attained weaners from rearers, or directly from dairy farmers when weaner demand was higher than the supply from rearers. Fast-growing cattle were sent for slaughter before the second winter, and the remainder were sent before their third winter. The model finished a higher number of bulls than heifers and steers, although it was 4% lower than the industry-reported value. Holstein-Friesian and Holstein-Friesian–Jersey-crossbred cattle dominated the dairy-origin beef finishing system. Jersey cattle account for less than 5% of total processed beef cattle. Further studies to include retailer and consumer perspectives and other decision alternatives for finishing farms would improve the applicability of the model for decision-making processes.
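As a purely illustrative sketch of the rearer and finisher interaction rules summarized above (four-day-old calves reared to a 100 kg sale weight and transferred in truck loads of 60), the following toy Python agents reproduce the flow of animals; the growth rate, starting weights and herd size are invented placeholders, not parameters of the published model.

```python
# Toy rearer/finisher interaction under the stated transfer rules.
import random

TRUCK_LOAD, SALE_WEIGHT = 60, 100   # head per truck load, kg live weight at sale

class Rearer:
    def __init__(self, n_calves):
        # entry weights of four-day-old calves (kg); placeholder distribution
        self.calves = [random.gauss(38, 3) for _ in range(n_calves)]

    def grow(self, days, adg=0.7):   # adg = assumed average daily gain, kg/day
        self.calves = [w + adg * days for w in self.calves]

    def sell_ready(self):
        ready = [w for w in self.calves if w >= SALE_WEIGHT]
        sold = (len(ready) // TRUCK_LOAD) * TRUCK_LOAD      # only full loads move
        self.calves = [w for w in self.calves if w < SALE_WEIGHT] + ready[sold:]
        return sold                                          # head sent to finishers

class Finisher:
    def __init__(self):
        self.cattle = 0
    def buy(self, head):
        self.cattle += head

rearer, finisher = Rearer(200), Finisher()
for week in range(20):
    rearer.grow(7)
    finisher.buy(rearer.sell_ready())
print("head on finishing farm after 20 weeks:", finisher.cattle)
```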

Keywords: agent-based modelling, dairy cattle, beef finishing, rearers, finishers

Procedia PDF Downloads 99
5471 From Pink to Ink: Understanding the Decision-Making Process of Post-mastectomy Women Who Have Covered Their Scars with Decorative Tattoos

Authors: Fernanda Rodriguez

Abstract:

Breast cancer is pervasive among women, and an increasing number of women are opting for a mastectomy: a medical operation in which one or both breasts are removed with the intention of treating or averting breast cancer. However, there is an emerging population of cancer survivors in European nations who, rather than attempting to reconstruct their breasts to resemble 'normal' breasts as much as possible, have turned to dressing their scars with decorative tattoos. At a practical level, this study hopes to improve the support systems of these women by providing professionals in the medical field, tattoo artists, and family members of cancer survivors with a deeper understanding of their motivations and decision-making processes for choosing an alternative restorative route, such as decorative tattoos, after their mastectomy. At an intellectual level, this study aims to narrow a gap in the academic field concerning the relationship between mastectomies and alternative methods of healing, such as decorative tattoos, as well as to broaden the understanding of meaning-making and the 'normal' feminine body. Thus, by means of semi-structured interviews and a phenomenological standpoint, this research set itself the goal of understanding why women who have undergone a mastectomy choose to dress their scars with decorative tattoos instead of attempting to regain 'normalcy' through breast reconstruction or 3D areola tattoos. The results obtained from the interviews with fifteen women showed that disillusionment with one or another of the breast restoration techniques had led these women to find an alternative form of healing that allows them not only to close a painful chapter of their lives but also to regain control over their bodies after a period in which agency was taken away from them. Decorative post-mastectomy tattoos allow these women to grant their bodies new meanings and produce their own interpretation of their feminine body and identity.

Keywords: alternative femininity, decorative mastectomy tattoos, gender embodiment, social stigmatization

Procedia PDF Downloads 120
5470 Distributed Cost-Based Scheduling in Cloud Computing Environment

Authors: Rupali, Anil Kumar Jaiswal

Abstract:

Cloud computing can be defined as one of the prominent technologies that let a user change, configure and access services online. It can be described as a model of computing that helps save a user's cost and time; practically, the use of cloud computing can be found in various fields like education, health, banking, etc. Cloud computing is an internet-dependent technology, and thus it is the major responsibility of Cloud Service Providers (CSPs) to take care of the data stored by users at data centers. Scheduling in the cloud computing environment plays a vital role, as cloud providers need to schedule resources effectively to achieve maximum utilization and user satisfaction. Job scheduling for cloud computing is analyzed in the following work. CloudSim 3.0.3 is utilized to simulate the task computation and the distributed scheduling methods. This research work discusses job scheduling for a distributed processing environment, and by exploring this issue, we find that it works with minimum time and lower cost. In this work, two load balancing techniques have been employed, 'Throttled stack adjustment policy' and 'Active VM load balancing policy', with two brokerage services, 'Advanced Response Time' and 'Reconfigure Dynamically', to evaluate the VM_Cost, DC_Cost, Response Time, and Data Processing Time. The proposed techniques are compared with the Round Robin scheduling policy.
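Purely to illustrate the kind of bookkeeping being compared (the study itself runs in CloudSim 3.0.3), a small Python sketch of round-robin task-to-VM assignment with a simple cost and response-time estimate is shown below; the task lengths, MIPS ratings and prices are invented values and this is not CloudSim code.

```python
# Illustrative round-robin assignment of tasks (cloudlets) to VMs.
from itertools import cycle

tasks = [4000, 1200, 2500, 800, 3000, 1500]        # task lengths in MI (assumed)
vms = [{"mips": 1000, "cost_per_s": 0.002, "busy": 0.0} for _ in range(3)]

for length, vm in zip(tasks, cycle(vms)):          # round-robin policy
    runtime = length / vm["mips"]
    vm["busy"] += runtime                          # tasks on a VM run back to back

vm_cost = sum(vm["busy"] * vm["cost_per_s"] for vm in vms)
response_time = max(vm["busy"] for vm in vms)      # makespan as a crude proxy
print(f"VM cost: {vm_cost:.4f} $, response time: {response_time:.2f} s")
```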

Keywords: physical machines, virtual machines, support for repetition, self-healing, highly scalable programming model

Procedia PDF Downloads 168
5469 Local Binary Patterns-Based Statistical Data Analysis for Accurate Soccer Match Prediction

Authors: Mohammad Ghahramani, Fahimeh Saei Manesh

Abstract:

Winning a soccer game is based on thorough and deep analysis of the ongoing match. On the other hand, giant gambling companies are in vital need of such analysis to reduce their losses against their customers. In this research work, we perform deep, real-time analysis of every soccer match around the world; what distinguishes our work from others is the focus on particular seasons, teams and partial analytics. Our contributions are presented in the platform called "Analyst Masters." First, we introduce various sources of information available for soccer analysis for teams around the world that helped us record live statistical data and information from more than 50,000 soccer matches a year. Our second and main contribution is to introduce our proposed in-play performance evaluation. The third contribution is developing new features from stable soccer matches. The statistics of soccer matches and their odds, both pre-match and in-play, are represented in image format versus time, including the halftime. Local Binary Patterns (LBP) are then employed to extract features from the image. Our analyses reveal incredibly interesting features and rules once a soccer match has reached enough stability. For example, our "8-minute rule" implies that if 'Team A' scores a goal and can maintain the result for at least 8 minutes, then the match will end in their favor in a stable match. We could also make accurate pre-match predictions of whether fewer or more than 2.5 goals would be scored. We benefit from Gradient Boosting Trees (GBT) to extract highly related features. Once the features are selected from this pool of data, decision trees decide whether the match is stable. A stable match is then passed to a post-processing stage to check its properties, such as bettors' and punters' behavior and its statistical data, to issue the prediction. The proposed method was trained using 140,000 soccer matches and tested on more than 100,000 samples, achieving 98% accuracy in selecting stable matches. Our database of 240,000 matches shows that one can obtain over 20% betting profit per month using Analyst Masters. Such consistent profit outperforms human experts and shows the inefficiency of the betting market. Top soccer tipsters achieve 50% accuracy and 8% monthly profit on average, and only on regional matches. Both our collected database of more than 240,000 soccer matches since 2012 and our algorithm would greatly benefit coaches and punters seeking accurate analysis.
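A minimal sketch of the LBP feature step described above is given below, using scikit-image's local_binary_pattern and a gradient-boosted classifier; the synthetic "match image" and the dummy labels stand in for the authors' odds/statistics-versus-time images and stability labels, which are not publicly specified.

```python
# LBP texture features from an image-like match representation, then a
# boosted-tree stability classifier on dummy data.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.ensemble import GradientBoostingClassifier

def lbp_histogram(match_image, p=8, r=1):
    """Uniform LBP codes summarised as a normalised histogram."""
    codes = local_binary_pattern(match_image, P=p, R=r, method="uniform")
    hist, _ = np.histogram(codes, bins=p + 2, range=(0, p + 2), density=True)
    return hist

rng = np.random.default_rng(0)
# 90 minutes x 32 statistics per match, quantised to 8-bit "pixels" (placeholder)
images = (rng.random((200, 90, 32)) * 255).astype(np.uint8)
X = np.array([lbp_histogram(img) for img in images])
y = rng.integers(0, 2, size=200)                   # stable / not stable (dummy labels)

clf = GradientBoostingClassifier().fit(X[:150], y[:150])
print("held-out accuracy on dummy data:", clf.score(X[150:], y[150:]))
```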

Keywords: soccer, analytics, machine learning, database

Procedia PDF Downloads 238
5468 Application of Environmental Justice Concept in Urban Planning: The Peri-Urban Environment of Tehran as the Case Study

Authors: Zahra Khodaee

Abstract:

The Environmental Justice (EJ) concept consists of multifaceted movements, community struggles, and discourses in contemporary societies that seek to reduce environmental risks, increase environmental protections, and generally reduce the environmental inequalities suffered by minority and poor communities. The term incorporates 'environmental racism' and 'environmental classism' and captures the idea that different racial and socioeconomic groups experience differential access to environmental quality. This article explores environmental justice as an urban phenomenon in urban planning and applies it to the peri-urban environment of a metropolis. The peri-urban environments of Tehran, which result from the meeting of the city, village and nature systems, the «city-village junction», have gradually faced effects such as accelerated environmental decline, changes made without a land-use plan, and severe service deficiencies. These problems are instances of environmental injustice, which oblige planners to address them and to apply appropriate strategies and policies by looking for solutions and resorting to theories, techniques and methods related to environmental justice. In order to reach this goal, environmental justice is first defined through the lens of justice, and environmental justice indices are determined to analyze environmental injustice in the case study. Then, some criteria are introduced to select the case study at both macro and micro levels. Qiyamdasht town, as a peri-urban environment of the Tehran metropolis, is chosen and examined to show the existence of environmental injustice through questionnaire analysis and SPSS software. Finally, the AIDA technique is used to design a strategic plan and reduce environmental injustice in the case study by introducing the better scenario to be used in policy and decision-making areas.

Keywords: environmental justice, metropolis of Tehran, Qiyam-Dasht peri-urban settlement, analysis of interconnected decision areas (AIDA)

Procedia PDF Downloads 491
5467 Application of a Model-Free Artificial Neural Networks Approach for Structural Health Monitoring of the Old Lidingö Bridge

Authors: Ana Neves, John Leander, Ignacio Gonzalez, Raid Karoumi

Abstract:

Systematic monitoring and inspection are needed to assess the present state of a structure and predict its future condition. If an irregularity is noticed, repair actions may take place and the adequate intervention will most probably reduce the future costs with maintenance, minimize downtime and increase safety by avoiding the failure of the structure as a whole or of one of its structural parts. For this to be possible decisions must be made at the right time, which implies using systems that can detect abnormalities in their early stage. In this sense, Structural Health Monitoring (SHM) is seen as an effective tool for improving the safety and reliability of infrastructures. This paper explores the decision-making problem in SHM regarding the maintenance of civil engineering structures. The aim is to assess the present condition of a bridge based exclusively on measurements using the suggested method in this paper, such that action is taken coherently with the information made available by the monitoring system. Artificial Neural Networks are trained and their ability to predict structural behavior is evaluated in the light of a case study where acceleration measurements are acquired from a bridge located in Stockholm, Sweden. This relatively old bridge is presently still in operation despite experiencing obvious problems already reported in previous inspections. The prediction errors provide a measure of the accuracy of the algorithm and are subjected to further investigation, which comprises concepts like clustering analysis and statistical hypothesis testing. These enable to interpret the obtained prediction errors, draw conclusions about the state of the structure and thus support decision making regarding its maintenance.
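A minimal sketch of this model-free idea is shown below, under the assumption that a network is trained to predict the next acceleration sample from a short window of past samples and that its prediction error is used as the damage-sensitive feature; the signals here are synthetic sinusoids rather than the Old Lidingö Bridge measurements, and the network size is arbitrary.

```python
# Prediction-error damage indicator from a one-step-ahead neural network predictor.
import numpy as np
from sklearn.neural_network import MLPRegressor

def windows(signal, lag=20):
    X = np.array([signal[i:i + lag] for i in range(len(signal) - lag)])
    return X, signal[lag:]

t = np.arange(0, 60, 0.01)
healthy = np.sin(2 * np.pi * 2.0 * t) + 0.05 * np.random.randn(t.size)  # baseline state
changed = np.sin(2 * np.pi * 1.8 * t) + 0.05 * np.random.randn(t.size)  # shifted behaviour

X_train, y_train = windows(healthy)
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=500).fit(X_train, y_train)

for label, signal in [("healthy", healthy), ("changed", changed)]:
    X, y = windows(signal)
    err = np.mean((model.predict(X) - y) ** 2)
    print(label, "prediction MSE:", round(err, 5))
# A persistent increase in prediction error would then be examined with clustering
# and statistical hypothesis testing, as the abstract describes, before flagging damage.
```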

Keywords: artificial neural networks, clustering analysis, model-free damage detection, statistical hypothesis testing, structural health monitoring

Procedia PDF Downloads 209
5466 Pulmonary Disease Identification Using Machine Learning and Deep Learning Techniques

Authors: Chandu Rathnayake, Isuri Anuradha

Abstract:

Early detection and accurate diagnosis of lung diseases play a crucial role in improving patient prognosis. However, conventional diagnostic methods heavily rely on subjective symptom assessments and medical imaging, often causing delays in diagnosis and treatment. To overcome this challenge, we propose a novel lung disease prediction system that integrates patient symptoms and X-ray images to provide a comprehensive and reliable diagnosis. In this project, we develop a mobile application specifically designed for detecting lung diseases. Our application leverages both patient symptoms and X-ray images to facilitate diagnosis. By combining these two sources of information, our application delivers a more accurate and comprehensive assessment of the patient's condition, minimizing the risk of misdiagnosis. Our primary aim is to create a user-friendly and accessible tool, which is particularly important given the current circumstances, where many patients face limitations in visiting healthcare facilities. To achieve this, we employ several state-of-the-art algorithms. Firstly, the Decision Tree algorithm is utilized for efficient symptom-based classification. It analyzes patient symptoms and creates a tree-like model to predict the presence of specific lung diseases. Secondly, we employ the Random Forest algorithm, which enhances predictive power by aggregating multiple decision trees. This ensemble technique improves the accuracy and robustness of the diagnosis. Furthermore, we incorporate a deep learning model using a Convolutional Neural Network (CNN) with the pre-trained ResNet50 model. CNNs are well-suited for image analysis and feature extraction. By training the CNN on a large dataset of X-ray images, it learns to identify patterns and features indicative of lung diseases. The ResNet50 architecture, known for its excellent performance in image recognition tasks, enhances the efficiency and accuracy of our deep learning model. By combining the outputs of the decision tree-based algorithms and the deep learning model, our mobile application generates a comprehensive lung disease prediction. The application provides users with an intuitive interface to input their symptoms and upload X-ray images for analysis. The prediction generated by the system offers valuable insights into the likelihood of various lung diseases, enabling individuals to take appropriate actions and seek timely medical attention. Our proposed mobile application has significant potential to address the rising prevalence of lung diseases, particularly among young individuals with smoking addictions. By providing a quick and user-friendly approach to assessing lung health, our application empowers individuals to monitor their well-being conveniently. This solution also offers immense value in the context of limited access to healthcare facilities, enabling timely detection and intervention. In conclusion, our research presents a comprehensive lung disease prediction system that combines patient symptoms and X-ray images using advanced algorithms. By developing a mobile application, we provide an accessible tool for individuals to assess their lung health conveniently. This solution has the potential to make a significant impact on the early detection and management of lung diseases, benefiting both patients and healthcare providers.
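A compact sketch of the two-branch design described above is given below: a random forest over symptom vectors combined with an image branch by simple late fusion. The image branch is stubbed out here; in the paper it is a ResNet50-based CNN trained on chest X-rays, and the symptom data and fusion weight below are placeholders, not the authors' values.

```python
# Late fusion of a symptom-based classifier and an (assumed) image-based CNN branch.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
symptoms = rng.integers(0, 2, size=(500, 12))     # 12 binary symptom indicators (assumed)
labels = rng.integers(0, 2, size=500)             # 1 = disease present (dummy labels)

symptom_model = RandomForestClassifier(n_estimators=200).fit(symptoms, labels)

def cnn_probability(xray_image):
    """Placeholder for the ResNet50 image branch; returns P(disease | X-ray)."""
    return 0.7                                    # stub value purely for illustration

def combined_prediction(symptom_vector, xray_image, w_img=0.5):
    p_sym = symptom_model.predict_proba([symptom_vector])[0, 1]
    p_img = cnn_probability(xray_image)
    return (1 - w_img) * p_sym + w_img * p_img    # simple late fusion of the two branches

print("fused probability:", combined_prediction(symptoms[0], xray_image=None))
```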

Keywords: CNN, random forest, decision tree, machine learning, deep learning

Procedia PDF Downloads 73
5465 A Hebbian Neural Network Model of the Stroop Effect

Authors: Vadim Kulikov

Abstract:

The classical Stroop effect is the phenomenon that it takes more time to name the ink color of a printed word if the word denotes a conflicting color than if it denotes the same color. Over the last 80 years, there have been many variations of the experiment revealing various mechanisms behind semantic, attentional, behavioral and perceptual processing. The Stroop task is known to exhibit asymmetry. Reading the words out loud is hardly dependent on the ink color, but naming the ink color is significantly influenced by the incongruent words. This asymmetry is reversed, if instead of naming the color, one has to point at a corresponding color patch. Another debated aspects are the notions of automaticity and how much of the effect is due to semantic and how much due to response stage interference. Is automaticity a continuous or an all-or-none phenomenon? There are many models and theories in the literature tackling these questions which will be discussed in the presentation. None of them, however, seems to capture all the findings at once. A computational model is proposed which is based on the philosophical idea developed by the author that the mind operates as a collection of different information processing modalities such as different sensory and descriptive modalities, which produce emergent phenomena through mutual interaction and coherence. This is the framework theory where ‘framework’ attempts to generalize the concepts of modality, perspective and ‘point of view’. The architecture of this computational model consists of blocks of neurons, each block corresponding to one framework. In the simplest case there are four: visual color processing, text reading, speech production and attention selection modalities. In experiments where button pressing or pointing is required, a corresponding block is added. In the beginning, the weights of the neural connections are mostly set to zero. The network is trained using Hebbian learning to establish connections (corresponding to ‘coherence’ in framework theory) between these different modalities. The amount of data fed into the network is supposed to mimic the amount of practice a human encounters, in particular it is assumed that converting written text into spoken words is a more practiced skill than converting visually perceived colors to spoken color-names. After the training, the network performs the Stroop task. The RT’s are measured in a canonical way, as these are continuous time recurrent neural networks (CTRNN). The above-described aspects of the Stroop phenomenon along with many others are replicated. The model is similar to some existing connectionist models but as will be discussed in the presentation, has many advantages: it predicts more data, the architecture is simpler and biologically more plausible.
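A toy sketch of the training idea described above is shown below: connections from a "word" block and a "color" block to a "speech" block are strengthened with a plain Hebbian outer-product rule, with reading practiced more often than color naming so that the word pathway ends up stronger. The unit counts, learning rate and practice ratio are arbitrary illustrative choices, not the CTRNN architecture of the paper.

```python
# Hebbian strengthening of word->speech and color->speech pathways with unequal practice.
import numpy as np

n_units = 4                              # one unit per color term in each modality block
W_word_speech = np.zeros((n_units, n_units))
W_color_speech = np.zeros((n_units, n_units))
eta = 0.01                               # learning rate (assumed)

def one_hot(i, n):
    v = np.zeros(n)
    v[i] = 1.0
    return v

rng = np.random.default_rng(0)
for _ in range(10_000):
    item = rng.integers(n_units)
    pre, post = one_hot(item, n_units), one_hot(item, n_units)
    W_word_speech += eta * np.outer(post, pre)          # reading aloud: frequent practice
    if rng.random() < 0.2:                               # color naming: rarer practice
        W_color_speech += eta * np.outer(post, pre)

print("word->speech strength :", W_word_speech.max())
print("color->speech strength:", W_color_speech.max())
# With incongruent input, the stronger word pathway dominates the response layer,
# reproducing the basic asymmetry of the Stroop effect.
```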

Keywords: connectionism, Hebbian learning, artificial neural networks, philosophy of mind, Stroop

Procedia PDF Downloads 267
5464 Easy Way of Optimal Process-Storage Network Design

Authors: Gyeongbeom Yi

Abstract:

The purpose of this study is to introduce the analytic solution for determining the optimal capacity (lot-size) of a multiproduct, multistage production and inventory system to meet the finished product demand. Reasonable decision-making about the capacity of processes and storage units is an important subject for industry. The industrial solution for this subject is to use the classical economic lot sizing method, EOQ/EPQ (Economic Order Quantity/Economic Production Quantity) model, incorporated with practical experience. However, the unrealistic material flow assumption of the EOQ/EPQ model is not suitable for chemical plant design with highly interlinked processes and storage units. This study overcomes the limitation of the classical lot sizing method developed on the basis of the single product and single stage assumption. The superstructure of the plant considered consists of a network of serially and/or parallelly interlinked processes and storage units. The processes involve chemical reactions with multiple feedstock materials and multiple products as well as mixing, splitting or transportation of materials. The objective function for optimization is minimizing the total cost composed of setup and inventory holding costs as well as the capital costs of constructing processes and storage units. A novel production and inventory analysis method, PSW (Periodic Square Wave) model, is applied. The advantage of the PSW model comes from the fact that the model provides a set of simple analytic solutions in spite of a realistic description of the material flow between processes and storage units. The resulting simple analytic solution can greatly enhance the proper and quick investment decision for plant design and operation problem confronted in diverse economic situations.
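For context, the classical lot-sizing result referred to above is, in its simplest (EOQ) form, the following: with demand rate D, setup cost S per order and holding cost h per unit per period, the optimal lot size is

```latex
Q^{*} \;=\; \sqrt{\frac{2\,D\,S}{h}}
```

It is exactly the single-product, single-stage material-flow assumption behind this expression that the PSW model replaces with a realistic description of flows between interlinked processes and storage units.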

Keywords: analytic solution, optimal design, process-storage network

Procedia PDF Downloads 331
5463 Quantification Model for Capability Evaluation of Optical-Based in-Situ Monitoring System for Laser Powder Bed Fusion (LPBF) Process

Authors: Song Zhang, Hui Wang, Johannes Henrich Schleifenbaum

Abstract:

Due to the increasing demand for quality assurance and reliability in additive manufacturing, the development of an advanced in-situ monitoring system is required to monitor process anomalies as input for further process control. Optical-based monitoring systems, such as CMOS cameras and NIR cameras, have proven to be effective ways to monitor geometrical distortion and exceptional thermal distribution. Therefore, many studies and applications focus on the availability of optical-based monitoring systems for detecting various types of defects. However, the capability of the monitoring setup itself is usually not quantified. In this study, a quantification model is presented to evaluate the capability of monitoring setups for the LPBF machine based on acquired monitoring data of a designed test artifact, and the design of the relevant test artifacts is discussed. The monitoring setup is evaluated based on its hardware properties, the location of its integration, and the light conditions. The methodology of data processing to quantify the capability for each aspect is discussed. The minimal detectable feature size of the monitoring setup in the application is estimated by quantifying its resolution and accuracy. The quantification model is validated using a CCD camera-based monitoring system for LPBF machines in the laboratory with different setups. The results show that the model can quantify the monitoring system's performance, which makes it possible to evaluate monitoring systems based on the same concept but with different setups for the LPBF process, and provides direction for improving the setups.

Keywords: data processing, in-situ monitoring, LPBF process, optical system, quantification model, test artifact

Procedia PDF Downloads 197
5462 Application of Improved Semantic Communication Technology in Remote Sensing Data Transmission

Authors: Tingwei Shu, Dong Zhou, Chengjun Guo

Abstract:

Semantic communication is an emerging form of communication that realizes intelligent communication by extracting the semantic information of data at the source, transmitting it, and recovering the data at the receiving end. It can effectively solve the problem of data transmission under conditions of large data volume, low SNR, and restricted bandwidth. With the development of deep learning, semantic communication has matured further and is gradually being applied to the Internet of Things, Unmanned Aerial Vehicle cluster communication, remote sensing scenarios, and similar fields. We propose an improved semantic communication system for situations where the data volume is huge and spectrum resources are limited during the transmission of remote sensing images. At the transmitter, the semantic information of the remote sensing images must be extracted, but there are some problems. A traditional semantic communication system based on a Convolutional Neural Network cannot take into account both the global and the local semantic information of the image, which results in less-than-ideal image recovery at the receiving end. Therefore, we adopt an improved Vision-Transformer-based structure as the semantic encoder, instead of the mainstream CNN-based one, to extract the image semantic features. In this paper, we first apply pre-processing operations to the remote sensing images to improve their resolution and thereby obtain images with more semantic information. We use the wavelet transform to decompose each image into high-frequency and low-frequency components, apply bilinear interpolation to the high-frequency components and bicubic interpolation to the low-frequency components, and finally apply the inverse wavelet transform to obtain the preprocessed image. We adopt the improved Vision Transformer structure as the semantic encoder to extract and transmit the semantic information of the remote sensing images. The Vision Transformer structure can be trained better on huge data volumes and extracts better image semantic features, using a multi-layer self-attention mechanism to capture the correlation between semantic features and reduce redundant features. Secondly, to improve coding efficiency, we reduce the quadratic complexity of the self-attention mechanism to linear in order to improve the image data processing speed of the model. We conducted experimental simulations on the RSOD dataset and compared the designed system with a CNN-based semantic communication system and with image coding methods such as BPG and JPEG, verifying that the method can effectively alleviate the problem of excessive data volume and improve the performance of image data communication.
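The wavelet-based resolution-enhancement step described above can be sketched as follows, assuming PyWavelets and OpenCV are available. The choice of the Haar wavelet, the file name, and the single-channel input are illustrative assumptions rather than details taken from the paper.

```python
import cv2
import numpy as np
import pywt

def wavelet_upscale(img: np.ndarray, wavelet: str = "haar") -> np.ndarray:
    """Decompose the image, interpolate the high-frequency sub-bands bilinearly
    and the low-frequency sub-band bicubically, then apply the inverse wavelet
    transform; the output is roughly twice the input size."""
    img = img.astype(np.float32)
    cA, (cH, cV, cD) = pywt.dwt2(img, wavelet)
    h, w = img.shape[:2]
    # Upscale each sub-band back to the original image size before inverting.
    cA_up = cv2.resize(cA, (w, h), interpolation=cv2.INTER_CUBIC)   # low-frequency
    cH_up = cv2.resize(cH, (w, h), interpolation=cv2.INTER_LINEAR)  # high-frequency
    cV_up = cv2.resize(cV, (w, h), interpolation=cv2.INTER_LINEAR)
    cD_up = cv2.resize(cD, (w, h), interpolation=cv2.INTER_LINEAR)
    up = pywt.idwt2((cA_up, (cH_up, cV_up, cD_up)), wavelet)        # ~2h x 2w
    return np.clip(up, 0, 255).astype(np.uint8)

# Usage (hypothetical file path), producing the preprocessed tile fed to the encoder:
# tile = cv2.imread("rs_tile.png", cv2.IMREAD_GRAYSCALE)
# hires = wavelet_upscale(tile)
```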

Keywords: semantic communication, transformer, wavelet transform, data processing

Procedia PDF Downloads 79
5461 Computational and Experimental Determination of Acoustic Impedance of Internal Combustion Engine Exhaust

Authors: A. O. Glazkov, A. S. Krylova, G. G. Nadareishvili, A. S. Terenchenko, S. I. Yudin

Abstract:

The presented material concerns the design of the exhaust system for a particular internal combustion engine. The exhaust system can be divided into two parts. The first comprises the engine exhaust manifold, turbocharger, and catalytic converters, together called the "hot part." The second is the gas exhaust system, which contains elements intended exclusively for reducing exhaust noise (mufflers, resonators), conventionally designated the "cold part." Designing the exhaust system from the acoustic point of view, that is, reducing the exhaust noise to a predetermined level, consists of working on the second part. Modern computer technology and software make it possible to design the "cold part" with high accuracy in a given frequency range, but only on the condition that the input parameters are specified accurately, namely the amplitude spectrum of the input noise and the acoustic impedance of the noise source, i.e., the engine with its "hot part." Obtaining these data is a difficult problem: high temperatures, high exhaust gas velocities (turbulent flows), and high sound pressure levels (nonlinear regime) do not allow purely calculated results to be applied with sufficient accuracy. The aim of this work is to obtain the most reliable acoustic output parameters of an engine with a "hot part" based on a combination of computational and experimental studies. The presented methodology consists of several parts. The first is a finite element simulation of the "cold part" of the exhaust system (taking into account the acoustic radiation impedance of the outlet pipe into open space), yielding the input impedance of the "cold part." The second is a finite element simulation of the "hot part" of the exhaust system (taking into account the acoustic characteristics of the catalytic units and the geometry of the turbocharger), yielding the input impedance of the "hot part." The third part of the technique is the mathematical processing of these results according to a proposed formula for summing the convergent series of multiple reflections of the acoustic signal between the "cold part" and the "hot part." This is followed by a set of tests on an engine stand with two high-temperature pressure sensors measuring pulsations in the nozzle between the "hot part" and the "cold part" of the exhaust system, and subsequent processing of the test results according to a well-known technique to separate the "incident" and "reflected" waves. The final stage is the mathematical processing of all calculated and experimental data to obtain the amplitude spectrum of the engine noise and its acoustic impedance.
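Separating incident and reflected waves from two pressure sensors is commonly done by a frequency-domain wave decomposition. The sketch below illustrates that general idea under plane-wave, no-flow, uniform-temperature assumptions; the sensor spacing, sampling rate, and speed of sound are placeholders, and this is not the authors' specific processing chain.

```python
import numpy as np

def decompose_waves(p1, p2, fs, x1, x2, c=345.0):
    """Separate incident (A) and reflected (B) plane-wave spectra from two
    pressure time signals p1(t), p2(t) measured at axial positions x1, x2 in
    the connecting pipe. Assumes linear plane waves, no mean flow, and a
    uniform speed of sound c."""
    n = len(p1)
    P1, P2 = np.fft.rfft(p1), np.fft.rfft(p2)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    k = 2.0 * np.pi * freqs / c                     # wavenumber per frequency bin
    A = np.zeros_like(P1)
    B = np.zeros_like(P1)
    for i in range(1, len(freqs)):                  # skip the DC bin
        # p(x) = A*exp(-jkx) + B*exp(+jkx) evaluated at both sensor positions;
        # the 2x2 system is ill-conditioned where the spacing is a multiple of
        # half a wavelength, so those bins should be excluded in practice.
        M = np.array([[np.exp(-1j * k[i] * x1), np.exp(1j * k[i] * x1)],
                      [np.exp(-1j * k[i] * x2), np.exp(1j * k[i] * x2)]])
        A[i], B[i] = np.linalg.solve(M, np.array([P1[i], P2[i]]))
    return freqs, A, B

# Hypothetical usage with 50 mm sensor spacing and 50 kHz sampling:
# freqs, A, B = decompose_waves(p1_signal, p2_signal, fs=50_000, x1=0.0, x2=0.05)
```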

Keywords: acoustic impedance, engine exhaust system, FEM model, test stand

Procedia PDF Downloads 59
5460 The Differences and Similarities in Neurocognitive Deficits in Mild Traumatic Brain Injury and Depression

Authors: Boris Ershov

Abstract:

Depression is the most common mood disorder experienced by patients who have sustained a traumatic brain injury (TBI) and is associated with poorer cognitive and functional outcomes. However, in some cases similar cognitive impairments can also be observed in depression itself. There is not enough information about the features of the cognitive deficit in patients with TBI relative to patients with depression. TBI patients without depressive symptoms (TBInD, n = 25), TBI patients with depressive symptoms (TBID, n = 31), and 28 patients with bipolar II disorder (BP) were included in the study. There were no significant differences between participants with respect to age, handedness, and educational level. The patients' clinical status was determined using the Montgomery–Asberg Depression Rating Scale (MADRS). All participants completed a cognitive battery (the Brief Assessment of Cognition in Affective Disorders, BAC-A). Additionally, the Rey–Osterrieth Complex Figure (ROCF) was used to assess visuospatial construction abilities and visual memory, as well as planning and organizational skills. Compared to BP, TBInD and TBID showed significant impairments in visuomotor abilities and in verbal and visual memory. There were no significant differences between the BP and TBID groups in working memory, speed of information processing, or problem solving. The interference effect (cognitive inhibition) was significantly greater in TBInD and TBID than in BP. The memory bias towards mood-related information was greater in BP and TBID than in TBInD. These results suggest that depressive symptoms are associated with impairments in some executive functions combined with a decrease in the speed of information processing.

Keywords: bipolar II disorder, depression, neurocognitive deficits, traumatic brain injury

Procedia PDF Downloads 347
5459 An Image Processing Scheme for Skin Fungal Disease Identification

Authors: A. A. M. A. S. S. Perera, L. A. Ranasinghe, T. K. H. Nimeshika, D. M. Dhanushka Dissanayake, Namalie Walgampaya

Abstract:

Nowadays, skin fungal diseases are mostly found in people of tropical countries like Sri Lanka. A skin fungal disease is a particular kind of illness caused by a fungus. These diseases have various harmful effects on the skin and keep spreading over time. It is therefore important to identify them at an early stage to keep them from spreading. This paper presents an automated skin fungal disease identification system implemented to speed up the diagnostic process by identifying skin fungal infections in digital images. An image of the diseased skin lesion is acquired, and a comprehensive computer vision and image processing scheme is used to process the image for disease identification. This includes colour analysis using the RGB and HSV colour models; texture classification using the Grey Level Run Length Matrix, the Grey Level Co-Occurrence Matrix, and the Local Binary Pattern; object detection; shape identification; and more. This paper presents the approach and its outcome for the identification of four common skin fungal infections, namely Tinea Corporis, Sporotrichosis, Malassezia, and Onychomycosis. The main intention of this research is to provide an automated skin fungal disease identification system that increases diagnostic quality, shortens the time to diagnosis, and improves the efficiency of detection and successful treatment of skin fungal diseases.
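A minimal sketch of the kind of colour and texture descriptors named above (HSV statistics, GLCM properties, LBP histogram) is shown below using OpenCV and scikit-image. The exact feature set, parameter choices, and file name are illustrative assumptions rather than the authors' implementation; the Grey Level Run Length Matrix, for instance, is omitted here.

```python
import cv2
import numpy as np
from skimage.feature import graycomatrix, graycoprops, local_binary_pattern

def lesion_features(bgr_image: np.ndarray) -> dict:
    """Extract simple colour and texture descriptors from a lesion image."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)

    # Colour statistics in HSV space
    colour = {f"hsv_mean_{c}": float(hsv[..., i].mean()) for i, c in enumerate("hsv")}

    # Grey Level Co-Occurrence Matrix texture properties
    glcm = graycomatrix(gray, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    texture = {prop: float(graycoprops(glcm, prop).mean())
               for prop in ("contrast", "homogeneity", "energy", "correlation")}

    # Local Binary Pattern histogram (uniform patterns, radius 1, 8 neighbours)
    lbp = local_binary_pattern(gray, P=8, R=1, method="uniform")
    hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
    lbp_feats = {f"lbp_{i}": float(v) for i, v in enumerate(hist)}

    return {**colour, **texture, **lbp_feats}

# Usage (hypothetical file path); the feature vector would feed a classifier:
# img = cv2.imread("lesion.jpg")
# features = lesion_features(img)
```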

Keywords: circularity index, grey level run length matrix, grey level co-occurrence matrix, local binary pattern, object detection, ring detection, shape identification

Procedia PDF Downloads 232
5458 Locating Potential Site for Biomass Power Plant Development in Central Luzon Philippines Using GIS-Based Suitability Analysis

Authors: Bryan M. Baltazar, Marjorie V. Remolador, Klathea H. Sevilla, Imee Saladaga, Loureal Camille Inocencio, Ma. Rosario Concepcion O. Ang

Abstract:

Biomass energy is a traditional source of sustainable energy that has been widely used in developing countries. The Philippines, specifically Central Luzon, has abundant biomass resources and could therefore supply large quantities of agricultural residues (rice husks) as feedstock for a biomass power plant. However, locating a potential site for biomass development is a complex process that involves physical, environmental, socio-economic, and risk factors, which are usually diverse and conflicting. Moreover, biomass is highly dispersed geographically. This study therefore develops an integrated method combining Geographic Information Systems (GIS) with energy-planning methods, namely Multi-Criteria Decision Analysis (MCDA) and the Analytic Hierarchy Process (AHP), for locating a suitable site for biomass power plant development in Central Luzon, Philippines, while considering different constraints and factors. Using MCDA, a three-level hierarchy of factors and constraints was produced, with the corresponding weights determined by experts using AHP. Applying the results, a suitability map for biomass power plant development in Central Luzon was generated. It showed that the central part of the region has the highest potential for biomass power plant development because of characteristics such as abundant rice fields, generally flat terrain, accessible roads and grid networks, and low risk of flooding and landslides. This study recommends the use of higher-accuracy resource maps and further analysis in selecting the optimum site for biomass power plant development that would account for the cost and transportation of biomass residues.
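The AHP weighting step can be illustrated with a short eigenvector computation; the pairwise-comparison values and criterion names below are hypothetical, not the experts' judgments from the study.

```python
import numpy as np

# Saaty's random consistency index for matrix sizes 1..9 (standard values)
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}

def ahp_weights(pairwise: np.ndarray):
    """Derive criterion weights from an expert pairwise-comparison matrix using
    the principal eigenvector, and report Saaty's consistency ratio
    (CR < 0.1 is conventionally considered acceptable)."""
    n = pairwise.shape[0]
    eigvals, eigvecs = np.linalg.eig(pairwise)
    idx = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, idx].real)
    w = w / w.sum()
    lam_max = eigvals[idx].real
    ci = (lam_max - n) / (n - 1)
    cr = ci / RI[n] if RI[n] > 0 else 0.0
    return w, cr

# Hypothetical 4-criterion comparison (e.g., biomass supply, slope, road access, hazard risk)
A = np.array([[1.0, 3.0, 5.0, 7.0],
              [1/3, 1.0, 3.0, 5.0],
              [1/5, 1/3, 1.0, 3.0],
              [1/7, 1/5, 1/3, 1.0]])
weights, cr = ahp_weights(A)
print("weights:", np.round(weights, 3), "CR:", round(cr, 3))
```

The resulting weights would then multiply the corresponding criterion raster layers in a GIS weighted overlay to produce the suitability map.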

Keywords: analytic hierarchy process, biomass energy, GIS, multi-criteria decision analysis, site suitability analysis

Procedia PDF Downloads 428