Search results for: central auditory processing disorder
6052 The Role of Executive Attention and Literacy on Consumer Memory
Authors: Fereshteh Nazeri Bahadori
Abstract:
In today's competitive environment, any company operating in a market, whether industrial or consumer, must recognize that it cannot address every customer taste and demand at once and serve them all. Consumer memory is an important subject in marketing research, and many companies have studied it and the factors affecting it. The present study therefore investigates the relationship between consumers' attention, literacy, and memory. Memory is closely related to learning: it is the collection of all the information we have understood and stored. Information processing by the consumer is a central topic in consumer behavior, and one of its key factors is the consumer's mental involvement, which has attracted considerable attention over the past two decades. Since consumers are the focal point of all marketing activities, successful marketing begins with understanding why and how consumers behave. The present study therefore investigated the role of executive attention and literacy in consumer memory. The results showed that executive attention and literacy play a significant role in consumers' long-term and short-term memory.
Keywords: literacy, consumer memory, executive attention, psychology of consumer behavior
Procedia PDF Downloads 96
6051 Performance Evaluation of Refinement Method for Wideband Two-Beams Formation
Authors: C. Bunsanit
Abstract:
This paper presents a refinement method for two-beam formation in a wideband smart antenna. The refinement of the weighting coefficients is based on Fully Spatial Signal Processing using the Inverse Discrete Fourier Transform (IDFT), and simulation results are presented using MATLAB. The radiation pattern is created by multiplying the incoming signal by real weights and summing the products. These real weighting coefficients are computed by the IDFT method; however, the range of weight values is relatively wide, so the refinement method is used to reduce it. The radiation pattern is controlled by five input parameters: maximum weighting coefficient, wideband signal, mainbeam direction, beamwidth, and maximum minor-lobe level. Comparison of the simulation results obtained with the refinement method against plain IDFT shows that the refinement method works well for wideband two-beam formation.
Keywords: fully spatial signal processing, beam forming, refinement method, smart antenna, weighting coefficient, wideband
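As a rough, hypothetical sketch of the two ingredients named above, real weights taken from an IDFT of a sampled desired pattern and a radiation (array factor) pattern formed by weighting and summing, the following is illustrative only; the pattern samples, element count, and half-wavelength element spacing are assumptions, not the paper's parameters:

```python
import cmath
import math

def idft_weights(desired_pattern):
    """Real weighting coefficients from the IDFT of a sampled desired
    pattern (the pattern samples here are hypothetical)."""
    n = len(desired_pattern)
    weights = []
    for k in range(n):
        acc = sum(desired_pattern[m] * cmath.exp(2j * math.pi * k * m / n)
                  for m in range(n)) / n
        weights.append(acc.real)  # keep only the real part, as in the abstract
    return weights

def array_factor(weights, theta_deg, spacing=0.5):
    """Array factor magnitude of a linear array at angle theta_deg
    (spacing in wavelengths; half-wavelength assumed by default)."""
    theta = math.radians(theta_deg)
    acc = sum(w * cmath.exp(2j * math.pi * spacing * n * math.sin(theta))
              for n, w in enumerate(weights))
    return abs(acc)
```

For example, a flat desired pattern maps to a single nonzero weight, and uniform weights give a broadside maximum equal to the element count.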
Procedia PDF Downloads 226
6050 Communication Skills Training in Continuing Nursing Education: Enabling Nurses to Improve Competency and Performance in Communication
Authors: Marzieh Moattari, Mitra Abbasi, Masoud Mousavinasab, Poorahmad
Abstract:
Background: In their daily practice, nurses need to communicate with patients and their families as well as with members of the health professional team. Effective communication contributes to patient satisfaction, a fundamental outcome of nursing practice. There is some evidence of patients' dissatisfaction with nurses' performance in the communication process; improving nurses' communication skills is therefore a necessity for nursing scholars and nursing administrators. Objective: The aim of the present study was to evaluate the effect of a two-day workshop on nurses' competency and performance in communication in a central hospital located in the south of Iran. Materials and Method: This randomized controlled trial comprised a convenience sample of 70 eligible nurses working in a central hospital, randomly divided into experimental and control groups. Nurses' competency was measured by an Objective Structured Clinical Examination (OSCE), and their performance was measured by asking eligible patients hospitalized in the nurses' work setting during a one-month period to evaluate nurses' communication skills before and 2 months after the intervention. The experimental group participated in a two-day workshop on communication skills. Its content included the importance of communication (verbal and non-verbal) and basic communication skills such as initiating communication, active listening, and questioning techniques, as well as patient teaching, problem solving, decision making, cross-cultural communication, and breaking bad news. Appropriate teaching strategies such as brief didactic sessions, small-group discussion, and reflection were applied to enhance participants' learning. The data were analyzed using SPSS 16. Result: A significant between-group difference was found in nurses' communication competency and performance in the posttest. The mean scores of the experimental group were higher than those of the control group in the total OSCE score as well as in all OSCE stations (p<0.003). Overall posttest mean scores of patient satisfaction with nurses' communication skills, and all four of its dimensions, differed significantly between the two groups (p<0.001). Conclusion: This study shows that educating nurses in communication skills improves their competency and performance. Measurement of nurses' communication skills, as a central component of an efficient nurse-patient relationship, by valid and reliable evaluation methods is recommended, and teaching of communication skills should be integrated into continuing nursing education programs. Trial Registration Number: IRCT201204042621N11
Keywords: communication skills, simulation, performance, competency, objective structure, clinical evaluation
Procedia PDF Downloads 218
6049 Agent-Based Modeling Investigating Self-Organization in Open, Non-equilibrium Thermodynamic Systems
Authors: Georgi Y. Georgiev, Matthew Brouillet
Abstract:
This research applies agent-based modeling to a pivotal question at the intersection of biology, computer science, physics, and complex systems theory: how self-organization proceeds in open, complex, non-equilibrium thermodynamic systems. Central to this investigation is the principle of Maximum Entropy Production (MEP), which suggests that such systems evolve toward states that optimize entropy production, leading to the formation of structured environments. It is hypothesized that, guided by the least action principle, open thermodynamic systems identify and follow the shortest paths to transmit energy and matter, resulting in maximal entropy production, internal structure formation, and a decrease in internal entropy. Concurrently, system information is predicted to increase, since more information is required to describe the developing structure. To test this, an agent-based model is developed simulating an ant colony's formation of a path between a food source and its nest. Using the NetLogo software for modeling and Python for data analysis and visualization, self-organization is quantified by calculating the decrease in system entropy based on the potential states and distribution of the ants within the simulated environment. External entropy production is also evaluated for information increase and efficiency improvements in the system's action. Simulations demonstrated that the system begins at maximal entropy, which decreases as the ants form paths over time. A range of system behaviors contingent upon the number of ants is observed. Notably, no path formation occurred with fewer than five ants, whereas clear paths were established by 200 ants, and saturation of path formation and entropy state was reached at populations exceeding 1000 ants. This analytical approach identified the inflection point marking the transition from disorder to order and computed the slope at this point.
Combined with extrapolation to the final path entropy, these parameters yield important insights into the eventual entropy state of the system and the timeframe for its establishment, enabling estimation of the self-organization rate. This study provides a novel perspective on self-organization in thermodynamic systems, establishing a correlation between the internal entropy decrease rate and the external entropy production rate. Moreover, it presents a flexible framework for assessing the impact of external factors such as changes in world size, path obstacles, and friction. Overall, this research offers a robust, replicable model for studying self-organization processes in any open thermodynamic system, providing a foundation for further in-depth exploration of the complex behaviors of these systems and contributing to the development of more efficient self-organizing systems across various scientific fields.
Keywords: complexity, self-organization, agent-based modeling, efficiency
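The entropy bookkeeping described above can be illustrated with a minimal sketch: the Shannon entropy of the ants' occupancy distribution over grid cells is maximal for a uniform scatter and falls as ants concentrate onto a path. The grid cells and ant positions below are invented for illustration, not taken from the NetLogo model:

```python
import math
from collections import Counter

def ant_distribution_entropy(positions):
    """Shannon entropy (in bits) of the ants' distribution over grid cells.

    `positions` is a list of (x, y) cells occupied by individual ants.
    A uniform scatter gives maximal entropy; ants concentrating on a
    single path lower it, matching the self-organization measure the
    abstract describes."""
    counts = Counter(positions)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())
```

For example, ten ants crowded into one cell give zero entropy, while four ants spread over four cells give the maximal 2 bits.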
Procedia PDF Downloads 68
6048 Use of Artificial Intelligence and Two Object-Oriented Approaches (k-NN and SVM) for the Detection and Characterization of Wetlands in the Centre-Val de Loire Region, France
Authors: Bensaid A., Mostephaoui T., Nedjai R.
Abstract:
Nowadays, wetlands are the subject of contradictory debates opposing scientific, political, and administrative meanings. Indeed, given their multiple services (drinking water, irrigation, hydrological regulation, mineral, plant and animal resources...), wetlands concentrate many socio-economic and biodiversity issues. In some regions they can cover vast areas (>100 thousand ha) of the landscape, such as the Camargue in the south of France, inside the Rhone delta. The high biological productivity of wetlands, the strong natural selection pressures, and the diversity of aquatic environments have produced many species of plants and animals that are found nowhere else. These environments are tremendous carbon sinks and biodiversity reserves; depending on their age, composition, and surrounding environmental conditions, wetlands play an important role in global climate projections. Covering more than 3% of the earth's surface, wetlands have experienced since the beginning of the 1990s a tremendous revival of interest, which has resulted in the multiplication of inventories, scientific studies, and management experiments. The geographical and physical characteristics of the wetlands of the central region conceal a large number of natural habitats that harbour great biological diversity. These wetlands are still influenced by human activities, especially agriculture, which affects their layout and functioning. In this perspective, decision-makers need to delimit spatial objects (natural habitats) in a certain way to be able to take action. Wetlands are no exception to this rule, even if delimiting a type of environment whose main characteristic is often to occupy the transition between aquatic and terrestrial environments seems a difficult exercise.
However, it is possible to map wetlands with databases derived from the interpretation of photos and satellite images, such as the European Corine Land Cover database, which allows the characteristic wetland types of each place to be quantified and characterized. Scientific studies have shown limitations when using high spatial resolution images (SPOT, Landsat, ASTER) for the identification and characterization of small wetlands (1 hectare), which generally represent spatially complex features; the use of very high spatial resolution images (finer than 3 m) is therefore necessary to map both small and large areas. Moreover, with the recent evolution of artificial intelligence (AI), deep learning methods for satellite image processing have shown much better performance than traditional processing based only on pixel structures. Our research work is based on spectral and textural analysis of very high resolution images (SPOT and IRC orthoimages) using two object-oriented approaches, the nearest neighbour approach (k-NN) and the Support Vector Machine approach (SVM). The k-NN approach gave good results for the delineation of wetlands (wet marshes and moors, ponds, artificial wetlands, water body edges, mountain wetlands, river edges and brackish marshes), with a kappa index higher than 85%.
Keywords: land development, GIS, sand dunes, segmentation, remote sensing
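A minimal, self-contained sketch of the two measures central to this abstract, a k-NN vote over feature vectors and the kappa agreement index, is given below; the toy feature vectors and labels are hypothetical, not the study's spectral/textural data:

```python
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """Nearest-neighbour vote: `train` is a list of (feature_vector, label)
    pairs; the query is assigned the majority label of its k closest
    training samples (Euclidean distance)."""
    nearest = sorted(train, key=lambda pair: math.dist(pair[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

def cohen_kappa(truth, pred):
    """Cohen's kappa: observed agreement corrected for the agreement
    expected by chance (the index the abstract reports as >85%)."""
    n = len(truth)
    labels = set(truth) | set(pred)
    p_observed = sum(t == p for t, p in zip(truth, pred)) / n
    p_expected = sum((truth.count(l) / n) * (pred.count(l) / n) for l in labels)
    return (p_observed - p_expected) / (1 - p_expected)
```

In practice the feature vectors would hold per-object spectral and textural statistics rather than the two-dimensional toy points used here.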
Procedia PDF Downloads 72
6047 Effect of Some Metal Ions on the Activity of Lipase Produced by Aspergillus niger Cultured on Vitellaria paradoxa Shells
Authors: Abdulhakeem Sulyman, Olukotun Zainab, Hammed Abdulquadri
Abstract:
Lipases (triacylglycerol acyl hydrolases; EC 3.1.1.3) are a class of enzymes that catalyse the hydrolysis of triglycerides to glycerol and free fatty acids. They account for up to 10% of the enzyme market and have a wide range of applications in biofuel production, detergent formulation, leather processing, and the food and feed processing industry. This research studied the effect of some metal ions on the activity of purified lipase produced by Aspergillus niger cultured on Vitellaria paradoxa shells. Purified lipase in 12.5 mM p-NPL was incubated with different metal ions (Zn²⁺, Ca²⁺, Mn²⁺, Fe²⁺, Na⁺, K⁺ and Mg²⁺) at final concentrations of 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9 and 1.0 mM. The results showed that Zn²⁺, Ca²⁺, Mn²⁺ and Fe²⁺ increased lipase activity by up to 3.0-, 3.0-, 1.0- and 26.0-fold, respectively. Lipase activity was partially inhibited by Na⁺ and Mg²⁺, with up to 88.5% and 83.7% loss of activity respectively, and by K⁺, with up to 56.7% loss of activity compared with the absence of metal ions. The study concluded that lipase produced by Aspergillus niger cultured on Vitellaria paradoxa shells is activated by Zn²⁺, Ca²⁺, Mn²⁺ and Fe²⁺ and inhibited by Na⁺, K⁺ and Mg²⁺.
Keywords: Aspergillus niger, Vitellaria paradoxa, lipase, metal ions
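Fold-activation and percent-inhibition figures of the kind quoted above are conventionally computed relative to a metal-free control; a minimal sketch (the activity values in the example are illustrative, not the study's raw data):

```python
def fold_activation(activity_with_ion, control_activity):
    """Fold increase in enzyme activity relative to the no-ion control,
    the conventional way figures such as 26.0-fold are derived."""
    return activity_with_ion / control_activity

def percent_inhibition(activity_with_ion, control_activity):
    """Percentage loss of activity relative to the no-ion control."""
    return (1 - activity_with_ion / control_activity) * 100
```

For instance, an activity of 11.5% of the control corresponds to 88.5% inhibition, the figure reported for Na⁺.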
Procedia PDF Downloads 150
6046 Online Monitoring Rheological Property of Polymer Melt during Injection Molding
Authors: Chung-Chih Lin, Chien-Liang Wu
Abstract:
Detecting the state of the polymer melt during the manufacturing process is regarded as an efficient way to control molded part quality in advance, and online monitoring of the melt's rheological properties during processing provides a way to understand the melt state immediately. Rheological properties reflect the melt state at different processing parameters and are especially important in injection molding. This study proposes an approach, demonstrated on injection molding, for calculating the rheological properties of a polymer melt from in-process measurements. The system consists of two sensors and a data acquisition module that processes the measured data, from which the rheological properties are calculated. The properties discussed in this study are shear rate and viscosity, which are investigated with respect to injection speed and melt temperature. The results show that the effect of injection speed on the rheological properties is apparent, especially at high melt temperature, and should be considered in precision molding processes.
Keywords: injection molding, melt viscosity, shear rate, monitoring
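As a hedged illustration of how shear rate and viscosity can be derived from in-process pressure and flow measurements, the sketch below assumes a circular flow channel and the standard capillary rheometry relations (apparent wall shear rate 4Q/(πR³), wall shear stress RΔP/(2L)); the actual geometry and sensor arrangement in the paper may differ:

```python
import math

def apparent_shear_rate(q, radius):
    """Apparent wall shear rate in a circular channel: 4Q / (pi R^3),
    with volumetric flow rate q (m^3/s) and channel radius (m)."""
    return 4 * q / (math.pi * radius ** 3)

def melt_viscosity(dp, q, radius, length):
    """Apparent melt viscosity from the pressure drop `dp` measured
    between two sensors a distance `length` apart:
    eta = tau / gamma_dot, with tau = R * dp / (2 L).
    The geometry values used in testing are illustrative."""
    tau = radius * dp / (2 * length)
    return tau / apparent_shear_rate(q, radius)
```

Raising injection speed raises q, hence the shear rate, which for a shear-thinning melt lowers the apparent viscosity, the trend the abstract reports.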
Procedia PDF Downloads 381
6045 The First Report of Aberrant Corneal Occlusion in Rabbit in Iran
Authors: Bahador Bardshiri, Omid Moradi, Amir Komeilian, Nima Panahifar
Abstract:
Formation of a conjunctival membrane over the corneal surface is a condition unique to rabbits that has been labeled aberrant corneal occlusion or pseudopterygium. In the summer of 2013, a five-year-old male Standard Chinchilla rabbit was presented to Karaj Central Veterinary Hospital; the owner complained that the rabbit showed degrees of blindness, and there were opacities on both eyes. Ophthalmic examination of the affected eyes revealed a conjunctival fold stretching over the cornea of both eyes. The fold originated from the limbus and was vascularized and centrally thickened. There were no attachments to the corneal epithelium, and the fold could be easily lifted. Surgery was performed under general anesthesia. The conjunctival fold was incised centrifugally up to its attachment at the limbus and the lid margin using small scissors. The central rim of the segment was then replaced to its normal position in the fornix and fixed with mattress sutures (7/0) passing through the outside skin. After surgery, eye drops containing dexamethasone, gentamicin and polymyxin were applied twice daily for up to 3 weeks. Within the observation period (8 months), no recurrence was noted. "Pseudo" in the term pseudopterygium refers to the fact that the conjunctival membrane is not adhering to the underlying cornea but growing over it. In rare cases, the membrane may be loosely attached to the cornea but can be easily separated without causing damage. It can cover only a small part of the cornea, with an annular peripheral opacification, or cover it almost fully, leading to blindness. The etiopathogenesis remains unclear and recurrence is very likely; the surgical technique used here decreases the probability of recurrence of the conjunctival fold.
Keywords: rabbit, cornea, aberrant corneal occlusion, pseudopterygium
Procedia PDF Downloads 341
6044 Portuguese City Reconstructed from Public Space: The Example of the Requalification of Cacém Central Area
Authors: Rodrigo Coelho
Abstract:
As several authors have pointed out (such as Jordi Borja or Oriol Bohigas), the necessity to "make center" presents itself not only as an imperative response to the processes of dissolution of peripheral urbanization but also, given its symbolic and functional meaning, as a key concept for thinking about and acting on the enlarged city. The notion of re-centralization (successfully applied in recompositions of urban peripheries, such as in Barcelona or Lyon), understood through the redefinition of mobility, the strengthening of core functions, and the creation or consolidation of urban fabrics (always articulated with policies for creating and redeveloping public spaces), seems to be one of the key strategies in the challenge of making the city on the "city periphery". The question we address in this paper concerns, essentially, the importance of public space in the (re)construction of the contemporary "shapeless city" sectors which, in general, we associate with urban peripheries. We will seek to demonstrate, through the analysis of a Portuguese case study, the requalification of the Cacém Central Area, integrated in the Polis Program (National Program for Urban Rehabilitation and Environmental Improvement of Cities, launched in 1999 by the Portuguese government), the conditions under which the public space project can act, subsequently, in urban areas of recent formation where, in many situations, public space did not have a structuring role in urbanization and saw its presence reduced to a residual character.
More specifically, we intend to demonstrate with this example the methodological and urban design aspects that led to the regeneration of a disqualified and degraded urban area by intervening consistently and profoundly in public space, with well-defined objectives and criteria, framed within a more comprehensive strategy attentive to the various scales of urban design.
Keywords: public space, urban design, urban regeneration, urban and regional studies
Procedia PDF Downloads 578
6043 Detection of Image Blur and Its Restoration for Image Enhancement
Authors: M. V. Chidananda Murthy, M. Z. Kurian, H. S. Guruprasad
Abstract:
Image restoration in the process of communication is one of the emerging fields in image processing. Motion analysis is the simplest approach to detecting motion in an image, with applications spread across many areas such as surveillance, remote sensing, the film industry, and navigation of autonomous vehicles. A scene may contain multiple moving objects; using motion analysis techniques, the image degraded by the movement of objects can be enhanced by filling in occluded regions and reconstructing transparent objects, and the motion blur itself can be removed. This paper presents the design and comparison of various motion detection and enhancement filters: the median filter, linear image deconvolution, the inverse filter, the pseudo-inverse filter, the Wiener filter, the Lucy-Richardson filter, and blind deconvolution are used to remove blur. In this work, we have considered different types and different amounts of blur for the analysis. Mean Square Error (MSE) and Peak Signal-to-Noise Ratio (PSNR) are used to evaluate the performance of the filters. The designed system has been implemented in MATLAB and tested on synthetic and real-time images.
Keywords: image enhancement, motion analysis, motion detection, motion estimation
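The two evaluation measures named above can be sketched in a few lines; here images are taken as flat lists of pixel values and an 8-bit peak of 255 is assumed:

```python
import math

def mse(a, b):
    """Mean squared error between two equal-sized images given as flat
    lists of pixel values."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def psnr(a, b, peak=255):
    """Peak signal-to-noise ratio in dB; identical images give infinity.
    An 8-bit pixel range (peak 255) is assumed by default."""
    error = mse(a, b)
    return math.inf if error == 0 else 10 * math.log10(peak ** 2 / error)
```

A better-restored image has lower MSE against the uncorrupted reference and correspondingly higher PSNR, which is how the filters above are ranked.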
Procedia PDF Downloads 287
6042 Suicide Wrongful Death: Standard of Care Problems Involving the Inaccurate Discernment of Lethal Risk When Focusing on the Elicitation of Suicide Ideation
Authors: Bill D. Geis
Abstract:
Suicide wrongful death forensic cases are the fastest-rising tort in mental health law; it is estimated that suicide-related cases have accounted for 15% of U.S. malpractice claims since 2006. Most suicide-related personal injury claims fall into the legal category of "wrongful death." Though mental health experts may be called on to address a range of forensic questions in wrongful death cases, the central consultation most experts provide concerns the negligence element, specifically, whether the clinician met the clinical standard of care in assessing, treating, and managing the deceased person's mental health care. Standards of care, which vary from U.S. state to state, are broad and address what a reasonable clinician might do in a similar circumstance. This leaves the suicide standard of care, in each case, up to forensic experts to put forth a reasoned estimate of what the standard of care should have been in the specific case under litigation. Because the general state guidelines are broad, forensic experts are routinely retained to provide scientific and clinical opinions about whether a clinician met the standard of care in their suicide assessment, treatment, and management of the case. In past and much current practice, suicide assessment has centered on the elicitation of verbalized suicide ideation. Research in recent years, however, has indicated that the majority of persons who end their lives do not say they are suicidal at their last medical or psychiatric contact; near-term risk assessment that goes beyond verbalized suicide ideation is needed. Our previous research employed structural equation modeling to predict lethal suicide risk: eight negative thought patterns (feeling like a burden on others, hopelessness, self-hatred, etc.), mediated by nine transdiagnostic clinical factors (mental torment, insomnia, substance abuse, PTSD intrusions, etc.), were combined to predict acute lethal suicide risk. This structural equation model, the Lethal Suicide Risk Pattern (LSRP), Acute model, had excellent goodness of fit [χ²(df) = 94.25(47)***, CFI = .98, RMSEA = .05, 90% CI = .03-.06, p(RMSEA = .05) = .63, AIC = 340.25, ***p < .001]. A further SEM analysis was completed for this paper, adding a measure of acute suicide ideation to the previous model. Acceptable prediction model fit was no longer achieved [χ²/df = 3.571, CFI = .953, RMSEA = .075, 90% CI = .065-.085, AIC = 529.550]. This finding suggests that, in this additional study, immediate verbalized suicide ideation information was unhelpful in the assessment of lethal risk. The LSRP and other dynamic, near-term risk models (such as the Acute Suicidal Affective Disturbance model and the Suicide Crisis Syndrome model), going beyond elicited suicide ideation, need to be incorporated into current clinical suicide assessment training. Without this training, the standard of care for suicide assessment is out of sync with current research, an emerging dilemma for the forensic evaluation of suicide wrongful death cases.
Keywords: forensic evaluation, standard of care, suicide, suicide assessment, wrongful death
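For readers unfamiliar with the fit statistics reported in abstracts like this one, one common formula for RMSEA can be sketched as follows; the sample size used in the example is hypothetical, since the abstract does not report N:

```python
import math

def rmsea(chi_sq, df, n):
    """Root Mean Square Error of Approximation from a model chi-square,
    its degrees of freedom, and sample size n, using the common formula
    sqrt(max(chi_sq - df, 0) / (df * (n - 1))). Values near .05 or below
    are conventionally read as good fit."""
    return math.sqrt(max(chi_sq - df, 0) / (df * (n - 1)))
```

A model whose chi-square equals its degrees of freedom yields an RMSEA of exactly zero, the "perfect fit" boundary of this index.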
Procedia PDF Downloads 68
6041 Efficiency of a Heat-Based, Combustion-Free Technology for Remediation of Soil Contaminated by Petroleum
Authors: Gavin Hutama Farandiarta, Hegi Adi Prabowo, Istiara Rizqillah Hanifah, Millati Hanifah Saprudin, Raden Iqrafia Ashna
Abstract:
The increase in the rate of petroleum consumption encourages industries to optimize and expand the processing of crude oil into petroleum products. Although the results bring many benefits to humans worldwide, they also have a negative impact on the environment. One such impact is soil contamination by petroleum sewage sludge, which contains hydrocarbon compounds quantified as Total Petroleum Hydrocarbon (TPH). Petroleum sludge waste is classified as hazardous and toxic, and the soil contamination it causes is very hard to remove. However, soil contaminated by petroleum sludge can be treated using heat (thermal desorption) in the remediation process. Several factors affect the success rate of heat-assisted remediation: temperature, time, and air pressure in the desorption column. Remediation using heat is an alternative for recovering soil from petroleum pollution that is highly effective, cheap, and environmentally friendly, producing uncontaminated soil and petroleum that can be used again.
Keywords: petroleum sewage sludge, remediation soil, thermal desorption, total petroleum hydrocarbon (TPH)
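Remediation success of this kind is typically reported as the percentage of TPH removed between pre- and post-treatment measurements; a minimal sketch (the TPH concentrations are illustrative, not from the study):

```python
def tph_removal_efficiency(tph_initial, tph_final):
    """Percentage of petroleum hydrocarbons removed by the treatment,
    from TPH concentrations (e.g., mg/kg soil) measured before and
    after thermal desorption."""
    return (tph_initial - tph_final) / tph_initial * 100
```

For example, treatment that reduces TPH from 50,000 mg/kg to 500 mg/kg corresponds to 99% removal.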
Procedia PDF Downloads 247
6040 Temporal Profile of Exercise-Induced Changes in Plasma Brain-Derived Neurotrophic Factor Levels of Schizophrenic Individuals
Authors: Caroline Lavratti, Pedro Dal Lago, Gustavo Reinaldo, Gilson Dorneles, Andreia Bard, Laira Fuhr, Daniela Pochmann, Alessandra Peres, Luciane Wagner, Viviane Elsner
Abstract:
Approximately 1% of the world's population is affected by schizophrenia (SZ), a chronic and debilitating neurodevelopmental disorder. Among possible factors, reduced levels of brain-derived neurotrophic factor (BDNF) have been implicated in the pathophysiology and course of SZ. In this context, peripheral BDNF levels have been used as a biomarker in several clinical studies, since this neurotrophin is able to cross the blood-brain barrier in a bidirectional manner and seems to correlate strongly with levels in central nervous system fluid. Patients with SZ usually adopt a sedentary lifestyle, which has been partly associated with increased rates of obesity, metabolic syndrome, type 2 diabetes, and coronary heart disease. On the other hand, exercise, a non-invasive and low-cost intervention, has been considered an important additional therapeutic option for this population, promoting benefits to physical and mental health. To our knowledge, the few studies indicating that the positive effects of exercise in SZ patients are mediated, at least in part, by enhanced levels of BDNF after training have focused on single bouts of exercise or on chronic interventions, and data concerning short- and long-term exercise outcomes on BDNF are scarce. Therefore, this study aimed to evaluate the effect of a concurrent exercise protocol (CEP) on plasma BDNF levels of SZ patients at different time points. Material and Methods: This study was approved by the Research Ethics Committee of the Centro Universitário Metodista do IPA (no 1.243.680/2015). The participants (n=15) were submitted to the CEP for 90 days, 3 times a week for 60 minutes each session. In order to evaluate the short- and long-term effects of exercise, blood samples were collected before and 30, 60 and 90 days after the intervention began.
Plasma BDNF levels were determined by ELISA, using a Sigma-Aldrich commercial kit (catalog number RAB0026) according to the manufacturer's instructions. Results: A remarkable increase in plasma BDNF levels was observed at 90 days after training compared to baseline (p=0.006) and 30-day (p=0.007) values. Conclusion: Our data agree with several studies showing significant enhancement of BDNF levels in response to different exercise protocols in SZ individuals. We suggest that BDNF upregulation after training in SZ patients acts in a dose-dependent manner, being more pronounced in response to chronic exposure. Acknowledgments: This work was supported by Fundação de Amparo à Pesquisa do Estado do Rio Grande do Sul (FAPERGS)/Brazil.
Keywords: exercise, BDNF, schizophrenia, time-points
Procedia PDF Downloads 252
6039 Artificial Intelligence and Distributed System Computing: Application and Practice in Real Life
Authors: Lai Junzhe, Wang Lihao, Burra Venkata Durga Kumar
Abstract:
In recent years, owing to global technological advances, big data and artificial intelligence technologies have been widely used in various industries and fields, playing an important role in reducing costs and increasing efficiency. Artificial intelligence, through its own continuous progress and the ongoing work of computer researchers, has derived another branch: distributed artificial intelligence computing systems. Distributed AI is a method for solving complex learning, decision-making, and planning problems, characterized by the ability to exploit large-scale computation and the spatial distribution of resources; accordingly, it can handle problems with large data sets. Nowadays, distributed AI is widely used in military, medical, and everyday settings, bringing great convenience and efficient operation to daily life. In this paper, we discuss distributed AI computing systems in three areas, vision processing, blockchain, and the smart home, to introduce the performance of distributed systems and the role of AI in them.
Keywords: distributed system, artificial intelligence, blockchain, IoT, visual information processing, smart home
Procedia PDF Downloads 113
6038 Evaluation of the Urban Landscape Structures and Dynamics of Hawassa City, Using Satellite Images and Spatial Metrics Approaches, Ethiopia
Authors: Berhanu Terfa, Nengcheng C.
Abstract:
This study analyzes the urban expansion and land transformation of Hawassa City using remote sensing data and landscape metrics over the last three decades (1987-2017). Multi-temporal satellite images, TM (1987), TM (1995), ETM+ (2005) and OLI (2017), were used to examine urban expansion, growth types, and spatial isolation within the urban landscape, in order to develop an understanding of the trends of built-up growth in Hawassa City, Ethiopia. Landscape metrics and built-up density were employed to analyze the pattern, process, and overall growth status. The area under investigation was divided into concentric circles, each successive circle at a 1 km incremental radius from the central pixel (Central Business District), for analysis. The results show that the built-up area increased by 541.32% between 1987 and 2017, and extension growth types (more than 67%) were observed. The major growth took place in the north-west direction, followed by the north, in a haphazard manner during the 1987-1995 period, whereas predominant built-up development was observed in the south and south-west directions during the 1995-2017 period. The landscape metrics results revealed that urban patch density, total edge, and edge density increased, while mean nearest-neighbor distance decreased, showing a tendency to sprawl.
Keywords: landscape metrics, spatial patterns, remote sensing, multi-temporal, urban sprawl
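A minimal sketch of two of the landscape metrics named above, patch density and edge density, computed on a binary built-up grid with 4-neighbour connectivity; the toy grid is illustrative, not the study's classified imagery:

```python
def landscape_metrics(grid, cell_area=1.0):
    """Patch density and edge density for a binary built-up grid.

    A patch is a 4-connected cluster of built-up (truthy) cells; an edge
    is a built-up cell side facing a non-built-up cell or the map
    boundary. Both counts are divided by total landscape area."""
    rows, cols = len(grid), len(grid[0])
    seen, patches, edges = set(), 0, 0
    for r in range(rows):
        for c in range(cols):
            if not grid[r][c]:
                continue
            # Count exposed sides of this built-up cell.
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if not (0 <= nr < rows and 0 <= nc < cols) or not grid[nr][nc]:
                    edges += 1
            if (r, c) in seen:
                continue  # already part of a counted patch
            patches += 1
            stack = [(r, c)]  # flood-fill the new patch
            while stack:
                x, y = stack.pop()
                if (x, y) in seen:
                    continue
                seen.add((x, y))
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nx, ny = x + dr, y + dc
                    if 0 <= nx < rows and 0 <= ny < cols and grid[nx][ny]:
                        stack.append((nx, ny))
    total_area = rows * cols * cell_area
    return patches / total_area, edges / total_area
```

Rising patch density and edge density with falling nearest-neighbor distance is the fragmentation signature the abstract interprets as sprawl.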
Procedia PDF Downloads 286
6037 Deep Learning Based Road Crack Detection on an Embedded Platform
Authors: Nurhak Altın, Ayhan Kucukmanisa, Oguzhan Urhan
Abstract:
It is important that highways are in good condition for traffic safety. Road defects (road cracks, erosion of lane markings, etc.) can cause accidents by affecting driving. Image processing based methods for detecting road cracks are available in the literature. In this paper, a deep learning based road crack detection approach is proposed. YOLO (You Only Look Once) is adopted as the core component of the road crack detection approach presented. The YOLO network structure, which was developed for object detection, is trained with road crack images as a new class not previously used in YOLO. The performance of the proposed method is compared across different training strategies: training from randomly initialized weights and fine-tuning pre-trained weights (transfer learning). A similar training approach is applied to the simplified version of the YOLO network model (Tiny YOLO), and the resulting performance is examined. The developed system is able to process 8 fps on an NVIDIA Jetson TX1 development kit.
Keywords: deep learning, embedded platform, real-time processing, road crack detection
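Performance comparisons of the kind described above are typically grounded in matching predicted boxes to labeled cracks by intersection-over-union (IoU). The sketch below is a hypothetical evaluation helper, not the authors' actual pipeline: it computes precision and recall for predicted crack boxes against ground truth at an assumed IoU threshold of 0.5.

```python
def iou(a, b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def precision_recall(predictions, ground_truth, iou_thr=0.5):
    """Greedily match each predicted crack box to an unused label box."""
    matched = set()
    tp = 0
    for pred in predictions:
        best, best_iou = None, iou_thr
        for i, gt in enumerate(ground_truth):
            if i in matched:
                continue
            score = iou(pred, gt)
            if score >= best_iou:
                best, best_iou = i, score
        if best is not None:
            matched.add(best)
            tp += 1  # true positive: prediction overlaps an unmatched label
    precision = tp / len(predictions) if predictions else 0.0
    recall = tp / len(ground_truth) if ground_truth else 0.0
    return precision, recall
```

Running this over a validation set for the randomly initialized, transfer-learned, and Tiny YOLO variants gives directly comparable numbers.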
Procedia PDF Downloads 339
6036 Geophysical Methods of Mapping Groundwater Aquifer System: Perspectives and Inferences From Lisana Area, Western Margin of the Central Main Ethiopian Rift
Authors: Esubalew Yehualaw Melaku, Tigistu Haile Eritro
Abstract:
In this study, two basic geophysical methods are applied to map the groundwater aquifer system in the Lisana area along the Guder River, northeast of Hosanna town, near the western margin of the Central Main Ethiopian Rift. The main target of the study is to map the potential aquifer zone and investigate the groundwater potential for current and future development of the resource in the Gode area. The geophysical methods employed in this study are Vertical Electrical Sounding (VES) and magnetic survey techniques. Electrical sounding was used to examine and map the depth to the potential aquifer zone of the groundwater and its distribution over the area. A magnetic survey, on the other hand, was used to delineate contacts between lithologic units and geological structures. The 2D magnetic modeling and the geoelectric sections are used to identify weak zones, which control the groundwater flow and storage system. The geophysical survey comprises twelve VES readings collected using a Schlumberger array along six profile lines, and more than four hundred (400) magnetic readings at about 10 m station intervals along four profiles and 20 m along three random profiles. The study revealed that the potential aquifer in the area lies at depths ranging from 45 m to 92 m. This is the response of the highly weathered/fractured ignimbrite and pumice layer with sandy soil, which is the main water-bearing horizon. Overall, the neighborhoods of four VES points (VES-2, VES-3, VES-10, and VES-11) show good water-bearing zones in the study area.
Keywords: vertical electrical sounding, magnetic survey, aquifer, groundwater potential
Procedia PDF Downloads 79
6035 Exploring the Intersection Between the General Data Protection Regulation and the Artificial Intelligence Act
Authors: Maria Jędrzejczak, Patryk Pieniążek
Abstract:
The European legal reality is on the eve of significant change. In European Union law, there is talk of a “fourth industrial revolution”, driven by massive data resources linked to powerful algorithms and computing capacity. This is closely linked to technological developments in the area of artificial intelligence, which has prompted analysis covering the legal environment as well as the economic and social impact, also from an ethical perspective. The discussion on the regulation of artificial intelligence is one of the most serious and widely held, at both European Union and Member State level. The literature expects legal solutions to guarantee security for fundamental rights, including privacy, in artificial intelligence systems. There is no doubt that personal data have been increasingly processed in recent years. It would be impossible for artificial intelligence to function without processing large amounts of data (both personal and non-personal). The main driving force behind the current development of artificial intelligence is advances in computing, together with the increasing availability of data. High-quality data are crucial to the effectiveness of many artificial intelligence systems, particularly when using techniques involving model training. The use of computers and artificial intelligence technology increases the speed and efficiency of the actions taken, but it also creates security risks of unprecedented magnitude for the data processed. The proposed regulation in the field of artificial intelligence requires analysis of its impact on the regulation of personal data protection. It is necessary to determine the mutual relationship between these regulations and which areas of personal data protection regulation are particularly important for processing personal data in artificial intelligence systems.
The adopted axis of consideration is a preliminary assessment of two issues: 1) which data protection principles should be applied when processing personal data in artificial intelligence systems, and 2) how liability for personal data breaches in such systems should be regulated. The need to change the regulations regarding the rights and obligations of data subjects and of entities processing personal data cannot be excluded. It is possible that changes will be required in the provisions assigning liability for a breach of personal data protection in artificial intelligence systems. The research process in this case concerns the identification of areas in the field of personal data protection that are particularly important (and may require re-regulation) due to the introduction of the proposed legal regulation on artificial intelligence. The main question the authors want to answer is how European Union regulation against data protection breaches in artificial intelligence systems is shaping up. The answer will include examples that illustrate the practical implications of these legal regulations.
Keywords: data protection law, personal data, AI law, personal data breach
Procedia PDF Downloads 65
6034 A Neural Network Based Clustering Approach for Imputing Multivariate Values in Big Data
Authors: S. Nickolas, Shobha K.
Abstract:
The treatment of incomplete data is an important step in data pre-processing. Missing values create a noisy environment in all applications, and they are an unavoidable problem in big data management and analysis. Numerous techniques, such as discarding rows with missing values, mean imputation, expectation maximization, neural networks with evolutionary algorithms or optimized techniques, and hot deck imputation, have been introduced by researchers for handling missing data. Among these, imputation techniques play a positive role in filling missing values when it is necessary to use all records in the data rather than discard records with missing values. In this paper, we propose a novel artificial neural network based clustering algorithm, Adaptive Resonance Theory-2 (ART2), for the imputation of missing values in mixed-attribute data sets. ART2 can recognize learned models quickly and adapt to new objects rapidly. It carries out model-based clustering by using competitive learning and a self-stabilizing mechanism in a dynamic environment, without supervision. The proposed approach not only imputes the missing values but also provides information about handling outliers.
Keywords: ART2, data imputation, clustering, missing data, neural network, pre-processing
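The idea of imputing from cluster structure can be sketched without the full ART2 machinery. The Python example below is a simplified stand-in: it clusters the complete rows with a plain k-means-style loop (not ART2's resonance/vigilance mechanism) and fills each missing entry from the nearest centroid; the data, k, and seeding are illustrative assumptions.

```python
def cluster_impute(rows, k=2, iters=10):
    """Impute None values by clustering complete rows, then filling each
    incomplete row from its nearest centroid (distance over observed
    components only). A simplified stand-in for the ART2 scheme."""
    complete = [r for r in rows if None not in r]
    dims = len(rows[0])
    centroids = [list(c) for c in complete[:k]]  # naive seeding

    def dist(r, c):
        # Squared distance using only the observed components of r.
        return sum((r[i] - c[i]) ** 2 for i in range(dims) if r[i] is not None)

    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for r in complete:
            groups[min(range(k), key=lambda j: dist(r, centroids[j]))].append(r)
        for j, g in enumerate(groups):
            if g:  # recompute centroid of each non-empty group
                centroids[j] = [sum(r[i] for r in g) / len(g) for i in range(dims)]

    imputed = []
    for r in rows:
        if None in r:
            c = centroids[min(range(k), key=lambda j: dist(r, centroids[j]))]
            r = [c[i] if r[i] is None else r[i] for i in range(dims)]
        imputed.append(list(r))
    return imputed
```

ART2 would replace the fixed k and centroid update with vigilance-controlled category creation, but the impute-from-prototype step is the same.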
Procedia PDF Downloads 274
6033 Disentangling Palliative Care and Euthanasia/Assisted Suicide in Dementia Care
Authors: Michael Joseph Passmore
Abstract:
Euthanasia, or assisted suicide (EAS), refers to the provision of medical assistance to individuals seeking to end their own lives. In Canada, the issue of EAS has been the subject of debate and legislative action for many years. In 2016, the Canadian government passed the Medical Assistance in Dying (MAID) Act, which legalized EAS subject to certain eligibility criteria. In 2023, debate continues in Canada regarding the scope of MAID practice and the associated legislation. Dementia is an illness that causes suffering at the end of life. Persons suffering due to dementia deserve timely and effective palliative care.
Keywords: palliative care, neurocognitive disorder, dementia, Alzheimer’s disease, euthanasia, assisted suicide, medical ethics, bioethics
Procedia PDF Downloads 92
6032 An ERP Study of Chinese Pseudo-Object Structures
Authors: Changyin Zhou
Abstract:
The verb-argument relation is a very important aspect of syntax-semantics interaction in sentence processing. Previous ERP (event-related potential) studies in this field have mainly concentrated on the relation between the verb and its core arguments. The present study aims to reveal the ERP pattern of Chinese pseudo-object structures (SOSs), in which a peripheral argument is promoted to occupy the position of the patient object, as compared with patient object structures (POSs). The ERP data were collected while participants performed acceptability judgments on Chinese phrases. Our results show that, similar to previous studies of number-of-argument violations, Chinese SOSs elicit a bilaterally distributed N400 effect. But unlike all previous studies of verb-argument relations, Chinese SOSs demonstrate a sustained anterior positivity (SAP). This SAP, the first reported effect related to the complexity of argument structure operations, reflects the integration difficulty of the newly promoted arguments and the progressive nature of well-formedness checking in the processing of Chinese SOSs.
Keywords: Chinese pseudo-object structures, ERP, sustained anterior positivity, verb-argument relation
Procedia PDF Downloads 434
6031 Accuracy/Precision Evaluation of Excalibur I: A Neurosurgery-Specific Haptic Hand Controller
Authors: Hamidreza Hoshyarmanesh, Benjamin Durante, Alex Irwin, Sanju Lama, Kourosh Zareinia, Garnette R. Sutherland
Abstract:
This study reports on a proposed method to evaluate the accuracy and precision of Excalibur I, a neurosurgery-specific haptic hand controller designed and developed at Project neuroArm. Efficient and successful robot-assisted telesurgery is considerably contingent on how accurately and precisely a haptic hand controller (master/local robot) can interpret the kinematic indices of motion, i.e., position and orientation, from the surgeon’s upper limb to the slave/remote robot. A proposed test rig was designed and manufactured according to standard ASTM F2554-10 to determine the accuracy and precision range of Excalibur I at four different locations within its workspace: the central workspace, extreme forward, far left, and far right. The test rig was metrologically characterized by a coordinate measuring machine (accuracy and repeatability < ±5 µm). Only the serial linkage of the haptic device was examined, due to the use of the Structural Length Index (SLI). The results indicate that accuracy decreases when moving from the central area of the workspace towards its borders. In a comparative study, Excalibur I performs on par with the PHANToM Premium™ 3.0 and more accurately/precisely than the PHANToM Premium™ 1.5. The error in the Cartesian coordinate system shows a dominant component in one direction (δx, δy or δz) for movements on horizontal, vertical, and inclined surfaces. The average error magnitude of three attempts was recorded, considering all three error components. This research is the first promising step toward quantifying the kinematic performance of Excalibur I.
Keywords: accuracy, advanced metrology, hand controller, precision, robot-assisted surgery, tele-operation, workspace
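A minimal way to express the two quantities being measured: accuracy as the mean deviation of repeated end-point measurements from the commanded target, and precision as the scatter around the mean measured position. The sketch below is a simplified illustration, not the specific statistics prescribed by ASTM F2554-10, and the sample data are invented.

```python
import math

def accuracy_precision(samples, target):
    """Accuracy: mean Euclidean distance of measured positions from the
    commanded target point. Precision: RMS scatter of the measurements
    around their own mean position."""
    n = len(samples)
    dims = len(target)
    mean = [sum(s[i] for s in samples) / n for i in range(dims)]
    accuracy = sum(math.dist(s, target) for s in samples) / n
    precision = math.sqrt(sum(math.dist(s, mean) ** 2 for s in samples) / n)
    return accuracy, precision
```

Evaluating this at each of the four workspace locations (central, extreme forward, far left, far right) would reproduce the kind of location-by-location comparison the abstract reports.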
Procedia PDF Downloads 336
6030 Thermo-Mechanical Processing Scheme to Obtain Micro-Duplex Structure Favoring Superplasticity in an As-Cast and Homogenized Medium Alloyed Nickel Base Superalloy
Authors: K. Sahithya, I. Balasundar, Pritapant, T. Raghua
Abstract:
A Ni-based superalloy with a nominal composition of Ni-14% Cr-11% Co-5.8% Mo-2.4% Ti-2.4% Nb-2.8% Al-0.26% Fe-0.032% Si-0.069% C (all in wt%) is used for turbine discs in a variety of aero engines. As with any other superalloy, the primary processing of the as-cast material poses a major challenge due to its complex alloy chemistry. The challenge was circumvented by characterizing the different phases present in the material, optimizing the homogenization treatment, and identifying a suitable thermomechanical processing window using dynamic materials modeling. The as-cast material was subjected to homogenization at 1200°C for a soaking period of 8 hours and quenched using different media. Water quenching (WQ) after homogenization resulted in very fine spherical γ′ precipitates of 30-50 nm, whereas furnace cooling (FC) after homogenization resulted in a bimodal distribution of precipitates (primary gamma prime of 300 nm and secondary gamma prime of 5-10 nm). MC-type primary carbides, which are stable up to the melting point of the material, were found in both WQ and FC samples. The deformation behaviour of both materials below (1000-1100°C) and above (1100-1175°C) the gamma prime solvus was evaluated by subjecting the material to a series of compression tests at different constant true strain rates (0.0001/s to 1/s). A detailed examination of the precipitate-dislocation interaction mechanisms carried out using TEM revealed precipitate shearing and Orowan looping as the mechanisms governing deformation in WQ and FC, respectively. Incoherent/semi-coherent gamma prime precipitates in the FC material facilitate better workability, whereas the coherent precipitates in the WQ material contribute to higher resistance to deformation. Both materials exhibited discontinuous dynamic recrystallization (DDRX) above the gamma prime solvus temperature. The recrystallization kinetics was slower in the WQ material.
Very fine grain boundary carbides (≤ 300 nm) retarded the recrystallization kinetics in WQ. Coarse carbides (1-5 µm) facilitate particle-stimulated nucleation in the FC material. The FC material was cogged (primary hot working) at 1120°C and 0.03/s, resulting in significant grain refinement, i.e., from 3000 μm to 100 μm. The primary processed material was subsequently subjected to intensive thermomechanical deformation, reducing the temperature by 50°C in each processing step, with intermittent heterogenization treatments at selected temperatures aimed at the simultaneous coarsening of the gamma prime precipitates and refinement of the gamma matrix grains. The heterogeneous annealing treatment resulted in gamma grains of 10 μm and gamma prime precipitates of 1-2 μm. Further thermomechanical processing of the material was carried out at 1025°C to increase the homogeneity of the obtained micro-duplex structure.
Keywords: superalloys, dynamic material modeling, nickel alloys, dynamic recrystallization, superplasticity
Procedia PDF Downloads 121
6029 HLA-DPB1 Matching on the Outcome of Unrelated Donor Hematopoietic Stem Cell Transplantation
Authors: Shi-xia Xu, Zai-wen Zhang, Ru-xue Chen, Shan Zhou, Xiang-feng Tang
Abstract:
Objective: The clinical influence of HLA-DPB1 mismatches on the outcome of HSCT remains unclear. This is the first meta-analysis to study HLA-DPB1 matching status and clinical outcomes after unrelated donor HSCT. Methods: We searched the CIBMTR, the Cochrane Central Register of Controlled Trials (CENTRAL), and related databases (1995.01–2017.06) for all relevant articles. Comparative studies were used to investigate the effect of HLA-DPB1 locus mismatches on clinical outcomes after unrelated donor HSCT, such as disease-free survival (DFS), overall survival, GVHD, relapse, and transplant-related mortality (TRM). We performed the meta-analysis using Review Manager 5.2 software and assessed bias with funnel plots. Results: Initially, 1246 articles were retrieved, and 18 studies totaling 26368 patients were analyzed. Pooled comparisons found that the HLA-DPB1 mismatched group had a lower rate of DFS than the DPB1-matched group, and lower overall survival in non-T-cell-depleted transplantation. The DPB1 mismatched group had a higher incidence of aGVHD and of more severe (≥ grade III) aGVHD, a lower rate of relapse, and higher TRM. Moreover, compared with a 1-antigen mismatch, a 2-antigen mismatch led to a higher risk of TRM and a lower relapse rate. Conclusions: This meta-analysis indicates that HLA-DPB1 has an important influence on survival and transplant-related complications in unrelated donor HSCT, and HLA-DPB1 donor selection strategies have been proposed based on a personalized algorithm.
Keywords: human leukocyte antigen, DPB1, transplant, meta-analysis, outcome
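Pooling comparative studies as described above is commonly done with inverse-variance weighting on the log scale (for hazard, risk, or odds ratios). The snippet below is a generic fixed-effect sketch, not the Review Manager 5.2 computation itself, and the per-study values are illustrative.

```python
import math

def pooled_log_effect(effects):
    """Fixed-effect inverse-variance pooling of study-level effects.

    effects: list of (log_effect, standard_error) tuples, one per study,
    e.g., log hazard ratios for DFS. Each study is weighted by 1/SE^2.
    Returns (pooled_log_effect, pooled_se).
    """
    weights = [1.0 / se ** 2 for _, se in effects]
    pooled = sum(w * e for (e, _), w in zip(effects, weights)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se
```

Exponentiating the pooled value and its 95% interval (pooled ± 1.96 × SE) recovers the summary ratio reported in a forest plot; a random-effects model would additionally add a between-study variance term to each weight.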
Procedia PDF Downloads 298
6028 Functional Neural Network for Decision Processing: A Racing Network of Programmable Neurons Where the Operating Model Is the Network Itself
Authors: Frederic Jumelle, Kelvin So, Didan Deng
Abstract:
In this paper, we introduce a model of artificial general intelligence (AGI), the functional neural network (FNN), for modeling human decision-making processes. The FNN is composed of multiple artificial mirror neurons (AMN) racing in the network. Each AMN has a similar structure, programmed independently by the users, and is composed of an intention wheel, a motor core, and a sensory core racing at a specific velocity. The mathematics of the node’s formulation and the racing mechanism of multiple nodes in the network are discussed, and the group decision process with fuzzy logic, together with the transformation of these conceptual methods into practical methods for simulation and operation, is developed. Finally, we describe some possible future research directions in the fields of finance, education, and medicine, including the opportunity to design an intelligent learning agent with applications in AGI. We believe that the FNN has promising potential to transform the way we compute decision-making and to lead to a new generation of AI chips for seamless human-machine interactions (HMI).
Keywords: neural computing, human-machine interaction, artificial general intelligence, decision processing
Procedia PDF Downloads 125
6027 Increasing Sulfur Handling Cost Efficiency Using the Eco Sulfur Paving Block Method at PT Pertamina EP Field Cepu
Authors: Adha Bayu Wijaya, A. Zainal Abidin, Naufal Baihaqi, Joko Suprayitno, Astika Titistiti, Muslim Adi Wijaya, Endah Tri Lestari, Agung Wibowo
Abstract:
Sulfur is a non-metallic chemical element, a yellow crystalline solid with the chemical symbol S, formed by several types of natural and artificial chemical reactions. Commercial applications of processed sulfur products can be found in various aspects of life, for example, the use of processed sulfur in paving blocks. The Gundih Central Processing Plant (CPP) is capable of producing 14 tons/day of sulfur pellets. This amount comes from the high H2S content of the wells, with a total concentration of 20,000 ppm and a volume accumulation of 14 MMSCFD acid gas. H2S is converted to sulfur using thiobacillus microbes in the Biological Sulfur Recovery Unit (BSRU), with a sulfur product purity greater than 95%. In 2018, sulfur production at Gundih CPP was recorded at 4044 tons, which could potentially trigger serious environmental problems. The use of sulfur as a material for making paving blocks is an alternative solution that addresses both this potential environmental impact, as regulated by Government Regulation No. 22 of 2021 concerning the management of waste outside the hazardous and toxic substances (B3) category, and the high cost of handling sulfur through third parties. The design mix ratio of the sulfur paving blocks is 22% cement, 67% rock ash, and 11% sulfur pellets. The sulfur used in the paving mixture is pure sulfur, i.e., a side product without contaminants, thereby eliminating the potential for environmental pollution when the sulfur paving is implemented. Strength tests of the sulfur paving materials have also been confirmed by external laboratories. The production of the sulfur paving blocks follows the SNI 03-0691-1996 standard, and the resulting blocks conform to quality grade B.
Currently, sulfur paving blocks are used in building access to well locations and in public roads in the Cepu Field area as a contribution from Corporate Social Responsibility (CSR).
Keywords: sulphur, innovation, paving block, CSR, sulphur paving
Procedia PDF Downloads 75
6026 Wireless FPGA-Based Motion Controller Design by Implementing 3-Axis Linear Trajectory
Authors: Kiana Zeighami, Morteza Ozlati Moghadam
Abstract:
Designing a high-accuracy and high-precision motion controller is one of the important issues in today’s industry. Effective solutions are available in the industry, but the real-time performance, smoothness, and accuracy of the movement can be further improved. This paper discusses a complete solution to carry out the movement of three stepper motors in three dimensions. The objective is to provide a method for designing a fully integrated System-on-Chip (SoC)-based motion controller that reduces the cost and complexity of production by incorporating a Field Programmable Gate Array (FPGA) into the design. In the proposed method, the FPGA receives its commands from a host computer via wireless internet communication and calculates the motion trajectory for three axes. A profile generator module is designed to realize the interpolation algorithm by translating position data into real-time pulses. This paper discusses an approach to implementing the linear interpolation algorithm, since it is one of the fundamentals of robot movement and is highly applicable in motion control industries. Along with the full-profile trajectory, a triangular drive is implemented to eliminate error over small distances. To integrate the parallelism and real-time performance of the FPGA with the power of a Central Processing Unit (CPU) in executing complex and sequential algorithms, the NIOS II soft-core processor was added to the design. This paper presents different operating modes, such as absolute and relative positioning, reset, and velocity modes, to fulfill user requirements. The proposed approach was evaluated by designing a custom-made FPGA board along with a mechanical structure. As a result, precise and smooth movement of the stepper motors was observed, which proved the effectiveness of this approach.
Keywords: 3-axis linear interpolation, FPGA, motion controller, micro-stepping
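A profile generator for straight-line moves can be realized with an integer DDA (Bresenham-style) scheme: the longest axis pulses every tick, and each minor axis keeps an error accumulator that decides when it should pulse. The Python sketch below models the pulse stream such a module might emit; the step counts and the software model itself are illustrative assumptions, not the authors' HDL.

```python
def linear_interpolate_3axis(target):
    """Yield per-axis step pulses for a straight 3-axis move.

    target: (x, y, z) signed step counts. Each yielded triple holds
    -1, 0, or +1 per axis; over the whole move the pulses sum to target,
    and the intermediate positions stay on the straight line.
    """
    steps = [abs(t) for t in target]
    signs = [1 if t >= 0 else -1 for t in target]
    major = max(steps)  # the dominant axis sets the tick count
    error = [major // 2] * 3
    for _ in range(major):
        pulse = [0, 0, 0]
        for axis in range(3):
            error[axis] -= steps[axis]
            if error[axis] < 0:       # accumulator wrapped: emit a pulse
                error[axis] += major
                pulse[axis] = signs[axis]
        yield tuple(pulse)
```

Because the loop uses only integer subtraction, comparison, and addition, it maps naturally onto FPGA logic with one accumulator register per axis.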
Procedia PDF Downloads 208
6025 Using Analytical Hierarchy Process and TOPSIS Approaches in Designing a Finite Element Analysis Automation Program
Authors: Ming Wen, Nasim Nezamoddini
Abstract:
Sophisticated numerical simulations like finite element analysis (FEA) involve a complicated process, from model setup to post-processing tasks, that requires the replication of time-consuming steps. Utilizing an FEA automation program simplifies the complexity of the involved steps while minimizing human errors in analysis setup, calculations, and results processing. One of the main challenges in designing FEA automation programs is to identify user requirements and link them to possible design alternatives. This paper presents a decision-making framework for designing a Python-based FEA automation program for modal analysis, frequency response analysis, and random vibration fatigue (RVF) analysis procedures. The analytical hierarchy process (AHP) and the technique for order preference by similarity to ideal solution (TOPSIS) are applied to evaluate design alternatives, considering feedback received from experts and program users.
Keywords: finite element analysis, FEA, random vibration fatigue, process automation, analytical hierarchy process, AHP, TOPSIS, multiple-criteria decision-making, MCDM
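TOPSIS itself reduces to a few vector operations: normalize the decision matrix, weight it (e.g., with AHP-derived weights), and score each alternative by its closeness to the ideal and anti-ideal solutions. A minimal sketch, with invented alternatives and weights:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    matrix:  rows = alternatives, columns = criteria scores.
    weights: criterion weights summing to 1 (e.g., from AHP pairwise
             comparisons).
    benefit: True where higher is better for that criterion, else False.
    Returns a closeness score per alternative (higher = better).
    """
    # Vector-normalize each column, then apply the criterion weights.
    cols = list(zip(*matrix))
    norms = [math.sqrt(sum(v * v for v in col)) for col in cols]
    weighted = [[w * v / n for v, w, n in zip(row, weights, norms)]
                for row in matrix]
    # Ideal and anti-ideal points per criterion.
    wcols = list(zip(*weighted))
    ideal = [max(c) if b else min(c) for c, b in zip(wcols, benefit)]
    worst = [min(c) if b else max(c) for c, b in zip(wcols, benefit)]
    scores = []
    for row in weighted:
        d_pos = math.dist(row, ideal)
        d_neg = math.dist(row, worst)
        scores.append(d_neg / (d_pos + d_neg))
    return scores
```

In the framework described, the weights would come from the AHP step and the matrix rows would be the candidate program designs scored against the elicited user requirements.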
Procedia PDF Downloads 112
6024 Neural Correlates of Diminished Humor Comprehension in Schizophrenia: A Functional Magnetic Resonance Imaging Study
Authors: Przemysław Adamczyk, Mirosław Wyczesany, Aleksandra Domagalik, Artur Daren, Kamil Cepuch, Piotr Błądziński, Tadeusz Marek, Andrzej Cechnicki
Abstract:
The present study aimed to evaluate the neural correlates of the humor comprehension impairments observed in schizophrenia. To investigate the nature of this deficit and to localize the cortical areas involved in humor processing, we used functional magnetic resonance imaging (fMRI). The study included chronic schizophrenia outpatients (SCH; n=20) and sex-, age- and education-matched healthy controls (n=20). The task consisted of 60 stories (setups), of which 20 had funny, 20 nonsensical, and 20 neutral (not funny) punchlines. After the punchlines were presented, the participants were asked to indicate whether the story was comprehensible (yes/no) and how funny it was (1-9 Likert-type scale). fMRI was performed on a 3T scanner (Magnetom Skyra, Siemens) using a 32-channel head coil. Three contrasts corresponding to the three stages of humor processing were analyzed in both groups: nonsensical vs neutral stories (incongruity detection); funny vs nonsensical (incongruity resolution); funny vs neutral (elaboration). Additionally, parametric modulation analysis was performed using the two subjective ratings separately in order to further differentiate the areas involved in incongruity resolution processing. Statistical analysis of the behavioral data used the Mann-Whitney U test with Bonferroni correction; fMRI data analysis utilized whole-brain voxel-wise t-tests with a 10-voxel extent threshold, either with family-wise error (FWE) correction at alpha = 0.05 or uncorrected at alpha = 0.001. Between-group comparisons revealed that the SCH subjects had attenuated activation in: the right superior temporal gyrus for the irresolvable incongruity processing of nonsensical puns (nonsensical > neutral); the left medial frontal gyrus for the incongruity resolution processing of funny puns (funny > nonsensical); and the interhemispheric ACC for the elaboration of funny puns (funny > neutral).
Additionally, the SCH group showed weaker activation during funniness ratings in the left ventromedial prefrontal cortex, the medial frontal gyrus, the angular and supramarginal gyri, and the right temporal pole. During comprehension ratings, the SCH group showed suppressed activity in the left superior and medial frontal gyri. Interestingly, these differences were accompanied by longer reaction times for both types of rating in the SCH group, a lower level of comprehension for funny punchlines, and higher funniness ratings for absurd punchlines. The presented results indicate that, in comparison to healthy controls, schizophrenia is characterized by difficulties in humor processing, revealed by longer reaction times, impaired understanding of jokes, and finding nonsensical punchlines funnier. This is accompanied by attenuated brain activations, especially in the left fronto-parietal and right temporal cortices. Humor processing seems to be impaired at all three stages of the comprehension process, from incongruity detection through its resolution to elaboration. The neural correlates revealed diminished neural activity in the schizophrenia group as compared with the control group. The study was supported by the National Science Centre, Poland (grant no 2014/13/B/HS6/03091).
Keywords: communication skills, functional magnetic resonance imaging, humor, schizophrenia
Procedia PDF Downloads 213
6023 Design of a Graphical User Interface for Data Preprocessing and Image Segmentation Process in 2D MRI Images
Authors: Enver Kucukkulahli, Pakize Erdogmus, Kemal Polat
Abstract:
2D image segmentation is a significant process for finding suitable regions in medical images such as MRI, PET, CT, etc. In this study, we have focused on 2D MRI images for the image segmentation process. We have designed a GUI (graphical user interface) written in MATLAB for 2D MRI images. The program has two interfaces, covering data pre-processing and image clustering or segmentation. The data pre-processing section offers a median filter, an average filter, an unsharp mask filter, a Wiener filter, and a custom filter (a filter designed by the user in MATLAB). As for image clustering, there are seven different segmentation algorithms for 2D MR images: PSO (particle swarm optimization), GA (genetic algorithm), Lloyd's algorithm, k-means, the combination of Lloyd's algorithm and k-means, mean shift clustering, and finally BBO (biogeography-based optimization). To find a suitable cluster number in 2D MRI, we have designed a histogram-based cluster estimation method, and the estimated numbers are then passed to the image segmentation algorithms to cluster an image automatically. We have also selected the best hybrid method for each 2D MR image thanks to this GUI software.
Keywords: image segmentation, clustering, GUI, 2D MRI
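The automatic pipeline described, estimating a cluster count from the intensity histogram and then clustering, can be sketched in a few lines. The Python example below (a stand-alone illustration, not the MATLAB GUI code) counts histogram peaks to pick k and runs a simple 1-D k-means over pixel intensities; the bin count, data, and initialization are assumptions.

```python
def estimate_clusters(pixels, bins=16):
    """Estimate a cluster count by counting local peaks in the intensity
    histogram, mimicking a histogram-based cluster estimation step."""
    lo, hi = min(pixels), max(pixels)
    width = (hi - lo) / bins or 1.0
    hist = [0] * bins
    for p in pixels:
        hist[min(int((p - lo) / width), bins - 1)] += 1
    peaks = sum(
        1 for i in range(bins)
        if hist[i] > 0
        and (i == 0 or hist[i] >= hist[i - 1])
        and (i == bins - 1 or hist[i] > hist[i + 1])
    )
    return max(peaks, 1)

def kmeans_1d(pixels, k, iters=20):
    """Cluster grayscale intensities with 1-D k-means; returns one label
    (0..k-1) per pixel, i.e., a segmentation map for a flattened image."""
    lo, hi = min(pixels), max(pixels)
    # Spread the initial centers evenly over the intensity range.
    centers = [lo + (hi - lo) * (j + 0.5) / k for j in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in pixels:
            groups[min(range(k), key=lambda j: abs(p - centers[j]))].append(p)
        centers = [sum(g) / len(g) if g else c for g, c in zip(groups, centers)]
    return [min(range(k), key=lambda j: abs(p - centers[j])) for p in pixels]
```

Chaining the two (`kmeans_1d(pixels, estimate_clusters(pixels))`) reproduces the automatic behavior the GUI provides, with the metaheuristic alternatives (PSO, GA, BBO) substitutable for the k-means step.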
Procedia PDF Downloads 377