Search results for: medical image processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8695


6985 Protection of a Doctor’s Reputation Against the Unjustified Medical Malpractice Allegations

Authors: Anna Wszołek

Abstract:

For a very long time, the doctor-patient relationship had a paternalistic character. The events of the Second World War, as well as the rapid development of biotechnology and medicine, caused an important change in that relationship: human beings and their dignity were placed at the centre of philosophical and legal debate. The increasing frequency of clinical trials led to the emergence of bioethics, which dealt with the possibilities and boundaries of such research in relation to the individual's autonomy. Thus, there was a transformation from a paternalistic relationship to a more collaborative one, in which the patient has more room for self-determination. Today, patients are increasingly aware of their rights and of the obligations placed on doctors and the health care system, which is linked to an increase in medical malpractice claims. Unfortunately, these claims are not always justified. There is a strong concentration on the topic of the patient's good; on the other side, however, there are doctors who feel, as the example of Poland shows, that they might easily be accused and sued for medical malpractice even though they fulfilled their duties. Such a situation may have a negative impact on the quality of health care services and on patients' interests. This research presents the doctor's perspective on medical malpractice allegations. It aims to show the possible damage to a doctor's reputation caused by frivolous and weakly justified medical malpractice accusations, as well as the means to protect this reputation.

Keywords: doctor's reputation, medical malpractice, personal rights, unjustified allegations

Procedia PDF Downloads 81
6984 Determines the Continuity of Void in Underground Mine Tunnel Using Ground Penetrating Radar

Authors: Farid Adisaputra Gumilang

Abstract:

Kucing Liar Underground Mine is a future mine of PT Freeport Indonesia (PTFI) that is currently being developed. In the development process, problems were found when blasting the tunnels: overbreaks and voids occurred, caused by geological contacts or poor rock conditions. Geotechnical engineers must evaluate not only the remnant capacity of ground support systems but also investigate the depth of rock mass yield within pillars. To prevent the potential hazards caused by void zones, geotechnical engineers must ensure the planned drift is mined in the best location, where people can work safely. Ground penetrating radar (GPR) is a geophysical method that can image the subsurface. This non-destructive method uses electromagnetic radiation and detects the signals reflected from subsurface structures. The GPR survey measurements were conducted over 48 meters along a drift with poor ground conditions, using a 150 MHz antenna at several angles (roof, wall, and floor). Zones of concern are determined from the continuity of reflectors/low reflectors in the radargram section. In this paper, the data are processed using the instantaneous amplitude to identify the void zone. To obtain a good interpretation and result, the GPR data are combined with geological information and borehole camera data; the calibrated GPR data then allow the geotechnical engineer to determine a safe location to which to move the drift.
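
The instantaneous-amplitude attribute used in the abstract is conventionally computed as the envelope of the analytic signal. As an illustrative sketch (not the authors' actual processing chain), the following Python fragment computes the envelope of a synthetic 150 MHz GPR trace with an FFT-based Hilbert transform; the wavelet shape and reflector times are assumed purely for demonstration:

```python
import numpy as np

def instantaneous_amplitude(trace):
    """Envelope of a trace via the analytic signal (FFT-based Hilbert transform)."""
    n = len(trace)
    spectrum = np.fft.fft(trace)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    analytic = np.fft.ifft(spectrum * h)  # negative frequencies suppressed
    return np.abs(analytic)

# Synthetic trace: a 150 MHz wavelet reflected at two hypothetical interfaces.
t = np.linspace(0, 200e-9, 2048)          # 200 ns time window
f = 150e6                                  # antenna centre frequency
wavelet = lambda t0: np.exp(-((t - t0) / 10e-9) ** 2) * np.cos(2 * np.pi * f * (t - t0))
trace = wavelet(50e-9) + 0.4 * wavelet(120e-9)

env = instantaneous_amplitude(trace)
```

Because the envelope is insensitive to signal polarity, it highlights reflector continuity; low-reflectivity (void) zones would appear as intervals where the envelope stays weak along the drift.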

Keywords: underground mine, ground penetrating radar, reflectivity, borehole camera

Procedia PDF Downloads 61
6983 Processing Studies and Challenges Faced in Development of High-Pressure Titanium Alloy Cryogenic Gas Bottles

Authors: Bhanu Pant, Sanjay H. Upadhyay

Abstract:

Frequently, the upper stages of high-performance launch vehicles utilize cryogenic-tank-submerged pressurization gas bottles with high volume-to-weight efficiency to achieve a direct gain in satellite payload. Titanium alloys, owing to their high specific strength coupled with excellent compatibility with various fluids, are the materials of choice for these applications. Among the titanium alloys, two are suitable for cryogenic applications, namely Ti6Al4V-ELI and Ti5Al2.5Sn-ELI. The two-phase alpha-beta alloy Ti6Al4V-ELI is usable down to the LOX temperature of 90 K, while the single-phase alpha alloy Ti5Al2.5Sn-ELI can be used down to the LHe temperature of 4 K. High-pressure gas bottles submerged in LH2 (20 K) can store a greater amount of gas than bottles of the same volume submerged in LOX (90 K). Thus, the use of these alpha-alloy gas bottles at 20 K gives a distinct advantage, as fewer gas bottles are needed to store the same amount of high-pressure gas, which in turn leads to a one-to-one gain in satellite payload. A cost saving to the tune of 15,000 $/kg of weight in the upper stages, and thereby a satellite payload gain, is expected from this change. However, the processing of alpha Ti5Al2.5Sn-ELI alloy gas bottles poses challenges due to the lower forgeability of the alloy and the mode of qualification for the critical, severe application environment. The present paper describes the processing of, and the challenges/solutions during the development of, these advanced gas bottles for LH2 (20 K) applications.

Keywords: titanium alloys, cryogenic gas bottles, alpha titanium alloy, alpha-beta titanium alloy

Procedia PDF Downloads 39
6982 Global Health, Humanitarian Medical Aid, and the Ethics of Rationing

Authors: N. W. Paul, S. Michl

Abstract:

In our globalized world, we need to appreciate the fact that questions of health and justice must be addressed on a global scale, too. The way in which diverse governmental and non-governmental initiatives are trying to answer the need for humanitarian medical aid has long been a visible result of globalized responsibility. While the intention of humanitarian medical aid seems evident, the allocation of resources has increasingly become an ethical and societal challenge. With a rising number and growing dimension of humanitarian catastrophes around the globe, the search for ethically justifiable ways to decide who might benefit from limited resources has become a pressing question. Rooted in theories of justice (Rawls) and concepts of social welfare (Sen), we developed and implemented a model for an ethically sound distribution of a limited annual budget for humanitarian care in one of the largest medical universities of Germany. Based on our long-standing experience with civil casualties of war (Afghanistan) and civil war (Libya), as well as with under- and uninsured and/or stateless patients, we are now facing the ongoing refugee crisis as our most recent challenge in terms of global health and justice. Against this background, the paper strives to a) explain key issues of humanitarian medical aid in the 21st century, b) explore the problem of rationing from an ethical point of view, c) suggest a tool for the rational allocation of scarce resources in humanitarian medical aid, d) present actual cases of humanitarian care that have been managed with our toolbox, and e) discuss the international applicability of our model beyond local contexts.

Keywords: humanitarian care, medical ethics, allocation, rationing

Procedia PDF Downloads 385
6981 The Effect of Fly Ash in Dewatering of Marble Processing Wastewaters

Authors: H. A. Taner, V. Önen

Abstract:

The thermal power plants established to meet energy needs burn lignite with a low calorific value and high ash content. Burning these coals produces wastes such as fly ash, slag, and flue gas, which constitute significant economic and environmental problems. However, fly ash can find uses in various sectors. In this study, the effectiveness of fly ash in removing suspended solids from marble processing wastewater containing a high concentration of suspended solids was examined. Experiments were carried out for two different suspensions, marble and travertine. In the experiments, FeCl3, Al2(SO4)3, and the anionic polymer A130 were also used for comparison with fly ash. Coagulant/flocculant type and dosage, mixing time and speed, and pH were the experimental parameters. Performance was assessed from the change in interface height during sedimentation and from the turbidity of the treated water. The highest sedimentation efficiency was achieved with the anionic flocculant. However, it was determined that fly ash can be used instead of FeCl3 and Al2(SO4)3 as a coagulant in the travertine plant.

Keywords: dewatering, flocculant, fly ash, marble plant wastewater

Procedia PDF Downloads 140
6980 Amplifying Sine Unit-Convolutional Neural Network: An Efficient Deep Architecture for Image Classification and Feature Visualizations

Authors: Jamshaid Ul Rahman, Faiza Makhdoom, Dianchen Lu

Abstract:

Activation functions play a decisive role in determining the capacity of Deep Neural Networks (DNNs), as they enable neural networks to capture the inherent nonlinearities present in the data fed to them. Prior research on activation functions primarily focused on the utility of monotonic or non-oscillatory functions, until the Growing Cosine Unit (GCU) broke the taboo for a number of applications. In this paper, a Convolutional Neural Network (CNN) model named ASU-CNN is proposed, which utilizes the recently designed Amplifying Sine Unit (ASU) activation function across its layers. The effect of this non-monotonic and oscillatory function is inspected through feature map visualizations from different convolutional layers. The proposed network is optimized by Adam with a fine-tuned learning rate. The network achieved promising results on both training and testing data for the classification of CIFAR-10. The experimental results affirm the computational feasibility and efficacy of the proposed model for tasks in the field of computer vision.
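
To illustrate what distinguishes an oscillatory, non-monotonic activation from the usual monotonic choices, here is a minimal numpy sketch. The form f(x) = x·sin(x) is used purely as an assumed stand-in for an amplifying-sine-style unit; the exact definition of ASU is given in the paper itself:

```python
import numpy as np

def amplifying_sine_unit(x):
    # Illustrative oscillatory activation: a sine whose amplitude grows with |x|.
    # Unlike ReLU or sigmoid, this function is non-monotonic.
    return x * np.sin(x)

x = np.linspace(-6, 6, 1001)
y = amplifying_sine_unit(x)

# Count how often the slope changes sign over the interval: a monotonic
# activation would give zero sign changes, an oscillatory one gives several.
dy = np.diff(y)
sign_changes = int(np.sum(np.diff(np.sign(dy)) != 0))
```

A monotonic function such as ReLU would yield `sign_changes == 0` on this test; the oscillatory form changes slope repeatedly, which is the property the abstract credits for richer feature maps.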

Keywords: amplifying sine unit, activation function, convolutional neural networks, oscillatory activation, image classification, CIFAR-10

Procedia PDF Downloads 86
6975 IoT-Based Interactive Patient Identification and Safety Management System

Authors: Jonghoon Chun, Insung Kim, Jonghyun Lim, Gun Ro

Abstract:

We believe that it is possible to reduce patient safety accidents by displaying correct medical records and prescription information through interactive patient identification. Our system is based on smart bands worn by patients; these bands communicate with hybrid gateways that understand both the BLE and WiFi communication protocols. Through the convergence of Bluetooth Low Energy (BLE), one of the short-range wireless communication technologies, and hybrid gateway technology, we implement an 'Intelligent Patient Identification and Location Tracking System' to prevent the medical errors that frequently occur in medical institutions. Based on big data and IoT technology using MongoDB, smart bands (with BLE and NFC functions), and hybrid gateways, we develop a system that enables two-way communication between medical staff and hospitalized patients and stores the patients' location information minute by minute. Based on the precise information provided by the big data system, such as the location and movement of in-hospital patients wearing smart bands, our findings include the fact that a patient-specific location tracking algorithm can make the HIS (Hospital Information System) and other related systems operate more efficiently. Through the system, patients can always be correctly identified using identification tags. In addition, the system automatically determines whether the patient is scheduled for a medical service in the system in use at the medical institution, and presents the appropriateness of the medical treatment and the medical information (medical record and prescription information) on screen and by voice. This work was supported in part by the Korea Technology and Information Promotion Agency for SMEs (TIPA) grant funded by the Korean Small and Medium Business Administration (No. S2410390).

Keywords: BLE, hybrid gateway, patient identification, IoT, safety management, smart band

Procedia PDF Downloads 299
6978 Multivariate Analysis of Spectroscopic Data for Agriculture Applications

Authors: Asmaa M. Hussein, Amr Wassal, Ahmed Farouk Al-Sadek, A. F. Abd El-Rahman

Abstract:

In this study, a multivariate analysis of potato spectroscopic data is presented to detect the presence or absence of brown rot disease. Near-infrared (NIR) spectroscopy (1,350-2,500 nm) combined with multivariate analysis was used as a rapid, non-destructive technique for the detection of brown rot disease in potatoes. Spectral measurements were performed on 565 samples, chosen randomly at the infection site on the potato slice. In this study, 254 infected and 311 uninfected (brown-rot-free) samples were analyzed using different advanced statistical analysis techniques. The discrimination performance of different multivariate analysis techniques, including classification, pre-processing, and dimension reduction, was compared. Applying a random forest classifier with different pre-processing techniques to the raw spectra gave the best performance, achieving a total classification accuracy of 98.7% in discriminating infected potatoes from controls.
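
As a hedged illustration of the pre-processing stage in such a pipeline (the abstract does not name the specific techniques compared), Standard Normal Variate (SNV) is a common scatter correction for NIR spectra; the spectra below are synthetic:

```python
import numpy as np

def snv(spectra):
    """Standard Normal Variate: centre and scale each spectrum individually,
    removing multiplicative scatter effects common in NIR measurements."""
    mean = spectra.mean(axis=1, keepdims=True)
    std = spectra.std(axis=1, keepdims=True)
    return (spectra - mean) / std

rng = np.random.default_rng(0)
base = np.sin(np.linspace(0, np.pi, 100))  # idealized absorbance band

# Two measurements of the same material with different gain/offset (scatter):
raw = np.vstack([
    1.5 * base + 0.2 + 0.01 * rng.standard_normal(100),
    0.7 * base - 0.1 + 0.01 * rng.standard_normal(100),
])
corrected = snv(raw)
```

After SNV, the two spectra nearly coincide, so the subsequent classifier (a random forest in the paper's best pipeline) sees chemistry rather than scatter artefacts.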

Keywords: brown rot disease, NIR spectroscopy, potato, random forest

Procedia PDF Downloads 174
6977 Retrospective Study for Elective Medical Patients Evacuation of Different Diagnoses Requiring Different Approach in Oxygen Usage

Authors: Branimir Skoric

Abstract:

Over the past two decades, the number of international journeys, for both business and holiday travel, has risen significantly in the United Kingdom and worldwide. The fact that elderly people travel more than ever before has increased the need for medical evacuations (repatriations) back home when they fall ill abroad or have any kind of accident. This paper concerns the medical evacuation of patients back home to the United Kingdom (United Kingdom residents) and their specific medical needs during short-haul or long-haul commercial scheduled flights and ground transportation to the final destination, regardless of whether that is a hospital or the usual place of residence. A particular medical need during medical evacuations is oxygen supply, which can be provided via a portable oxygen concentrator, a pulse-flow oxygenator, or a continuous free-flow oxygenator, depending on the main diagnosis and the patient's comorbidities. In this retrospective study, patients were divided into two groups. One group consisted of patients suffering from cardio-respiratory diagnoses as the primary illness. The other group consisted of patients suffering from non-cardiac illnesses, including any kind of physical injury. The need for oxygen and the type of supply were carefully considered with regard to the duration of the flight and the standard airline cabin pressure, and the results are described in this retrospective study.

Keywords: commercial flight, elderly travellers, medical evacuations, oxygen

Procedia PDF Downloads 134
6976 Making Waves: Preparing the Next Generation of Bilingual Medical Doctors

Authors: Edith Esparza-Young, Ángel M. Matos, Yaritza Gonzalez, Kirthana Sugunathevan

Abstract:

Introduction: This research describes an existing medical school program which supports a multicultural setting and bilingualism. The rise of Spanish speakers in the United States has led to the recruitment of bilingual medical students who can serve the evolving demographics. This paper includes anecdotal evidence, narratives, and the latest research on the outcomes of supporting a multilingual academic experience in medical school and beyond. People in the United States will continue to need health care from physicians who have experience with multicultural competence. Physicians who are bilingual and possess effective communication skills will be in high demand. Methodologies: This research is descriptive. Through this descriptive research, the researcher describes the qualities and characteristics of the existing medical school programs, curriculum, and student services. Additionally, the researcher sheds light on the existing curriculum in the medical school and describes specific programs which serve as safety nets to support diverse populations. The method included observations of the existing program and of its implementation, specifically the Accelerated Review Program, the Language Education and Professional Communication Program, student organizations, and the Global Health Institute. Concluding Statement: This research identified and described characteristics of the medical school's program. The research explained and described the current phenomenon of this medical program, which has focused on increasing the graduation of bilingual and minority physicians. The findings are based on observations of the curriculum, programs, and student organizations, which evolve and remain innovative to stay current with student enrollment.

Keywords: bilingual, English, medicine, doctor

Procedia PDF Downloads 125
6975 Big Data Analysis with Rhipe

Authors: Byung Ho Jung, Ji Eun Shin, Dong Hoon Lim

Abstract:

Rhipe, which integrates R with the Hadoop environment, makes it possible to process and analyze massive amounts of data in a distributed processing environment. In this paper, we implemented multiple regression analysis using Rhipe with various sizes of actual data. Experimental results comparing the performance of Rhipe with the stats and biglm packages on bigmemory showed that Rhipe was faster than the other packages, owing to parallel processing in which the number of map tasks increases with the size of the data. We also compared the computing speeds of the pseudo-distributed and fully distributed modes for configuring a Hadoop cluster. The results showed that fully distributed mode was faster than pseudo-distributed mode, and that the computing speed of fully distributed mode increases with the number of data nodes.
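
The divide-and-recombine idea behind distributed regression of this kind can be sketched in plain Python: each "map task" computes the sufficient statistics X'X and X'y for its partition, and the "reduce" step sums them and solves the normal equations. This is an illustrative analogue, not Rhipe's actual R/Hadoop code:

```python
import numpy as np

def map_task(X_chunk, y_chunk):
    # Each map task emits the sufficient statistics for its data partition.
    return X_chunk.T @ X_chunk, X_chunk.T @ y_chunk

def reduce_task(partials):
    # The reduce step sums the partial statistics and solves the normal equations.
    XtX = sum(p[0] for p in partials)
    Xty = sum(p[1] for p in partials)
    return np.linalg.solve(XtX, Xty)

rng = np.random.default_rng(42)
n, p = 10_000, 3
X = np.column_stack([np.ones(n), rng.standard_normal((n, p))])
beta_true = np.array([1.0, 2.0, -0.5, 3.0])
y = X @ beta_true + 0.1 * rng.standard_normal(n)

# Split the data into "map task" partitions, as Hadoop would:
partials = [map_task(Xc, yc)
            for Xc, yc in zip(np.array_split(X, 8), np.array_split(y, 8))]
beta_hat = reduce_task(partials)
```

Because X'X and X'y decompose into sums over rows, the partitioned computation gives the same coefficients as fitting the full data at once, which is why adding map tasks speeds up the fit without changing the result.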

Keywords: big data, Hadoop, parallel regression analysis, R, Rhipe

Procedia PDF Downloads 487
6974 Temporal Progression of Episodic Memory as Function of Encoding Condition and Age: Further Investigation of Action Memory in School-Aged Children

Authors: Farzaneh Badinlou, Reza Kormi-Nouri, Monika Knopf

Abstract:

Studies of adults' episodic memory have found that enacted encoding not only improves recall performance but also speeds retrieval during the recall period. The current study focused on exploring the temporal progression of different encoding conditions in younger and older school children. 204 students from two age groups, 8 and 14 years old, participated in this study. During the study phase, we examined action encoding in two forms: participants either performed the phrases themselves (SPT) or observed the experimenter perform them (EPT); these were compared with verbal encoding, in which participants listened to verbal action phrases (VT). In the test phase, we used immediate and delayed free recall tests. We observed significant differences in memory performance as a function of age group and encoding condition in both immediate and delayed free recall tests. Moreover, the temporal progression of recall was faster in older children compared with younger ones. The interaction of age group and encoding condition was significant only in delayed recall, showing that younger children performed better in EPT whereas older children outperformed in SPT. It is proposed that the enactment effect in the form of SPT enhances item-specific processing, whereas EPT improves relational information processing, and that these differential processes are responsible for the results observed in younger and older children. The role of memory strategies and information processing methods in younger and older children was considered in this study. Moreover, the temporal progression of recall was faster for action encoding in the form of SPT and EPT than for verbal encoding in both immediate and delayed free recall, and the size of the enactment effect increased constantly throughout the recall period. The results of the present study provide further evidence that action memory is explained with an emphasis on the notions of information processing and strategic views. These results also reveal the temporal progression of recall as a new dimension of episodic memory in children.

Keywords: action memory, enactment effect, episodic memory, school-aged children, temporal progression

Procedia PDF Downloads 257
6973 Controlling the Process of a Chicken Dressing Plant through Statistical Process Control

Authors: Jasper Kevin C. Dionisio, Denise Mae M. Unsay

Abstract:

In a manufacturing firm, controlling the process ensures that optimum efficiency, productivity, and quality are achieved in the organization. An operation with no standardized procedure yields poor productivity, inefficiency, and an out-of-control process. This study focuses on controlling the small intestine processing of a chicken dressing plant through the use of Statistical Process Control (SPC). Since the operation does not employ a standard procedure and has no established standard time, the assessment of the observed times of the overall small intestine processing operation, using an X-Bar R control chart, found the process to be out of control. To solve this problem, the researchers conducted a motion and time study aiming to establish a standard procedure for the operation. The normal operator was selected using the Westinghouse rating system. Instead of utilizing the traditional motion and time study alone, the researchers used the X-Bar R control chart to determine the process average that is used for establishing the standard time. The observed times of the normal operator were recorded and plotted on the X-Bar R control chart. Out-of-control points due to assignable causes were removed, and the process average, i.e., the average time in which the normal operator completed the process, now in control and free from any outliers, was obtained. The process average was then used to determine the standard time of small intestine processing. As a recommendation, the researchers suggest implementing the established standard time, in consonance with the standard procedure adopted from the normal operator. With that recommendation, the whole operation should yield a 45.54% increase in productivity.
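
For reference, X-Bar R control limits are computed from subgroup means and ranges using tabulated Shewhart constants. The sketch below, with an assumed subgroup size of n = 5 and synthetic timing data, shows the calculation; the plant's real observations would replace the simulated times:

```python
import numpy as np

# Standard Shewhart chart constants for subgroups of size n = 5
A2, D3, D4 = 0.577, 0.0, 2.114

def xbar_r_limits(subgroups):
    """Control limits for X-bar and R charts from subgrouped observations.

    Returns (lower, centre, upper) for each chart."""
    xbar = subgroups.mean(axis=1)                        # subgroup means
    R = subgroups.max(axis=1) - subgroups.min(axis=1)    # subgroup ranges
    xbarbar, rbar = xbar.mean(), R.mean()
    return {
        "xbar": (xbarbar - A2 * rbar, xbarbar, xbarbar + A2 * rbar),
        "R": (D3 * rbar, rbar, D4 * rbar),
    }

rng = np.random.default_rng(7)
# 20 subgroups of 5 hypothetical timed observations (seconds per batch)
times = rng.normal(loc=60.0, scale=2.0, size=(20, 5))
limits = xbar_r_limits(times)

# Points outside the X-bar limits signal assignable causes to investigate
# and remove before computing the in-control process average.
out_of_control = ((times.mean(axis=1) < limits["xbar"][0]) |
                  (times.mean(axis=1) > limits["xbar"][2]))
```

After removing points flagged by `out_of_control` and recomputing, the centre line of the X-bar chart is the in-control process average used to set the standard time.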

Keywords: motion and time study, process controlling, statistical process control, X-Bar R control chart

Procedia PDF Downloads 200
6972 Influence of Optical Fluence Distribution on Photoacoustic Imaging

Authors: Mohamed K. Metwally, Sherif H. El-Gohary, Kyung Min Byun, Seung Moo Han, Soo Yeol Lee, Min Hyoung Cho, Gon Khang, Jinsung Cho, Tae-Seong Kim

Abstract:

Photoacoustic imaging (PAI) is a non-invasive and non-ionizing imaging modality that combines the absorption contrast of light with ultrasound resolution. A laser is used to deposit optical energy into a target (i.e., optical fluence). Consequently, the target temperature rises and thermal expansion occurs, which generates a PA signal. In general, most image reconstruction algorithms for PAI assume uniform fluence within the imaged object. However, it is known that the optical fluence distribution within the object is non-uniform, and this could affect the reconstruction of PA images. In this study, we investigated the influence of the optical fluence distribution on PA back-propagation imaging using the finite element method. The uniform fluence was simulated as a triangular waveform within the object of interest. The non-uniform fluence distribution was estimated by solving light propagation within a tissue model via the Monte Carlo method. The results show that the PA signal in the non-uniform case is 23% wider than in the uniform case. The frequency spectrum of the PA signal under non-uniform fluence lacks some high-frequency components present in the uniform case. Consequently, the image reconstructed with non-uniform fluence exhibits a strong smoothing effect.
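
The depth dependence of optical fluence that motivates the study can already be seen in the diffusion approximation, where fluence decays roughly as exp(-μ_eff·z). The sketch below uses assumed tissue-like optical properties, not the paper's Monte Carlo model:

```python
import numpy as np

# Assumed tissue-like optical properties (1/cm)
mu_a = 0.1         # absorption coefficient
mu_s_prime = 10.0  # reduced scattering coefficient

# Effective attenuation coefficient in the diffusion approximation
mu_eff = np.sqrt(3 * mu_a * (mu_a + mu_s_prime))

z = np.linspace(0, 2, 200)       # depth in cm
fluence = np.exp(-mu_eff * z)    # normalized fluence vs depth

# Initial PA pressure is proportional to local absorption times local fluence,
# so a deep absorber receives far less fluence than a shallow one:
shallow = fluence[z <= 0.1][-1]  # fluence near 1 mm depth
deep = fluence[z >= 1.0][0]      # fluence near 1 cm depth
```

This strong depth dependence is exactly why assuming uniform fluence distorts the reconstructed absorption map, which the paper quantifies with its FEM/Monte Carlo comparison.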

Keywords: finite element method, fluence distribution, Monte Carlo method, photoacoustic imaging

Procedia PDF Downloads 368
6971 A Control Model for the Dismantling of Industrial Plants

Authors: Florian Mach, Eric Hund, Malte Stonis

Abstract:

The dismantling of disused industrial facilities such as nuclear power plants or refineries is an enormous challenge for the planning and control of the logistic processes. Existing control models do not meet the requirements for the proper dismantling of industrial plants. Therefore, this paper presents an approach for the control of dismantling and post-processing processes (e.g., decontamination) in plant decommissioning. In contrast to existing approaches, the dismantling sequence and depth are selected depending on the capacity utilization of the required post-processing processes, while also considering individual characteristics of the respective dismantling tasks (e.g., decontamination success rate, uncertainties regarding process times). The results can be used in the dismantling of industrial plants (e.g., nuclear power plants) to reduce dismantling time and costs by avoiding bottlenecks such as capacity constraints.

Keywords: dismantling management, logistics planning and control models, nuclear power plant dismantling, reverse logistics

Procedia PDF Downloads 290
6970 Anonymity and Irreplaceability: Gross Anatomical Practices in Japanese Medical Education

Authors: Ayami Umemura

Abstract:

Without exception, all the bodies dissected in gross anatomical practice are bodies that have lived irreplaceable lives, laughing and talking with family and friends. While medical education aims to cultivate medical knowledge that is universally applicable to all human bodies, it relies on unique, irreplaceable, singular entities. In this presentation, we explore the 'irreplaceable relationship' that is cultivated between medical students and anonymous cadavers during gross anatomical practice, drawing on Emmanuel Levinas's 'ethics of the face' and Martin Buber's discussion of 'I and Thou'. Through this, we aim to present 'a different ethic' that emerges only in face-to-face relationships, and that differs from the generalized, institutionalized, mass-produced ethics seen in so-called 'ethics codes'. Since the 1990s, there has been a movement around the world to use gross anatomical practice as an 'educational tool' for medical professionalism and medical ethics, and some educational institutions have started disclosing the actual names, occupations, and birthplaces of the cadavers to medical students. These efforts have also been criticized for lacking medical calmness. In any case, the issue here is that this information is all about a past that the medical students never know directly. The critical fact that medical students build relationships from scratch and spend precious time with the cadavers without any information about their lives before death is overlooked. In gross anatomical practice, medical students are exposed to anonymous cadavers with faces, touching and feeling them. In this presentation, we examine a collection of essays on gross anatomical practice written by medical students across the country, collected by the Japanese Association for Volunteer Body Donation since 1978. There, we see the students calling out to the cadavers, feeling called out to and encouraged, superimposing the cadavers on their own immediate family, regretting the parting, and shedding tears. Medical students can then be seen addressing the dead body in the second person singular, 'you'. These behaviors reveal an irreplaceable relationship between the anonymous cadavers and the medical students. The moment they become involved in an irreplaceable relationship between 'I and you', an accidental and anonymous encounter becomes inevitable. When medical students notice that they are the inevitable recipients of voluntary and addressless gifts, they pledge to become 'good doctors' who owe a debt to these anonymous persons. This presentation aims to present 'a different ethic', based on the uniqueness and irreplaceability that come from the faces of others embedded in each context, which differs from 'routine' and 'institutionalized' ethics, and which can be realized only 'because of anonymity'.

Keywords: anonymity, irreplaceability, uniqueness, singularity, Emmanuel Levinas, Martin Buber, Alain Badiou, medical education

Procedia PDF Downloads 48
6969 Modelling and Simulation of Single Mode Optical Fiber Directional Coupler for Medical Application

Authors: Shilpa Kulkarni, Sujata Patrikar

Abstract:

A single-mode fiber directional coupler is modeled and simulated for application in the medical field. Various fiber devices based on evanescent field absorption, interferometry, couplers, resonators, tip-coated fibers, etc., have been developed so far that are suitable for medical applications. This work focuses on the possibility of sensing with a single-mode fiber directional coupler. In the present work, a fiber directional coupler is modeled to detect, optoelectronically, the changes taking place in the surrounding medium. The waveguiding characteristics of the fiber are studied in depth. The sensor is modeled and simulated by finding the photocurrent, sensitivity, and detection limit while varying various parameters of the directional coupler, and the device is optimized for the best possible output. It is found that the directional coupler shows measurable photocurrents and good sensitivity with coupling lengths in micrometers. It is thus a miniature device and, hence, suitable for medical applications.
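
The sensing principle of a lossless directional coupler can be sketched with coupled-mode theory, where the cross-port power after length L is P·sin²(κL). The bias point and perturbation below are assumed values for illustration, not the paper's optimized design:

```python
import numpy as np

def coupler_output(kappa, length, p_in=1.0):
    """Lossless two-waveguide coupled-mode power split after coupling length L."""
    p_cross = p_in * np.sin(kappa * length) ** 2    # coupled (cross) port
    p_through = p_in * np.cos(kappa * length) ** 2  # through port
    return p_through, p_cross

length_um = 500.0                  # hypothetical coupling length in micrometres
kappa0 = np.pi / (4 * length_um)   # bias at the 3-dB point, where dP/dkappa is largest

p_thru, p_cross = coupler_output(kappa0, length_um)

# A change in the surrounding medium's refractive index perturbs the coupling
# coefficient kappa; the resulting cross-port power change is what the
# photodetector converts into a measurable photocurrent change.
dkappa = 1e-5 * kappa0  # hypothetical index-induced perturbation
_, p_cross2 = coupler_output(kappa0 + dkappa, length_um)
delta = p_cross2 - p_cross
```

Biasing at the 3-dB point maximizes the power change per unit change in κ, which is one way such a coupler achieves good sensitivity at micrometre-scale coupling lengths.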

Keywords: single mode fiber directional coupler, modeling and simulation of fiber directional coupler sensor, biomolecular sensing, medical sensor device

Procedia PDF Downloads 253
6968 The Impact of Regulatory Changes on the Development of Mobile Medical Apps

Authors: M. McHugh, D. Lillis

Abstract:

Mobile applications are used to perform a wide variety of tasks in day-to-day life, ranging from checking email to controlling home heating. Application developers have recognized the potential to transform a smart device, i.e., a mobile phone or a tablet, into a medical device by using a mobile medical application. When initially conceived, these mobile medical applications performed basic functions, e.g., BMI calculation or accessing reference material; however, increasing complexity now offers clinicians and patients a wide range of functionality. As this complexity and functionality increase, so too does the potential risk associated with using such an application. Examples include applications that provide the ability to inflate and deflate blood pressure cuffs, as well as applications that use patient-specific parameters to calculate dosages or create a dosage plan for radiation therapy. If an unapproved mobile medical application is marketed by a medical device organization, the organization faces significant penalties, such as receiving an FDA warning letter to cease the prohibited activity, fines, and the possibility of a criminal conviction. Regulatory bodies have finalized guidance intended to help mobile application developers establish whether their applications are subject to regulatory scrutiny. However, regulatory controls appear to contradict the approaches taken by mobile application developers, who generally work with short development cycles and very little documentation; as such, there is the potential for these regulations to stifle further improvement. The research presented in this paper details how, by adopting development techniques such as agile software development, mobile medical application developers can meet regulatory requirements whilst still fostering innovation.

Keywords: agile, applications, FDA, medical, mobile, regulations, software engineering, standards

Procedia PDF Downloads 348
6967 Administrators' Information Management Capacity and Decision-Making Effectiveness on Staff Promotion in the Teaching Service Commissions in South-West, Nigeria

Authors: Olatunji Sabitu Alimi

Abstract:

This study investigated the extent to which administrators' information storage, retrieval and processing capacities influence decisions on staff promotion in the Teaching Service Commissions (TESCOMs) in South-West, Nigeria. One research question and two research hypotheses were formulated and tested at the 0.05 level of significance. The study used descriptive research of the survey type. One hundred (100) staff on salary grade level 09 constituted the sample. Multi-stage, stratified and simple random sampling techniques were used to select 100 staff from the TESCOMs in South-West, Nigeria. Two questionnaires, titled Administrators' Information Storage, Retrieval and Processing Capacities (AISRPC) and Staff Promotion Effectiveness (SPE), were used for data collection. The inventory was validated and subjected to test-retest, and a reliability coefficient of r = 0.79 was obtained. The data were collected and analyzed using the Pearson Product Moment Correlation coefficient and simple percentages. The study found that administrators at the TESCOMs stored their information in files, hard copies, soft copies, open registry and departmentally in varying degrees, while they also processed information manually and electronically for decision making. In addition, there is a significant relationship between administrators' information storage and retrieval capacities in the TESCOMs in South-West, Nigeria (r cal = 0.598 > r table = 0.195). Furthermore, administrators' information processing capacity and staff promotion effectiveness were found to be significantly related (r cal = 0.209 > r table = 0.195 at the 0.05 level of significance). The study recommended that training, seminars and workshops be organized for administrators on information management, and that educational organizations provide Information and Communication Technology (ICT) equipment for the administrators in the TESCOMs. The staff of TESCOM should be promoted having satisfied promotion criteria such as spending the required number of years on a grade level, a clean record of service and a vacancy.
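The significance test reported above, comparing a calculated Pearson r against a tabled critical value, can be sketched as follows. This is an illustrative snippet, not the study's code; the data are synthetic, and only the critical value 0.195 (0.05 level, n = 100) is taken from the abstract.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    x = x - x.mean()
    y = y - y.mean()
    return (x @ y) / np.sqrt((x @ x) * (y @ y))

# Illustrative data only: 100 paired scores with a built-in linear relation.
rng = np.random.default_rng(0)
storage_scores = rng.normal(size=100)
promotion_scores = 0.6 * storage_scores + rng.normal(scale=0.8, size=100)

r_cal = pearson_r(storage_scores, promotion_scores)
r_table = 0.195  # critical value at the 0.05 level for n = 100, as in the abstract
significant = abs(r_cal) > r_table
```

The decision rule is simply whether the calculated coefficient exceeds the tabled critical value for the given sample size and significance level.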

Keywords: information processing capacity, staff promotion effectiveness, teaching service commission, Nigeria

Procedia PDF Downloads 519
6966 Identifying, Reporting and Preventing Medical Errors Among Nurses Working in Critical Care Units at Kenyatta National Hospital, Kenya: Closing the Gap Between Attitude and Practice

Authors: Jared Abuga, Wesley Too

Abstract:

Medical error is the third leading cause of death in the US, with approximately 98,000 deaths occurring every year as a result of medical errors. The global financial burden of medication errors is roughly USD 42 billion. Medication errors may lead to at least one death daily and injure roughly 1.3 million people every year. Medical error reporting is essential in creating a culture of accountability in our healthcare system. Studies of healthcare workers' attitudes and practices in reporting medical errors have shown that the major factors in under-reporting include work stress and fear of medico-legal consequences following the disclosure of an error. Further, the majority believed that an increase in reporting medical errors would contribute to a better system. Most hospitals depend on nurses to discover medication errors because nurses are considered to be the sources of these errors, whether as contributors or mere observers; consequently, nurses' perceptions of medication errors and of what needs to be done are vital to reducing the incidence of medication errors. We sought to explore knowledge among nurses on medical errors and the factors affecting or hindering the reporting of medical errors among nurses working at the emergency unit, KNH. Critical care nurses face many barriers to completing incident reports on medication errors. One of these barriers, which contributes to underreporting, is a lack of education and/or knowledge regarding medication errors and the reporting process. This study, therefore, sought to determine the availability and use of reporting systems for medical errors in critical care units. It also sought to establish nurses' perceptions regarding medical errors and reporting, and to document factors facilitating timely identification and reporting of medical errors in critical care settings. 
Methods: The study used a cross-sectional design to collect data from 76 critical care nurses at Kenyatta Teaching & Research National Referral Hospital, Kenya. Data analysis is ongoing. By October 2022, the analysis, results, discussion, and recommendations of the study will be available for the conference in 2023.

Keywords: errors, medical, Kenya, nurses, safety

Procedia PDF Downloads 226
6965 A Cost Effective Approach to Develop Mid-Size Enterprise Software Adopting the Waterfall Model

Authors: Mohammad Nehal Hasnine, Md Kamrul Hasan Chayon, Md Mobasswer Rahman

Abstract:

Organizational tendencies towards computer-based information processing have been observed noticeably in developing countries. Many enterprises are taking major initiatives towards a computerized working environment because of the massive benefits of computer-based information processing. However, designing and developing information resource management software for small and mid-size enterprises under tight budgets and strict deadlines is always challenging for software engineers. Therefore, we introduce a cost-effective approach to designing mid-size enterprise software using the Waterfall model, one of the SDLC (Software Development Life Cycle) models. To fulfill the research objectives, in this study we developed mid-size enterprise software named "BSK Management System" that assists enterprise software clients with information resource management and performs complex organizational tasks. The Waterfall model phases were applied to ensure that all functions, user requirements, strategic goals, and objectives were met. In addition, a Rich Picture, Structured English, and a Data Dictionary were implemented and investigated properly in an engineering manner. Furthermore, an assessment survey with 20 participants was conducted to investigate the usability and performance of the proposed software. The survey results indicated that our system featured simple interfaces, easy operation and maintenance, quick processing, and reliable and accurate transactions.

Keywords: end-user application development, enterprise software design, information resource management, usability

Procedia PDF Downloads 423
6964 Reconstruction of Visual Stimuli Using Stable Diffusion with Text Conditioning

Authors: ShyamKrishna Kirithivasan, Shreyas Battula, Aditi Soori, Richa Ramesh, Ramamoorthy Srinath

Abstract:

The human brain, among the most complex and mysterious aspects of the body, harbors vast potential for exploration. Unraveling these enigmas, especially within neural perception and cognition, is the realm of neural decoding. This work harnesses advancements in generative AI, particularly in visual computing, to elucidate how the brain comprehends visual stimuli observed by humans. The paper endeavors to reconstruct human-perceived visual stimuli from Functional Magnetic Resonance Imaging (fMRI) data, which is processed through pre-trained deep-learning models to recreate the stimuli. A new architecture named LatentNeuroNet is introduced, with the aim of achieving the utmost semantic fidelity in stimulus reconstruction. The approach employs a Latent Diffusion Model (LDM), Stable Diffusion v1.5, emphasizing semantic accuracy and generating superior-quality outputs. This addresses the limitations of prior methods, such as GANs, which are known for poor semantic performance and inherent instability. Text conditioning within the LDM's denoising process is handled by extracting text from the brain's ventral visual cortex region. This extracted text is processed through a Bootstrapping Language-Image Pre-training (BLIP) encoder before it is injected into the denoising process. In conclusion, an architecture is developed that successfully reconstructs the perceived visual stimuli, and this research provides evidence for identifying the regions of the brain most influential in cognition and perception.

Keywords: BLIP, fMRI, latent diffusion model, neural perception

Procedia PDF Downloads 56
6963 Dematerialized Beings in Katherine Dunn's Geek Love: A Corporeal and Ethical Study under Posthumanities

Authors: Anum Javed

Abstract:

This study identifies the dynamic image of the human body as it continues its metamorphosis in the virtual field of reality. It calls attention to the ways humans are beginning to co-evolve with other life forms, technology in particular, and are striving to establish a realm outside the physical framework of matter. The problem exceeds the area of technological ethics by entering the space of literary texts and criticism. A textual analysis of Geek Love (1989) by Katherine Dunn is combined with the posthumanist perspectives of Pramod K. Nayar to examine the psycho-somatic changes in man's nature of being. It uncovers the meaning people give to their experiences in this budding social and cultural phenomenon of material representation, tied up with personal practices and technological innovations. It also undertakes an ethical, physical and psychological reassessment of man within the context of technological evolution. The study indicates the elements that have rendered morphological freedom and new materialism in man's consciousness. Moreover, this work asks what it means to be human in this time of accelerating change, where surgeries, implants, extensions, cloning and robotics have shaped a new sense of being. It attempts to go beyond the individual's body image and explores how objectifying media and culture have influenced people's judgement of others on new material grounds. It further argues for a decentring of the glorified image of man as an independent entity because of his energetic partnership with intelligent machines and external agents. The history of the future progress of technology is also discussed. The methodology adopted is posthumanist techno-ethical textual analysis. This work advocates a negotiated relationship between man and technology in order to achieve a harmonious and balanced interconnected existence. The study concludes by recommending that an ethical set of codes be cultivated for techno-human habituation. Posthumanism ushers in a strong need to adopt new ethics within the terminology of neo-materialist humanism.

Keywords: corporeality, dematerialism, human ethos, posthumanism

Procedia PDF Downloads 129
6962 Prediction of Product Size Distribution of a Vertical Stirred Mill Based on Breakage Kinetics

Authors: C. R. Danielle, S. Erik, T. Patrick, M. Hugh

Abstract:

In the last decade there has been an increase in demand for fine grinding due to the depletion of coarse-grained orebodies and an increase in the processing of finely disseminated minerals and complex orebodies. These ores have presented new challenges in concentrator design because fine and ultra-fine grinding is required to achieve acceptable recovery rates. Therefore, the correct design of a grinding circuit is important for minimizing unit costs and increasing product quality. The use of ball mills for grinding in fine size ranges is inefficient, and, therefore, vertical stirred grinding mills are becoming increasingly popular in the mineral processing industry due to their well-known high energy efficiency. This work proposes a methodology to predict the product size distribution of a vertical stirred mill using a Bond ball mill. The Population Balance Model (PBM) was used to empirically analyze the performance of a vertical mill and a Bond ball mill. The breakage parameters obtained for both grinding mills are compared to determine the possibility of predicting the product size distribution of a vertical mill based on the results obtained from the Bond ball mill. The biggest advantage of this methodology is that most mineral processing laboratories already have a Bond ball mill to perform the tests suggested in this study. Preliminary results show the possibility of predicting the performance of a laboratory vertical stirred mill using a Bond ball mill.
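The batch-grinding form of the Population Balance Model that this kind of analysis relies on can be sketched numerically. The selection function S, breakage distribution matrix B, and the three size classes below are hypothetical illustrations, not fitted parameters from the study.

```python
import numpy as np

def pbm_batch_grind(m0, S, B, t_end, dt=0.01):
    """First-order batch-grinding PBM:  dm_i/dt = -S_i m_i + sum_j B_ij S_j m_j.

    m0 : initial mass fractions per size class (coarse to fine)
    S  : selection function (breakage rates, 1/min) per class
    B  : breakage distribution matrix; B[i, j] is the fraction of broken
         class-j material that reports to class i (lower triangular)
    """
    m = m0.astype(float).copy()
    for _ in range(int(t_end / dt)):
        broken = S * m                      # mass broken out of each class
        m += dt * (B @ broken - broken)     # forward-Euler step
    return m

# Three size classes; each class breaks entirely into the next finer one,
# and the finest class does not break (S = 0), so total mass is conserved.
m0 = np.array([1.0, 0.0, 0.0])
S = np.array([0.5, 0.2, 0.0])
B = np.array([[0.0, 0.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
m_final = pbm_batch_grind(m0, S, B, t_end=10.0)
```

Fitting S and B to mill test data, then transferring them between mills, is the essence of the prediction methodology the abstract describes.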

Keywords: bond ball mill, population balance model, product size distribution, vertical stirred mill

Procedia PDF Downloads 276
6961 High-Accuracy Satellite Image Analysis and Rapid DSM Extraction for Urban Environment Evaluations (Tripoli-Libya)

Authors: Abdunaser Abduelmula, Maria Luisa M. Bastos, José A. Gonçalves

Abstract:

The modeling of the earth's surface and evaluation of urban environments with 3D models is an important research topic. New stereo capabilities of high-resolution optical satellite images, such as the tri-stereo mode of Pleiades, combined with new image matching algorithms, are now available and can be applied in urban area analysis. In addition, photogrammetry software packages have gained new, more efficient matching algorithms, such as SGM, as well as improved filters to deal with shadow areas, and can achieve denser and more precise results. This paper describes a comparison between 3D data extracted from tri-stereo and dual-stereo satellite images, combined with pixel-based matching and the Wallis filter. The aim was to improve the accuracy of 3D models, especially in urban areas, in order to assess whether satellite images are appropriate for a rapid evaluation of urban environments. The results showed that 3D models achieved with Pleiades tri-stereo outperformed, both in terms of accuracy and detail, the result obtained from a GeoEye pair. The assessment was made against reference digital surface models derived from high-resolution aerial photography. This could mean that tri-stereo images can be successfully used for the proposed urban change analyses.
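The accuracy assessment against reference DSMs can be illustrated with standard elevation-error statistics (mean error, RMSE, and the robust NMAD). This is a generic sketch on synthetic arrays, not the authors' processing chain.

```python
import numpy as np

def dsm_error_stats(dsm, reference):
    """Elevation error statistics between an extracted and a reference DSM."""
    diff = (dsm - reference).ravel()
    # NMAD: a robust spread estimate, less sensitive to matching blunders
    # than RMSE (1.4826 scales the MAD to a Gaussian-equivalent sigma).
    nmad = 1.4826 * np.median(np.abs(diff - np.median(diff)))
    return {"mean": float(diff.mean()),
            "rmse": float(np.sqrt((diff ** 2).mean())),
            "nmad": float(nmad)}

# Synthetic check: a DSM with a constant +0.5 m bias over the reference.
reference = np.zeros((10, 10))
stats = dsm_error_stats(reference + 0.5, reference)
```

A constant bias shows up in the mean and RMSE but leaves the NMAD at zero, which is why both kinds of statistics are usually reported together.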

Keywords: 3D models, environment, matching, pleiades

Procedia PDF Downloads 315
6960 Challenging Weak Central Coherence: An Exploration of Neurological Evidence from Visual Processing and Linguistic Studies in Autism Spectrum Disorder

Authors: Jessica Scher Lisa, Eric Shyman

Abstract:

Autism spectrum disorder (ASD) is a neuro-developmental disorder characterized by persistent deficits in social communication and social interaction (i.e., deficits in social-emotional reciprocity, nonverbal communicative behaviors, and establishing/maintaining social relationships), as well as by the presence of repetitive behaviors and perseverative areas of interest (i.e., stereotyped or repetitive motor movements, use of objects, or speech, rigidity, restricted interests, and hypo- or hyperreactivity to sensory input or unusual interest in sensory aspects of the environment). Additionally, a diagnosis of ASD requires the presentation of symptoms in the early developmental period, marked impairments in adaptive functioning, and a lack of explanation by general intellectual impairment or global developmental delay (although these conditions may be co-occurring). Over the past several decades, many theories have been developed in an effort to explain the root cause of ASD in terms of atypical central cognitive processes. The field of neuroscience is increasingly finding structural and functional differences between autistic and neurotypical individuals using neuro-imaging technology. One main area this research has focused upon is visuospatial processing, with specific attention to the notion of 'weak central coherence' (WCC). This paper analyzes findings from selected studies in order to explore research that challenges the 'deficit' characterization of weak central coherence theory in favor of a 'superiority' characterization of strong local coherence. The weak central coherence theory has long been both supported and refuted in the ASD literature and has most recently been increasingly challenged by advances in neuroscience. The selected studies lend evidence to the notion of amplified localized perception rather than deficient global perception. 
In other words, WCC may represent superiority in 'local processing' rather than a deficit in global processing. Additionally, the right hemisphere, and specifically the extrastriate area, appears to be key in both visual and lexicosemantic processing. Overactivity in the striate region seems to suggest inaccuracy in semantic language, which supports the link between the striate region and the atypical organization of the lexicosemantic system in ASD.

Keywords: autism spectrum disorder, neurology, visual processing, weak coherence

Procedia PDF Downloads 111
6959 Internet Memes: A Mirror of Culture and Society

Authors: Alexandra-Monica Toma

Abstract:

As the internet has become a ruling force in society, computer-mediated communication has enriched its methods of conveying meaning by combining linguistic with visual means of expressivity. One of the elements of cyberspace is what we call a meme: a succinct, visually engaging tool used to communicate ideas or emotions, usually in a funny or ironic manner. Coined by Richard Dawkins in the late 1970s to refer to cultural genes, the term now denominates a special type of vernacular language used to share content on the internet. This research aims to analyse the basic mechanism that stands at the basis of meme creation, a blend of innovation and imitation, and approaches some of the most widely used image macros remixed to generate new content, while also pointing out success strategies. Moreover, this paper discusses whether memes can transcend the light-hearted and playful mood they mirror and become biting, sharp cultural comments. The study also uses the concept of multimodality and stresses how text interacts with image, discussing three types of relations between the two: symmetry, amplification, and contradiction. We furthermore show that memes are cultural artifacts and virtual tropes highly dependent on context and societal issues, using a corpus of memes created in relation to the COVID-19 pandemic.

Keywords: context, computer-mediated communication, memes, multimodality

Procedia PDF Downloads 171
6958 A Relative Entropy Regularization Approach for Fuzzy C-Means Clustering Problem

Authors: Ouafa Amira, Jiangshe Zhang

Abstract:

Clustering is an unsupervised machine learning technique; its aim is to extract the data structure, in which similar data objects are grouped in the same cluster, whereas dissimilar objects are grouped in different clusters. Clustering methods are widely utilized in different fields, such as image processing, computer vision, and pattern recognition. Fuzzy c-means clustering (FCM) is one of the best-known fuzzy clustering methods. It is based on solving an optimization problem, in which the minimization of a given cost function is studied. This minimization aims to decrease the dissimilarity inside clusters, where the dissimilarity is measured by the distances between data objects and cluster centers. The degree of belonging of a data point to a cluster is measured by a membership function whose values lie in the interval [0, 1]. In FCM clustering, the membership degree is constrained by the condition that the sum of a data object's memberships in all clusters must equal one. This constraint can cause several problems, especially when the data objects lie in a noisy space. Regularization has been incorporated into the fuzzy c-means clustering technique: it introduces additional information in order to solve an ill-posed optimization problem. In this study, we focus on regularization by a relative entropy approach, where our optimization problem aims to minimize the dissimilarity inside clusters. Finding an appropriate membership degree for each data object is our objective, because an appropriate membership degree leads to an accurate clustering result. Our clustering results on synthetic data sets, Gaussian-based data sets, and real-world data sets show that our proposed model achieves good accuracy.
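A minimal numpy sketch of an entropy-regularized membership update of the kind described above: the entropy term yields a closed-form softmax update in place of the usual fuzzifier exponent. The data, the regularization weight lam, and the initialization are illustrative assumptions, not the authors' model.

```python
import numpy as np

def entropy_fcm(X, centers_init, lam=0.5, n_iter=100):
    """Entropy-regularized fuzzy c-means (illustrative sketch).

    Minimizes  sum_ij u_ij * d_ij^2  +  lam * sum_ij u_ij * log u_ij
    subject to each row of the membership matrix U summing to one,
    which gives the closed-form softmax update used below.
    """
    centers = centers_init.astype(float).copy()
    for _ in range(n_iter):
        # Squared distances between every point and every center.
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        # Softmax over -d2 / lam (shifted for numerical stability);
        # rows of u sum to one, satisfying the membership constraint.
        u = np.exp(-(d2 - d2.min(axis=1, keepdims=True)) / lam)
        u /= u.sum(axis=1, keepdims=True)
        # Centers are membership-weighted means of the data.
        centers = (u.T @ X) / u.sum(axis=0)[:, None]
    return u, centers

# Two well-separated Gaussian blobs; one initial center taken from each.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)), rng.normal(5.0, 0.3, (50, 2))])
u, centers = entropy_fcm(X, centers_init=X[[0, 50]])
```

Larger lam softens the memberships toward uniform, while lam close to zero approaches hard k-means assignments.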

Keywords: clustering, fuzzy c-means, regularization, relative entropy

Procedia PDF Downloads 251
6957 A Network of Nouns and Their Features: A Neurocomputational Study

Authors: Skiker Kaoutar, Mounir Maouene

Abstract:

Neuroimaging studies indicate that a large fronto-parieto-temporal network supports nouns and their features, with some areas storing semantic knowledge (visual, auditory, olfactory, gustatory, etc.), other areas storing lexical representations, and still other areas implicated in general semantic processing. However, it is not well understood how this fronto-parieto-temporal network can be modulated by different semantic tasks and different semantic relations between nouns. In this study, we combine a behavioral semantic network, functional MRI studies involving object-related nouns, and brain network studies to explain how different semantic tasks and different semantic relations between nouns can modulate the activity within the brain network of nouns and their features. We first describe how nouns and their features form a large-scale brain network. To this end, we examine the connectivities between areas recruited during the processing of nouns to establish which configurations of interacting areas are possible. We can thus identify whether, for example, brain areas that store semantic knowledge communicate via functional/structural links with areas that store lexical representations. Second, we examine how this network is modulated by different semantic tasks involving nouns, and finally, we examine how category-specific activation may result from the semantic relations among nouns. The results indicate that the brain network of nouns and their features is highly modulated and flexible across different semantic tasks and semantic relations. In the end, this study can be used as a guide to help neuroscientists interpret the pattern of fMRI activations detected in the semantic processing of nouns. Specifically, this study can help to interpret the category-specific activations observed extensively in a large number of neuroimaging and clinical studies.

Keywords: nouns, features, network, category specificity

Procedia PDF Downloads 503
6956 Phonological Processing and Its Role in Pseudo-Word Decoding in Children Learning to Read the Kannada Language Between 5.6 and 8.6 Years

Authors: Vangmayee. V. Subban, Somashekara H. S, Shwetha Prabhu, Jayashree S. Bhat

Abstract:

Introduction and Need: Phonological processing is critical in learning to read alphabetic and non-alphabetic languages. However, its role in learning to read Kannada, an alphasyllabary, is equivocal. The literature has focused on the developmental role of phonological awareness in reading. To the best of the authors' knowledge, the role of phonological memory and phonological naming has not been addressed in the alphasyllabary Kannada language. Therefore, there is a need to evaluate the comprehensive role of phonological processing skills in Kannada on word decoding skills during the early years of schooling. Aim and Objectives: The present study aimed to explore phonological processing abilities and their role in learning to decode pseudowords in children learning to read the Kannada language during the initial years of formal schooling, between 5.6 and 8.6 years. Method: In this cross-sectional study, 60 typically developing Kannada-speaking children, 20 each from Grade I, Grade II, and Grade III in the age ranges 5.6 to 6.6 years, 6.7 to 7.6 years and 7.7 to 8.6 years respectively, were selected from Kannada-medium schools. Phonological processing abilities were assessed using an assessment tool specifically developed to address the objectives of the present research. The assessment tool was content validated by subject experts and had good inter- and intra-subject reliability. Phonological awareness was assessed at the syllable level using syllable segmentation, blending, and syllable stripping at initial, medial and final positions. Phonological memory was assessed using a pseudoword repetition task, and phonological naming was assessed using rapid automatized naming of objects. Both the phonological awareness and phonological memory measures were scored for the accuracy of the response, whereas Rapid Automatized Naming (RAN) was scored for total naming speed. 
Results: The mean score comparison using one-way ANOVA revealed a significant difference (p ≤ 0.05) between the groups on all the measures of phonological awareness, pseudoword repetition, rapid automatized naming, and pseudoword reading. Subsequent post-hoc grade-wise comparison using the Bonferroni test revealed significant differences (p ≤ 0.05) between each of the grades for all the tasks except (p ≥ 0.05) syllable blending, syllable stripping, and pseudoword repetition between Grade II and Grade III. The Pearson correlations revealed highly significant positive correlations (p = 0.000) between all the variables except phonological naming, which had significant negative correlations. However, the correlation coefficients were higher for the phonological awareness measures than for the others. Hence, phonological awareness was chosen as the first independent variable to enter the hierarchical regression equation, followed by rapid automatized naming and, finally, pseudoword repetition. The regression analysis revealed syllable awareness as the single most significant predictor of pseudoword reading, explaining a unique variance of 74%, and there was no significant change in R² when RAN and pseudoword repetition were added subsequently to the regression equation. Conclusion: The present study concluded that syllable awareness matures completely by Grade II, whereas phonological memory and phonological naming continue to develop beyond Grade III. Among phonological processing skills, phonological awareness, especially syllable awareness, is more crucial for word decoding than phonological memory and naming during the initial years of schooling.
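The hierarchical regression logic described above (entering syllable awareness first, then RAN, then pseudoword repetition, and inspecting the change in R² at each step) can be sketched with ordinary least squares. The data below are synthetic stand-ins, not the study's measurements; only the entry order mirrors the abstract.

```python
import numpy as np

def r_squared(y, *predictors):
    """R^2 of an ordinary least-squares fit with an intercept term."""
    X = np.column_stack([np.ones_like(y)] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

# Synthetic stand-ins: the outcome is driven almost entirely by the
# first predictor, mirroring the pattern the study reports.
rng = np.random.default_rng(0)
n = 60
syllable_awareness = rng.normal(size=n)
ran_speed = rng.normal(size=n)
pseudoword_repetition = rng.normal(size=n)
pseudoword_reading = 0.9 * syllable_awareness + 0.1 * rng.normal(size=n)

r2_step1 = r_squared(pseudoword_reading, syllable_awareness)
r2_step2 = r_squared(pseudoword_reading, syllable_awareness, ran_speed)
r2_step3 = r_squared(pseudoword_reading, syllable_awareness, ran_speed,
                     pseudoword_repetition)
delta_r2_step2 = r2_step2 - r2_step1  # near zero when RAN adds no unique variance
```

A negligible change in R² at steps 2 and 3, as in the study, indicates that the later predictors explain no unique variance beyond the first.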

Keywords: phonological awareness, phonological memory, phonological naming, phonological processing, pseudo-word decoding

Procedia PDF Downloads 156