Search results for: Computer self-efficacy
1008 A PIM (Processor-In-Memory) for Computer Graphics: Data Partitioning and Placement Schemes
Authors: Jae Chul Cha, Sandeep K. Gupta
Abstract:
The demand for higher-performance graphics continues to grow because of the incessant desire for realism, and rapid advances in fabrication technology have enabled us to build several processor cores on a single die. Hence, it is important to develop single-chip parallel architectures for such data-intensive applications. In this paper, we propose an efficient PIM architecture tailored for computer graphics, which requires a large number of memory accesses. We then address the two important tasks necessary for maximally exploiting the parallelism provided by the architecture, namely, partitioning and placement of graphic data, which affect load balance and communication costs, respectively. Under the constraints of uniform partitioning, we develop approaches for optimal partitioning and placement, which significantly reduce the search space. We also present heuristics for identifying near-optimal placement, since the search space for placement is impractically large despite our optimization. We then demonstrate the effectiveness of our partitioning and placement approaches via analysis of example scenes; simulation results show considerable search space reductions, and our heuristics for placement perform close to optimal – the average ratio of communication overheads between our heuristics and the optimal was 1.05. Our uniform partitioning showed an average load-balance ratio of 1.47 for geometry processing and 1.44 for rasterization, which is reasonable.
Keywords: Data Partitioning and Placement, Graphics, PIM, Search Space Reduction.
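To make the partitioning and placement idea concrete, the sketch below uniformly partitions a synthetic scene into tiles and greedily places tiles on cores to balance load. This is an illustrative stand-in only: the tile grid, core count, and greedy rule are assumptions and do not reproduce the paper's optimal-placement search or its heuristics.

```python
# Illustrative sketch only: uniform partitioning of scene primitives into tiles
# and a greedy placement of tiles onto PIM cores. Tile size, core count and the
# cost model are assumptions for demonstration, not the paper's method.
import random

NUM_CORES = 8
GRID = 4  # the scene is uniformly partitioned into a GRID x GRID set of tiles

# Synthetic workload: number of primitives falling into each tile.
random.seed(0)
tile_load = {(i, j): random.randint(10, 100) for i in range(GRID) for j in range(GRID)}

# Greedy placement: assign the heaviest remaining tile to the least-loaded core,
# which tends to balance the per-core geometry/rasterization work.
core_load = [0] * NUM_CORES
placement = {}
for tile, load in sorted(tile_load.items(), key=lambda kv: -kv[1]):
    core = min(range(NUM_CORES), key=lambda c: core_load[c])
    placement[tile] = core
    core_load[core] += load

balance_ratio = max(core_load) / (sum(core_load) / NUM_CORES)
print("per-core load:", core_load)
print("load-balance ratio (max/avg): %.2f" % balance_ratio)
```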
1007 Real Time Acquisition and Analysis of Neural Response for Rehabilitative Control
Authors: Dipali Bansal, Rashima Mahajan, Shweta Singh, Dheeraj Rathee, Sujit Roy
Abstract:
Non-invasive brain-computer interfaces such as electroencephalography (EEG), which directly taps neurological signals, are being widely explored these days to connect paralytic patients and the elderly with the external environment. However, in India the research is confined to laboratory settings and is not reaching the masses for rehabilitation purposes. An attempt has been made in this paper to analyze EEG signals acquired in real time using the cost-effective and portable EMOTIV headset unit. Signal processing of the acquired EEG is done using EEGLAB in MATLAB and the EDF Browser application software platforms. The Independent Component Analysis algorithm of EEGLAB is explored to identify deliberate eye blinks in the attained neural signal. Time-frequency transforms and data statistics obtained using EEGLAB, along with the component activation results of EDF Browser, clearly indicate a voluntary eye blink in the AF3 channel. The spectral analysis indicates a dominant frequency component at 1.536 Hz, representing the delta-wave component of the EEG during the voluntary eye blink action. An algorithm is further designed to generate an active-high signal based on the deliberate eye blink that can be used for a plethora of control applications for rehabilitation.
Keywords: Brain Computer Interface, EDF Browser, EEG, EEGLab, EMOTIV, Real time Acquisition
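A minimal sketch of the final step described above, generating an active-high signal from a blink, is shown below. It is not the authors' EEGLAB/ICA pipeline: the AF3 data are synthetic, and the sampling rate and amplitude threshold are assumptions.

```python
# Minimal sketch, assuming a simple amplitude threshold on a synthetic AF3
# channel; the paper's ICA-based identification is not reproduced here.
import numpy as np

fs = 128                      # EMOTIV-class headsets sample around 128 Hz (assumed)
t = np.arange(0, 10, 1 / fs)  # 10 s of data
af3 = 10 * np.random.randn(t.size)          # background EEG (microvolts, synthetic)
for blink_at in (2.0, 5.5, 8.0):            # inject three blink-like deflections
    idx = int(blink_at * fs)
    af3[idx:idx + fs // 4] += 120 * np.hanning(fs // 4)

threshold = 80.0                             # microvolts; tuned per user in practice
active_high = (af3 > threshold).astype(int)  # 1 while a blink is detected, else 0

onsets = np.flatnonzero(np.diff(active_high) == 1) / fs
print("detected blink onsets (s):", np.round(onsets, 2))
```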
1006 Integration of Image and Patient Data, Software and International Coding Systems for Use in a Mammography Research Project
Authors: V. Balanica, W. I. D. Rae, M. Caramihai, S. Acho, C. P. Herbst
Abstract:
Mammographic image and data analysis to facilitate modelling or computer aided diagnostic (CAD) software development is best done using a common database that can handle various mammographic image file formats and relate these to other patient information. This would optimize the use of the data, as both primary reporting and enhanced information extraction of research data could be performed from the single dataset. One desired improvement is the integration of DICOM file header information into the database, as an efficient and reliable source of supplementary patient information intrinsically available in the images. The purpose of this paper was to design a suitable database to link and integrate different types of image files and gather common information that can be further used for research purposes. An interface was developed for accessing, adding, updating, modifying and extracting data from the common database, enhancing the future possible application of the data in CAD processing. Technically, future developments envisaged include the creation of an advanced search function to select image files based on descriptor combinations. Results can be further used for specific CAD processing and other research. A user-friendly configuration utility for importing the required fields from the DICOM files must still be designed.
Keywords: Database Integration, Mammogram Classification, Tumour Classification, Computer Aided Diagnosis.
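The envisaged DICOM-to-database integration might look roughly like the sketch below, assuming the pydicom package and an SQLite store; the project's actual schema, field list and file path are not specified in the abstract and are placeholders here.

```python
# Hedged sketch: read a few DICOM header fields and store them alongside the
# image path, so image files can later be selected by descriptor combinations.
import sqlite3
import pydicom

conn = sqlite3.connect("mammo_research.db")
conn.execute("""CREATE TABLE IF NOT EXISTS images (
                  path TEXT PRIMARY KEY, patient_id TEXT,
                  study_date TEXT, modality TEXT)""")

def import_dicom(path):
    """Read selected header fields from a DICOM file and upsert them."""
    ds = pydicom.dcmread(path, stop_before_pixels=True)
    conn.execute("INSERT OR REPLACE INTO images VALUES (?, ?, ?, ?)",
                 (path,
                  str(getattr(ds, "PatientID", "")),
                  str(getattr(ds, "StudyDate", "")),
                  str(getattr(ds, "Modality", ""))))
    conn.commit()

# import_dicom("case001/mammogram.dcm")  # hypothetical file path
```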
1005 Robot Control by ERPs of Brain Waves
Authors: K. T. Sun, Y. H. Tai, H. W. Yang, H. T. Lin
Abstract:
This paper presents a technique for robot control by event-related potentials (ERPs) of brain waves. Based on the proposed technique, people with severe physical disabilities can freely browse the outside world. A specific component of the ERPs, N2P3, was found and used to control the movement of the robot and the view of its camera on the designed brain-computer interface (BCI). Users were only required to watch the stimulus of the attended button on the BCI; the evoked potential of the target button, N2P3, had the greatest amplitude among all control buttons. An experimental scene was constructed in which the robot was required to walk to a specific position and move the camera view to see the instructions of the mission, and then complete the task. Twelve volunteers participated in this experiment, and the experimental results showed that the correct rate of BCI control reached 80% and the average execution time was 353 seconds for completing the mission. This research makes four main contributions: (1) finding an efficient ERP component, N2P3, for BCI control; (2) embedding the robot's viewpoint image into the user interface for robot control; (3) designing an experimental scene and conducting the experiment; and (4) evaluating the performance of the proposed system to assess its practicability.
Keywords: Brain-computer interface (BCI), event-related potentials (ERPs), robot control, severe physical disabilities.
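The selection rule implied above, picking the button whose ERP component has the greatest amplitude, can be sketched as follows. The epoch data are synthetic and the N2P3 latency window is an assumption; the paper's preprocessing and exact scoring are not reproduced.

```python
# Simplified sketch of attended-button selection by maximum ERP amplitude.
import numpy as np

n_buttons, n_trials, n_samples = 4, 20, 256
rng = np.random.default_rng(1)
epochs = rng.normal(0, 5, (n_buttons, n_trials, n_samples))
epochs[2, :, 100:140] += 8          # button 2 is the attended target (synthetic)

erp = epochs.mean(axis=1)           # trial-averaged ERP per button
window = slice(100, 140)            # assumed latency window for the N2P3-like component
scores = np.ptp(erp[:, window], axis=1)  # peak-to-peak amplitude in the window
print("selected button:", int(np.argmax(scores)))
```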
1004 The Practice of Teaching Chemistry by the Application of Online Tests
Authors: Nikolina Ribarić
Abstract:
E-learning is most commonly defined as a set of applications and processes, such as Web-based learning, computer-based learning, virtual classrooms and digital collaboration, that enable access to instructional content through a variety of electronic media. The main goal of an e-learning system is learning, and the way to evaluate the impact of an e-learning system is by examining whether students learn effectively with the help of that system. Testmoz is a program for the online preparation of knowledge evaluation assignments. The program provides teachers with computer support during the design of assignments and their evaluation. Students can review and solve assignments and also check the correctness of their solutions. Research into the increase of motivation through the practice of providing teaching content by applying online tests prepared in the Testmoz program was carried out with students of the 8th grade of Ljubo Babić Primary School in Jastrebarsko. The students took the tests in their free time, from home, an unlimited number of times. SPSS was used to process the data obtained by the research instruments. The results of the research showed that students preferred to practice teaching content, and achieved better educational results in chemistry, when they had access to online tests for repetition and practice, compared to subject content which was checked after repetition and practice in "the classical way" – i.e., solving assignments in a workbook or writing assignments on worksheets.
Keywords: Chemistry class, e-learning, online test, Testmoz.
1003 Computer-Aided Classification of Liver Lesions Using Contrasting Features Difference
Authors: Hussein Alahmer, Amr Ahmed
Abstract:
Liver cancer is one of the common diseases that cause death. Early detection is important for diagnosis and for reducing the incidence of death. Improvements in medical imaging and image processing techniques have significantly enhanced the interpretation of medical images. Computer-Aided Diagnosis (CAD) systems based on these techniques play a vital role in the early detection of liver disease and hence reduce the liver cancer death rate. This paper presents an automated CAD system consisting of three stages: first, automatic liver segmentation and lesion detection; second, feature extraction; and finally, classification of liver lesions into benign and malignant using the novel contrasting feature-difference approach. Several types of intensity and texture features are extracted from both the lesion area and its surrounding normal liver tissue. The difference between the features of the two areas is then used as the new lesion descriptor. Machine learning classifiers are then trained on the new descriptors to automatically classify liver lesions as benign or malignant. The experimental results show promising improvements. Moreover, the proposed approach can overcome the problems of varying ranges of intensity and texture between patients, demographics, and imaging devices and settings.
Keywords: CAD system, difference of feature, Fuzzy c means, Liver segmentation.
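A hedged sketch of the contrasting feature-difference idea follows: the features of the surrounding normal tissue are subtracted from the lesion features, and a classifier is trained on the difference vectors. The feature values are synthetic, and the SVM is a stand-in; the paper's exact feature set and classifiers are not reproduced.

```python
# Train a classifier on (lesion features - surrounding-tissue features).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_cases, n_features = 120, 12
lesion_feats = rng.normal(0, 1, (n_cases, n_features))
normal_feats = rng.normal(0, 1, (n_cases, n_features))
labels = rng.integers(0, 2, n_cases)          # 0 = benign, 1 = malignant
lesion_feats[labels == 1] += 0.8              # malignant lesions shifted (synthetic)

diff = lesion_feats - normal_feats            # the new lesion descriptors
clf = SVC(kernel="rbf")
print("CV accuracy: %.2f" % cross_val_score(clf, diff, labels, cv=5).mean())
```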
1002 Information Requirements for Vessel Traffic Service Operations
Authors: Fan Li, Chun-Hsien Chen, Li Pheng Khoo
Abstract:
Operators of a vessel traffic service (VTS) center provide three different types of services to vessels, namely information service, navigational assistance and traffic organization. To provide these services, operators monitor vessel traffic through a computer interface and provide navigational advice based on information integrated from multiple sources, including the automatic identification system (AIS), radar system, and closed circuit television (CCTV) system. Therefore, this information is crucial in VTS operation. However, what information VTS operators actually need to efficiently and properly offer services is unclear. The aim of this study is to investigate the information requirements for VTS operation. To achieve this aim, field observation was carried out to elicit the information requirements for VTS operation. The study revealed that the most frequent and important tasks were handling arrival vessel reports, potential conflict control and abeam vessel reports. Current location and vessel name were used in all tasks. Hazardous cargo information was particularly required when operators handled arrival vessel reports. The speed, the course, and the distance of two or several vessels were only used in potential conflict control. The information requirements identified in this study can be utilized in designing a human-computer interface that takes into consideration what information should be displayed and when, and might be further used to build the foundation of a decision support system for VTS.
Keywords: Vessel traffic service, information requirements, hierarchy task analysis, field observation.
1001 Automatic Detection of Defects in Ornamental Limestone Using Wavelets
Authors: Maria C. Proença, Marco Aniceto, Pedro N. Santos, José C. Freitas
Abstract:
A methodology based on wavelets is proposed for the automatic location and delimitation of defects in limestone plates. Natural defects include dark colored spots, crystal zones trapped in the stone, areas of abnormal contrast colors, cracks or fracture lines, and fossil patterns. Although some of these may or may not be considered defects according to the intended use of the plate, the goal is to pair each stone with a map of defects that can be overlaid on a computer display. These layers of defects constitute a database that will allow the preliminary selection of matching tiles of a particular variety, with specific dimensions, for a requirement of N square meters, to be done on a desktop computer rather than by a two-hour search in the storage park, with human operators manipulating stone plates as large as 3 m x 2 m, weighing about one ton. Accident risks and work times are reduced, with a consequent increase in productivity. The basis of the algorithm is wavelet decomposition executed on two instances of the original image, to detect both hypotheses – dark and clear defects. The existence and/or size of these defects are the gauge used to classify the quality grade of the stone products. The tuning of parameters that is possible in the framework of the wavelets corresponds to different levels of accuracy in the drawing of the contours and in the selection of the defect size, which allows the map of defects to be used to cut a selected stone into tiles with minimum waste, according to the dimensions of the defects allowed.
Keywords: Automatic detection, wavelets, defects, fracture lines.
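A minimal illustration of locating a dark defect with a wavelet decomposition is sketched below, assuming the PyWavelets package; the wavelet family, decomposition level and threshold are placeholders rather than the parameters tuned for the limestone plates.

```python
# Threshold the finest-scale wavelet detail bands to mark defect boundaries.
import numpy as np
import pywt

img = np.full((128, 128), 200.0)          # synthetic bright limestone plate
img[40:48, 60:90] = 60.0                  # a dark, fracture-like defect

coeffs = pywt.wavedec2(img, "db2", level=2)
cH, cV, cD = coeffs[-1]                   # finest-scale detail bands
detail_energy = np.abs(cH) + np.abs(cV) + np.abs(cD)
mask = detail_energy > 20.0               # assumed threshold marking defect edges

ys, xs = np.nonzero(mask)
if ys.size:
    print("defect bounding box (detail-band coordinates):",
          (ys.min(), xs.min(), ys.max(), xs.max()))
```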
1000 Hybrid Rocket Motor Performance Parameters: Theoretical and Experimental Evaluation
Authors: A. El-S. Makled, M. K. Al-Tamimi
Abstract:
A mathematical model to predict the performance parameters (thrusts, chamber pressures, fuel mass flow rates, mixture ratios, and regression rates during firing time) of a hybrid rocket motor (HRM) is evaluated. The internal ballistic (IB) hybrid combustion model assumes that the solid fuel surface regression rate is controlled only by heat transfer (convective and radiative) from the flame zone to the solid fuel burning surface. A laboratory HRM is designed, manufactured, and tested for low-thrust-profile space missions (10-15 N) and for validating the mathematical model (computer program). The polymer materials and gaseous oxidizer selected for this experimental work are polymethyl methacrylate (PMMA) and polyethylene (PE) as solid fuel grains and gaseous oxygen (GO2) as oxidizer. The variation of the various operational parameters with time is determined systematically and experimentally in firings of up to 20 seconds, and an average combustion efficiency of 95% of theory is achieved, which was the goal of these experiments. The comparison between recorded firing data and predicted analytical parameters shows good agreement, with an error that does not exceed 4.5% over the entire firing time. The current mathematical (computer) code can be used as a powerful tool for analytical HRM design.
Keywords: Hybrid combustion, internal ballistics, hybrid rocket motor, performance parameters.
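A back-of-the-envelope sketch of the kind of internal-ballistics marching loop implied above is given below, using the classic regression-rate law r = a * Gox^n. The coefficients, densities and port geometry are placeholders and not the PMMA/PE/GO2 values identified in the paper.

```python
# Illustrative time-marching of port radius, fuel flow and mixture ratio.
import math

a, n = 2.0e-5, 0.62          # regression-rate coefficients (assumed)
rho_fuel = 1180.0            # solid fuel density, kg/m^3 (PMMA-like, assumed)
mdot_ox = 0.010              # oxidizer mass flow, kg/s (assumed)
r_port, length = 0.010, 0.20 # initial port radius and grain length, m
dt, t_end = 0.1, 20.0

t = 0.0
while t < t_end:
    area_port = math.pi * r_port ** 2
    gox = mdot_ox / area_port                  # oxidizer mass flux, kg/(m^2 s)
    rdot = a * gox ** n                        # fuel surface regression rate, m/s
    mdot_fuel = rho_fuel * rdot * 2 * math.pi * r_port * length
    of_ratio = mdot_ox / mdot_fuel             # mixture ratio
    r_port += rdot * dt                        # the port opens up as fuel burns
    t += dt

print("final port radius: %.4f m, final O/F: %.2f" % (r_port, of_ratio))
```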
999 Trajectory Guided Recognition of Hand Gestures having only Global Motions
Authors: M. K. Bhuyan, P. K. Bora, D. Ghosh
Abstract:
One very interesting field of research in Pattern Recognition that has gained much attention in recent times is Gesture Recognition. In this paper, we consider a form of dynamic hand gestures that are characterized by the total movement of the hand (arm) in space. For these types of gestures, the shape of the hand (palm) during gesturing does not bear any significance. In our work, we propose a model-based method for tracking hand motion in space, thereby estimating the hand motion trajectory. We employ the dynamic time warping (DTW) algorithm for time alignment and normalization of the spatio-temporal variations that exist among samples belonging to the same gesture class. During training, one template trajectory and one prototype feature vector are generated for every gesture class. Features used in our work include static and dynamic motion trajectory features. Recognition is accomplished in two stages. In the first stage, all unlikely gesture classes are eliminated by comparing the input gesture trajectory to all the template trajectories. In the next stage, the feature vector extracted from the input gesture is compared to all the class prototype feature vectors using a distance classifier. Experimental results demonstrate that our proposed trajectory estimator and classifier are suitable for a Human Computer Interaction (HCI) platform.
Keywords: Hand gesture, human computer interaction, key video object plane, dynamic time warping.
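For reference, the DTW alignment step used above can be written in a few lines; the trajectories below are synthetic, and the feature extraction and two-stage classifier are omitted.

```python
# Self-contained sketch of dynamic time warping between two motion trajectories.
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) DTW between two sequences of points."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

template = np.column_stack((np.linspace(0, 1, 50), np.sin(np.linspace(0, 3, 50))))
gesture = np.column_stack((np.linspace(0, 1, 64), np.sin(np.linspace(0, 3, 64)) + 0.02))
print("DTW distance to template: %.3f" % dtw_distance(gesture, template))
```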
998 TheAnalyzer: Clustering-Based System for Improving Business Productivity by Analyzing User Profiles to Enhance Human-Computer Interaction
Authors: D. S. A. Nanayakkara, K. J. P. G. Perera
Abstract:
E-commerce platforms have revolutionized the shopping experience, offering convenient ways for consumers to make purchases. To improve interactions with customers and optimize marketing strategies, it is essential for businesses to understand user behavior, preferences, and needs on these platforms. This paper focuses on recommending that businesses customize interactions with users based on their behavioral patterns, leveraging data-driven analysis and machine learning techniques. Businesses can improve engagement and boost the adoption of e-commerce platforms by aligning behavioral patterns with user goals of usability and satisfaction. We propose TheAnalyzer, a clustering-based system designed to enhance business productivity by analyzing user profiles and improving human-computer interaction. TheAnalyzer seamlessly integrates with business applications, collecting relevant data points based on users' natural interactions without additional burdens such as questionnaires or surveys. It defines five key user analytics as features for its dataset, which are easily captured through users' interactions with e-commerce platforms. This research presents a study demonstrating the successful distinction of users into specific groups based on the five key analytics considered by TheAnalyzer. With the assistance of domain experts, customized business rules can be attached to each group, enabling TheAnalyzer to influence business applications and provide an enhanced personalized user experience. The outcomes are evaluated quantitatively and qualitatively, demonstrating that utilizing TheAnalyzer's capabilities can optimize business outcomes, enhance customer satisfaction, and drive sustainable growth. The findings of this research contribute to the advancement of personalized interactions in e-commerce platforms. By leveraging user behavioral patterns and analyzing both new and existing users, businesses can effectively tailor their interactions to improve customer satisfaction and loyalty, and ultimately drive sales.
Keywords: Data clustering, data standardization, dimensionality reduction, human-computer interaction, user profiling.
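A minimal pipeline in the spirit of the keywords above (standardization, dimensionality reduction, clustering) is sketched below. The synthetic five-feature dataset and the choice of five clusters are assumptions; TheAnalyzer's actual analytics and rules are not reproduced.

```python
# Standardize user analytics, reduce dimensionality, and cluster users into
# groups to which business rules could later be attached.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(7)
users = rng.normal(size=(500, 5))            # 500 users x 5 key analytics (synthetic)

X = StandardScaler().fit_transform(users)    # data standardization
X2 = PCA(n_components=2).fit_transform(X)    # dimensionality reduction
groups = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X2)

print("users per group:", np.bincount(groups))
```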
997 An E-Maintenance IoT Sensor Node Designed for Fleets of Diverse Heavy-Duty Vehicles
Authors: George Charkoftakis, Panagiotis Liosatos, Nicolas-Alexander Tatlas, Dimitrios Goustouridis, Stelios M. Potirakis
Abstract:
E-maintenance is a relatively recent concept, generally referring to maintenance management by monitoring assets over the Internet. One of the key links in the chain of an e-maintenance system is data acquisition and transmission. Specifically for the case of a fleet of heavy-duty vehicles, where the main challenge is the diversity of the vehicles and vehicle-embedded self-diagnostic/reporting technologies, the design of the data acquisition and transmission unit is a demanding task. This is clear if one takes into account that a heavy-vehicle fleet assortment may range from vehicles with only a limited number of analog sensors monitored by dashboard light indicators and gauges, to vehicles with a plethora of sensors monitored by a vehicle computer producing digital reporting. The present work proposes an adaptable internet of things (IoT) sensor node that is capable of addressing this challenge. The proposed sensor node architecture is based on the increasingly popular single-board computer and expansion board approach. In the proposed solution, the expansion boards undertake the tasks of position identification, cellular connectivity, connectivity to the vehicle computer, and connectivity to analog and digital sensors by means of a specially designed expansion board. Specifically, the latter offers a number of adaptability features to cope with the diverse sensor types employed in different vehicles. In standard mode, the IoT sensor node communicates with the data center through a cellular network, transmitting all digital/digitized sensor data, IoT device identity and position. Moreover, the proposed IoT sensor node offers connectivity, through WiFi and an appropriate application, to smartphones or tablets, allowing the registration of additional vehicle- and driver-specific information; these data are also forwarded to the data center. All control and communication tasks of the IoT sensor node are performed by dedicated firmware.
Keywords: IoT sensor nodes, e-maintenance, single-board computers, sensor expansion boards, on-board diagnostics
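The shape of a standard-mode report (identity, position and digitized readings) sent to the data center might resemble the sketch below. The endpoint URL, payload fields and use of HTTP are assumptions only; the real node transmits over a cellular link under dedicated firmware.

```python
# Conceptual telemetry upload; all names and the endpoint are hypothetical.
import json
import time
import requests

payload = {
    "device_id": "node-0042",                      # hypothetical identifier
    "timestamp": int(time.time()),
    "position": {"lat": 37.98, "lon": 23.73},      # from the positioning board
    "sensors": {"coolant_temp_C": 88.5, "oil_pressure_bar": 4.2},
}

try:
    resp = requests.post("https://datacenter.example/api/telemetry",  # placeholder URL
                         data=json.dumps(payload),
                         headers={"Content-Type": "application/json"},
                         timeout=10)
    print("server replied:", resp.status_code)
except requests.RequestException as exc:
    print("transmission failed, will retry later:", exc)
```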
996 Surface Flattening Assisted with 3D Mannequin Based On Minimum Energy
Authors: Shih-Wen Hsiao, Rong-Qi Chen, Chien-Yu Lin
Abstract:
The topic of surface flattening plays a vital role in the field of computer aided design and manufacture. Surface flattening enables the production of 2D patterns, and it can be used in design and manufacturing for developing a 3D surface onto a 2D platform, especially in fashion design. This study describes surface flattening based on minimum energy methods according to the properties of different fabrics. Firstly, through the geometric features of a 3D surface, the less transformed area can be flattened onto a 2D platform by geodesics. Then, the strain energy that has accumulated in the mesh can be stably released by an approximate implicit method and a revised error function. In some cases, cutting the mesh to further release the energy is a common way to fix the situation and enhance the accuracy of the surface flattening, and this makes the obtained 2D pattern naturally generate significant cracks. When this methodology is applied to a 3D mannequin constructed with feature lines, it enhances the level of computer-aided fashion design. Besides, when different fabrics are applied to fashion design, it is necessary to revise the shape of the 2D pattern according to the properties of the fabric. With this model, the outline of 2D patterns can be revised by distributing the strain energy, with different results according to different fabric properties. Finally, this research uses some common design cases to illustrate and verify the feasibility of this methodology.
Keywords: Surface flattening, Strain energy, Minimum energy, approximate implicit method, Fashion design.
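An extremely simplified view of the energy-release idea is sketched below: 2D vertex positions of a flattened patch are iteratively moved to restore the edge lengths measured on the 3D surface, modelled as spring rest lengths. This explicit loop is only illustrative; the paper's approximate implicit method, revised error function and fabric-dependent weighting are not reproduced.

```python
# Relax a tiny flattened 3x3 patch toward (assumed) 3D edge rest lengths.
import numpy as np

verts = np.array([[i, j] for j in range(3) for i in range(3)], dtype=float)
edges = [(a, a + 1) for a in range(9) if a % 3 != 2] + [(a, a + 3) for a in range(6)]
rest = {e: 1.1 for e in edges}            # 3D edges slightly longer than the flat start

step = 0.1
for _ in range(200):
    move = np.zeros_like(verts)
    for (a, b), L0 in rest.items():
        d = verts[b] - verts[a]
        L = np.linalg.norm(d)
        force = (L - L0) * d / L          # spring force proportional to edge strain
        move[a] += force
        move[b] -= force
    verts += step * move                  # release accumulated strain energy

energy = sum((np.linalg.norm(verts[b] - verts[a]) - L0) ** 2 for (a, b), L0 in rest.items())
print("residual strain energy: %.2e" % energy)
```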
995 Action Potential Propagation in Inhomogeneous 2D Mouse Ventricular Tissue Model
Authors:
Abstract:
Heterogeneous repolarization causes dispersion of the T-wave and has been linked to arrhythmogenesis. Such heterogeneities appear due to the differential expression of ionic currents in different regions of the heart, both in healthy and diseased animals and humans. Mice are important animals for the study of heart diseases because of the ability to create transgenic animals. We used our previously reported model of mouse ventricular myocytes to develop a 2D mouse ventricular tissue model consisting of 14,000 cells (apical or septal ventricular myocytes) and to study the stability of action potential propagation and Ca2+ dynamics. The 2D tissue model was implemented as a FORTRAN program for high-performance multiprocessor computers that runs on 36 processors. Our tissue model is able to simulate heterogeneities not only in action potential repolarization, but also in intracellular Ca2+ transients. The multicellular model reproduced experimentally observed velocities of action potential propagation and demonstrated the importance of incorporating realistic Ca2+ dynamics for action potential propagation. The simulations show that relatively sharp gradients of repolarization are predicted to exist in 2D mouse tissue models, and they are primarily determined by the cellular properties of ventricular myocytes. Abrupt local gradients of channel expression can cause alternans at longer pacing basic cycle lengths than gradual changes, and the development of alternans depends on the site of stimulation.
Keywords: Mouse, cardiac myocytes, computer simulation, action potential
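A toy excitable-tissue simulation with a regional parameter gradient (a stand-in for apical versus septal heterogeneity) is sketched below. It uses the generic FitzHugh-Nagumo model on a tiny grid, not the authors' detailed mouse ionic model, Ca2+ dynamics, or 14,000-cell FORTRAN implementation.

```python
# Explicit finite-difference stepping of a 2D excitable medium (FitzHugh-Nagumo).
import numpy as np

nx = ny = 60
v = np.zeros((ny, nx))            # membrane potential (dimensionless)
w = np.zeros((ny, nx))            # recovery variable
eps = np.where(np.arange(nx) < nx // 2, 0.02, 0.04)   # slower recovery in one half (assumed)
D, dt, dx = 0.1, 0.05, 1.0

v[:, :3] = 1.0                    # stimulate the left edge

for _ in range(2000):
    lap = (np.roll(v, 1, 0) + np.roll(v, -1, 0) +
           np.roll(v, 1, 1) + np.roll(v, -1, 1) - 4 * v) / dx**2
    dv = v * (1 - v) * (v - 0.1) - w + D * lap
    dw = eps * (0.5 * v - w)
    v += dt * dv
    w += dt * dw

print("mean potential in left/right halves: %.3f / %.3f"
      % (v[:, :nx // 2].mean(), v[:, nx // 2:].mean()))
```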
994 Buckling of Plates on Foundation with Different Types of Sides Support
Authors: Ali N. Suri, Ahmad A. Al-Makhlufi
Abstract:
In this paper, the problem of buckling of plates of finite length on a foundation and with different side supports is studied.
The Finite Strip Method is used as the tool for the analysis. This method uses finite strip elastic, foundation, and geometric matrices to build the assembly matrices for the whole structure; then, after introducing boundary conditions at the supports, the resulting reduced matrices are transformed into a standard eigenvalue-eigenvector problem. The solution of this problem enables the determination of the buckling load, the associated buckling modes and the buckling wavelength.
To carry out the buckling analysis, starting from the elastic, foundation, and geometric stiffness matrices for each strip, a FORTRAN computer program is developed.
Since the stiffness matrices are functions of the buckling wavelength, the computer program uses an iteration procedure to find the critical buckling stress for each value of the foundation modulus and for each boundary condition.
The results showed that the use of an elastic medium to support plates subject to axial load increases the buckling load a great deal; the results found are very close to those obtained by other analytical methods and experimental work.
The results also showed that the foundation compensates for the weakness of some types of side support constraint, with the maximum benefit found for a plate with one side simply supported and the other free.
Keywords: Buckling, Finite Strip, Different Sides Support, Plates on Foundation.
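The effect reported above can be cross-checked with a textbook special case: the critical buckling load of a simply supported rectangular plate on a Winkler elastic foundation, minimized over the number of axial half-waves. This closed-form scan only illustrates how the foundation raises the buckling load; the paper's finite strip matrices, iteration over wavelength, and other edge conditions are not reproduced, and the plate data are assumed.

```python
# N_cr(m) = [D*((m*pi/a)^2 + (pi/b)^2)^2 + k] / (m*pi/a)^2, minimized over m.
import math

E, nu, t = 200e9, 0.3, 0.005            # steel-like plate, 5 mm thick (assumed)
a, b = 2.0, 1.0                          # plate length and width, m (assumed)
D = E * t**3 / (12 * (1 - nu**2))        # flexural rigidity

def critical_load(k_foundation):
    """Minimum over half-wave numbers m, with one transverse half-wave."""
    best = float("inf")
    for m in range(1, 21):
        alpha = (m * math.pi / a) ** 2
        beta = (math.pi / b) ** 2
        n_cr = (D * (alpha + beta) ** 2 + k_foundation) / alpha
        best = min(best, n_cr)
    return best

for k in (0.0, 1e6, 1e7):                # foundation modulus, N/m^3
    print("k = %8.0f  ->  N_cr = %10.0f N/m" % (k, critical_load(k)))
```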
993 Computer Simulation of Low Volume Roads Made from Recycled Materials
Authors: Aleš Florian, Lenka Ševelová
Abstract:
Low volume roads are widely used all over the world. To improve their quality, computer simulation of their behavior is proposed. The FEM model enables the determination of stress and displacement conditions in the pavement and/or in particular material layers. Different variants of pavement layers, materials used, humidity, as well as loading conditions can be studied. Among others, the input information about the material properties of individual layers made from recycled materials is crucial for obtaining results that are as exact as possible. For this purpose, cyclic-load triaxial testing of materials is a promising test method. The test is able to simulate real traffic loading on particular materials, taking into account the changes in the horizontal stress conditions produced in particular layers by crossings of vehicles. Also, the test specimen can be prepared with different amounts of water. Thus, the modulus of elasticity (Young's modulus) of different materials, including recycled ones, can be measured under different conditions of horizontal and vertical stress as well as under different humidity conditions. Using the proposed testing procedure, the modulus of elasticity of recycled materials used in the newly built low volume road is obtained under different stress and humidity conditions set to standard, dry and fully saturated levels. The obtained values of the modulus of elasticity are used in the FEA.
Keywords: FEA, FEM, geotechnical materials, low volume roads, pavement, triaxial test, Young modulus.
992 Optimal Sliding Mode Controller for Knee Flexion During Walking
Authors: Gabriel Sitler, Yousef Sardahi, Asad Salem
Abstract:
This paper presents an optimal and robust sliding mode controller (SMC) to regulate the position of the knee joint angle for patients suffering from knee injuries. The controller imitates the role of active orthoses that produce the joint torques required to overcome gravity and loading forces and regain natural human movements. To this end, a mathematical model of the shank, the lower part of the leg, is derived first and then used for the control system design and computer simulations. The design of the controller is carried out in optimal and multi-objective settings. Four objectives are considered: minimization of the control effort and tracking error; and maximization of the control signal smoothness and closed-loop system’s speed of response. Optimal solutions in terms of the Pareto set and its image, the Pareto front, are obtained. The results show that there are trade-offs among the design objectives and many optimal solutions from which the decision-maker can choose to implement. Also, computer simulations conducted at different points from the Pareto set and assuming knee squat movement demonstrate competing relationships among the design goals. In addition, the proposed control algorithm shows robustness in tracking a standard gait signal when accounting for uncertainty in the shank’s parameters.
Keywords: Optimal control, multi-objective optimization, sliding mode control, wearable knee exoskeletons.
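A hedged sketch of a basic sliding mode controller for a single-link "shank" pendulum tracking a knee-angle reference is given below. The plant parameters, gains and the smoothed sign function are illustrative assumptions; the paper additionally tunes the gains by multi-objective optimization, which is not shown.

```python
# Basic SMC: sliding surface s = e_dot + lam*e, equivalent control plus a
# smoothed switching term, applied to an assumed shank model
# I*theta_dd = u - m*g*l*sin(theta).
import numpy as np

m, l, g, I = 3.0, 0.25, 9.81, 0.2      # shank mass, CoM distance, gravity, inertia (assumed)
lam, K, phi = 8.0, 40.0, 0.1           # surface slope, switching gain, boundary layer
dt, T = 2e-4, 3.0

theta, omega = 0.0, 0.0
t = np.arange(0, T, dt)
ref = 0.6 * np.sin(2 * np.pi * 0.5 * t)                 # knee flexion reference (rad)
ref_d = 0.6 * 2 * np.pi * 0.5 * np.cos(2 * np.pi * 0.5 * t)

err_log = []
for k in range(t.size):
    e = theta - ref[k]
    e_d = omega - ref_d[k]
    s = e_d + lam * e                                   # sliding surface
    u_eq = I * (-lam * e_d) + m * g * l * np.sin(theta) # gravity compensation term
    u = u_eq - K * np.tanh(s / phi)                     # smoothed switching term
    alpha = (u - m * g * l * np.sin(theta)) / I         # shank angular acceleration
    omega += alpha * dt
    theta += omega * dt
    err_log.append(abs(e))

print("mean absolute tracking error: %.4f rad" % np.mean(err_log))
```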
991 Application of Scientific Metrics to Evaluate Academic Reputation in Different Research Areas
Authors: Cristiano R. Cervi, Renata Galante, José Palazzo M. de Oliveira
Abstract:
In this paper, we address the problem of identifying the academic reputation of researchers using scientific metrics in different research areas. Due to the characteristics of each area, researchers can present different behaviors. In previous work, we defined the Rep-Index, which makes use of a profile template to individually identify the reputation of researchers. The Rep-Index is comprehensive and adaptive because it involves the whole trajectory of the researcher built throughout his or her career, and it can be used in different areas and in different contexts. Here, we compare our metric (Rep-Index) with the h-index and the g-index through experiments with researchers in the fields of Economics, Dentistry and Computer Science. We analyze the trajectories of 830 Brazilian researchers from the National Council of Technological and Scientific Development (CNPq) who receive research productivity grants. The grants are aimed at productive researchers who stand out among their peers according to scientific criteria established by CNPq. Of the 830 researchers, 210 are in the area of Economics, 216 in Dentistry and 404 in Computer Science. We apply our metric (Rep-Index) to compare the behavior of researchers in relation to their h-index and g-index through extensive experiments. The experiments showed that our metric is strongly correlated with the h-index, the g-index and the CNPq ranking, and the results support our hypothesis that the metric can be used to evaluate research in several areas.
Keywords: Researcher reputation, profile model, scientific metrics.
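For reference, the two baseline metrics the Rep-Index is compared against can be computed directly from a researcher's citation counts, as sketched below; the Rep-Index itself depends on the authors' profile template and is not reproduced here. The citation counts are synthetic.

```python
# h-index and g-index from a list of per-paper citation counts.
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    cites = sorted(citations, reverse=True)
    return sum(1 for i, c in enumerate(cites, start=1) if c >= i)

def g_index(citations):
    """Largest g such that the top g papers together have at least g^2 citations."""
    cites = sorted(citations, reverse=True)
    total, g = 0, 0
    for i, c in enumerate(cites, start=1):
        total += c
        if total >= i * i:
            g = i
    return g

papers = [45, 30, 22, 10, 9, 7, 4, 3, 1, 0]   # synthetic citation counts
print("h-index:", h_index(papers), " g-index:", g_index(papers))
```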
990 A Cascaded Fuzzy Inference System for Dynamic Online Portals Customization
Authors: Erika Martinez Ramirez, Rene V. Mayorga
Abstract:
In our modern world, more and more physical transactions are being substituted by electronic transactions (i.e., banking, shopping, and payments), and many businesses and companies are performing most of their operations through the internet. Instead of visiting physical commerce, internet visitors are now adapting to electronic commerce (e-Commerce). The ability of web users to reach products worldwide can be greatly enhanced by creating friendly and personalized online business portals. Internet visitors will return to a particular website when they can find the information they need or want easily. Dealing with this human conceptualization brings the incorporation of Artificial/Computational Intelligence techniques into the creation of customized portals. Among these techniques, Fuzzy-Set technologies can make many useful contributions to the development of such a human-centered endeavor as e-Commerce. The main objective of this paper is the implementation of a paradigm for the intelligent design and operation of human-computer interfaces. In particular, the paradigm is quite appropriate for the intelligent design and operation of software modules that display information (such as Web pages, graphical user interfaces (GUIs), and multimedia modules) on a computer screen. The human conceptualization of the user's personal information is analyzed through a Cascaded Fuzzy Inference (decision-making) System to generate the User Ascribed Qualities, which identify the user and can be used to customize portals with proper Web links.
Keywords: Fuzzy Logic, Internet, Electronic Commerce, Intelligent Portals, Electronic Shopping.
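A toy two-stage (cascaded) fuzzy inference is sketched below: a first stage turns raw browsing signals into an "engagement" score, and a second stage combines engagement with purchase history into an ascribed quality that could drive customization. The membership functions, rules and variable names are invented for illustration and are not the paper's system.

```python
# Tiny cascaded Mamdani-style inference with triangular memberships.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def stage(x, y, rules):
    """One stage: rules = [(mu_x, mu_y, output_level)], AND = min, weighted average."""
    num = den = 0.0
    for mu_x, mu_y, out in rules:
        w = min(mu_x(x), mu_y(y))      # rule firing strength
        num += w * out
        den += w
    return num / den if den else 0.0

low = lambda v: tri(v, 0, 0, 10)
high = lambda v: tri(v, 5, 15, 15)
few = lambda v: tri(v, 0, 0, 4)
many = lambda v: tri(v, 2, 6, 6)

pages_per_visit, minutes_on_site, purchases_per_month = 12.0, 9.0, 3.0
engagement = stage(pages_per_visit, minutes_on_site,
                   [(low, low, 0.2), (high, low, 0.5), (low, high, 0.5), (high, high, 0.9)])
ascribed_quality = stage(engagement * 10, purchases_per_month,
                         [(low, few, 0.3), (high, few, 0.6), (low, many, 0.6), (high, many, 0.95)])
print("engagement: %.2f  ascribed quality: %.2f" % (engagement, ascribed_quality))
```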
989 Sperm Identification Using Elliptic Model and Tail Detection
Authors: Vahid Reza Nafisi, Mohammad Hasan Moradi, Mohammad Hosain Nasr-Esfahani
Abstract:
The conventional assessment of human semen is highly subjective, with considerable intra- and inter-laboratory variability. Computer-Assisted Sperm Analysis (CASA) systems provide a rapid and automated assessment of sperm characteristics, together with improved standardization and quality control. However, the outcome of CASA systems is sensitive to the method of experimentation. While conventional CASA systems use digital microscopes with phase-contrast accessories, producing higher-contrast images, we have used raw semen samples (no staining materials) and a regular light microscope, with a digital camera attached directly to its eyepiece, to ensure cost benefits and simple assembly of the system. However, since accurately finding sperms in the semen image is the first step in the examination and analysis of the semen, any error in this step can affect the outcome of the analysis. This article introduces and explains an algorithm for finding sperms in low contrast images: first, an image enhancement algorithm is applied to remove extra particles from the image. Then, the foreground particles (including sperms and round cells) are segmented from the background. Finally, based on certain features and criteria, sperms are separated from other cells.
Keywords: Computer-Assisted Sperm Analysis (CASA), Sperm identification, Tail detection, Elliptic shape model.
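A minimal OpenCV sketch of the segmentation and elliptic-shape step follows, assuming OpenCV 4's findContours signature: bright blobs are segmented from a synthetic low-contrast frame and kept only if the fitted ellipse has plausible head-like proportions. The enhancement step, tail detection and the paper's exact acceptance criteria are not reproduced.

```python
# Segment blobs with Otsu thresholding and filter them by fitted-ellipse shape.
import cv2
import numpy as np

frame = np.full((200, 200), 90, np.uint8)                      # dim background
cv2.ellipse(frame, (100, 100), (10, 6), 30, 0, 360, 160, -1)   # one head-like blob

blur = cv2.GaussianBlur(frame, (5, 5), 0)
_, mask = cv2.threshold(blur, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

heads = []
for cnt in contours:
    if len(cnt) < 5:                     # fitEllipse needs at least 5 points
        continue
    (cx, cy), (ax1, ax2), angle = cv2.fitEllipse(cnt)
    ratio = max(ax1, ax2) / max(min(ax1, ax2), 1e-6)
    if 1.2 < ratio < 2.5 and 5 < max(ax1, ax2) < 40:   # assumed head-shape limits
        heads.append((cx, cy))

print("candidate sperm heads at:", [(round(x), round(y)) for x, y in heads])
```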
988 The Perception on 21st Century Skills of Nursing Instructors and Nursing Students at Boromarajonani College of Nursing, Chonburi
Authors: Kamolrat Turner, Somporn Rakkwamsuk, Ladda Leungratanamart
Abstract:
The aim of this descriptive study was to determine the perception of 21st century skills among nursing instructors and nursing students at Boromarajonani College of Nursing, Chonburi. A total of 38 nursing instructors and 75 second-year nursing students took part in the study. Data were collected by a 21st century skills questionnaire comprising 63 items. Descriptive statistics were used to describe the findings. The results showed that the overall mean scores of the nursing instructors' perception of 21st century skills were at a high level. The highest mean scores were recorded for computing and ICT literacy, and for career and learning skills. The lowest mean scores were recorded for reading, writing and mathematics. The overall mean scores of the nursing students' perception of 21st century skills were also at a high level. Their highest mean scores were recorded for computing and ICT literacy, in which the highest item mean score was for competency in computer programs. The lowest mean scores were recorded for the reading, writing, and mathematics components, in which the highest item mean score was for reading Thai correctly and the lowest item mean score was for reading English and translating it to others correctly. The findings from this study show that the perceptions of the nursing instructors were consistent with those of the nursing students. Moreover, activities aiming to raise capacity in English reading and in translating information for others should be taken into consideration.
Keywords: 21st century skills, perception, nursing instructor, nursing student.
987 Memorabilia of Suan Sunandha through Interactive User Interface
Authors: Nalinee Sophatsathit
Abstract:
The objectives of the Memorabilia of Suan Sunandha project are to develop a general knowledge presentation about the historical royal garden through an interactive graphic simulation technique and to employ high-functionality context in enhancing interactive user navigation. The approach infers non-intrusive display of relevant history in response to situational context. The user's navigation runs through the virtual reality campus, consisting of new and restored buildings. A flash-back presentation of information pertaining to the history, in the form of photos, paintings, and textual descriptions, is displayed alongside each passing-by building. To keep the presentation lively, graphical simulation is created as a serendipity game play so that the user can both learn and enjoy the educational tour. The benefits of this human-computer interaction development are twofold. First, a lively presentation technique and situational context modeling are developed that entail a usable paradigm of knowledge and information presentation combinations. Second, cost-effective training and promotion for both internal personnel and public visitors to learn and keep informed of this historical royal garden can be furnished without the need for a dedicated public relations service. Future improvement on graphic simulation and ability-based display can extend this work to be more realistic, user-friendly, and informative for all.
Keywords: Interactive user navigation, high-functionality context, situational context, human-computer interaction.
986 A Cost Effective Approach to Develop Mid-size Enterprise Software Adopted the Waterfall Model
Authors: M. N. Hasnine, M. K. H. Chayon, M. M. Rahman
Abstract:
Organizational tendencies towards computer-based information processing have been observed noticeably in third-world countries. Many enterprises are taking major initiatives towards a computerized working environment because of the massive benefits of computer-based information processing. However, designing and developing information resource management software for small and mid-size enterprises under budget constraints and strict deadlines is always challenging for software engineers. Therefore, we introduce an approach to design mid-size enterprise software by using the Waterfall model, which is one of the SDLC (Software Development Life Cycle) models, in a cost effective way. To fulfill the research objectives, in this study we developed mid-sized enterprise software named "BSK Management System" that assists enterprise software clients with information resource management and performs complex organizational tasks. The Waterfall model phases have been applied to ensure that all functions, user requirements, strategic goals, and objectives are met. In addition, Rich Pictures, Structured English, and a Data Dictionary have been implemented and investigated properly in an engineering manner. Furthermore, an assessment survey with 20 participants was conducted to investigate the usability and performance of the proposed software. The survey results indicated that our system features simple interfaces, easy operation and maintenance, quick processing, and reliable and accurate transactions.
Keywords: End-user Application Development, Enterprise Software Design, Information Resource Management, Usability.
985 Computer Countenanced Diagnosis of Skin Nodule Detection and Histogram Augmentation: Extracting System for Skin Cancer
Authors: S. Zith Dey Babu, S. Kour, S. Verma, C. Verma, V. Pathania, A. Agrawal, V. Chaudhary, A. Manoj Puthur, R. Goyal, A. Pal, T. Danti Dey, A. Kumar, K. Wadhwa, O. Ved
Abstract:
Background: Skin cancer is now a pressing concern in the field of medical science, and the disease is drastically affecting the health and well-being of the global village. Methods: The extracted image of the skin tumor cannot be used directly for diagnosis. The stored image contains anomalies, such as around the center. This approach locates the foreground of the extracted skin appearance. Image partitioning models have been presented to sort out the disturbance in the picture. Results: After partitioning is completed, feature extraction is performed using a genetic algorithm (GA), and finally classification can be performed between the trained and test data to evaluate a large set of images, which helps doctors make the right prediction. To improve on the existing system, we have set our objectives with an analysis. The efficiency of the natural selection process and the enriched histogram is essential in that respect. GA is applied with its accuracy to reduce the false-positive rate. Conclusions: The objective of this work is to improve effectiveness. GA accomplishes its task well, bringing down the false-positive rate. The paper combines deep learning and medical image processing, which provides superior accuracy. Proportional types of handling create reusability without any errors.
Keywords: Computer-aided system, detection, image segmentation, morphology.
984 Applicability of Overhangs for Energy Saving in Existing High-Rise Housing in Different Climates
Authors: Qiong He, S. Thomas Ng
Abstract:
Upgrading the thermal performance of the building envelope of existing residential buildings is an effective way to reduce heat gain or heat loss. The overhang is a common solution for building envelope improvement, as it can cut down solar heat gain and thereby reduce the energy used for space cooling in summer time. Despite that, an overhang can increase the demand for indoor heating in winter due to its lowering of the solar heat gain. Obviously, overhangs have different impacts on energy use in different climatic zones, which have different energy demands. To evaluate the impact of overhang devices on building energy performance under the different climates of China, an energy analysis model is built in a computer-based simulation program known as DesignBuilder, based on the data of a typical high-rise residential building. The energy simulation results show that a single overhang is able to cut down around 5% of the energy consumption of the case building in the stand-alone situation, or about 2% when the building is surrounded by other buildings, in regions which predominantly rely on space cooling, though it makes no contribution to energy reduction in the cold region. In regions with hot summers and cold winters, adding overhangs over windows can cut down around 4% and 1.8% of energy use with and without adjoining buildings, respectively. The results indicate that the overhang might not be an effective shading device for reducing energy consumption in the mixed climate or cold regions.
Keywords: Overhang, energy analysis, computer-based simulation, high-rise residential building, mutual shading, climate.
983 Intelligent Assistive Methods for Diagnosis of Rheumatoid Arthritis Using Histogram Smoothing and Feature Extraction of Bone Images
Authors: SP. Chokkalingam, K. Komathy
Abstract:
Advances in the field of image processing envision a new era of evaluation techniques and application of procedures in various different fields. One such field being considered is the biomedical field, for the prognosis as well as diagnosis of diseases. Although this plethora of methods provides a wide range of options to select from, it also causes confusion in selecting the apt process and in finding which one is more suitable. Our objective is to use a series of techniques on bone scans so as to detect the occurrence of rheumatoid arthritis (RA) as accurately as possible. Amongst other techniques existing in the field, our proposed system tends to be more effective as it depends on new methodologies that have been proven to be better and more consistent than others. Computer-aided diagnosis provides a more accurate and infallible rate of consistency that helps to improve the efficiency of the system. The image first undergoes histogram smoothing and specification, a morphing operation, boundary detection by an edge-following algorithm, and finally image subtraction to determine the presence of rheumatoid arthritis in a more efficient and effective way. Using preprocessing, noise is removed from the images; using segmentation, the region of interest is found; and histogram smoothing is applied to a specific portion of the images. Gray level co-occurrence matrix (GLCM) features such as mean, median, energy, correlation, and bone mineral density (BMD) are then extracted. After all the features are found, they are stored in the database. This dataset is trained with inflamed and non-inflamed values; with the help of a neural network, all new images are checked properly for their status, and rough sets are implemented for further reduction.
Keywords: Computer Aided Diagnosis, Edge Detection, Histogram Smoothing, Rheumatoid Arthritis.
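The texture-feature step can be sketched with scikit-image's gray level co-occurrence matrix utilities (function names as in scikit-image 0.19 and later). The bone region of interest here is synthetic, and bone mineral density obviously cannot be derived from it; only the GLCM-style features are illustrated.

```python
# Extract a few GLCM texture features plus simple intensity statistics.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(3)
bone_roi = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)   # stand-in ROI

glcm = graycomatrix(bone_roi, distances=[1], angles=[0, np.pi / 2],
                    levels=256, symmetric=True, normed=True)

features = {prop: float(graycoprops(glcm, prop).mean())
            for prop in ("contrast", "correlation", "energy", "homogeneity")}
features["mean_intensity"] = float(bone_roi.mean())
features["median_intensity"] = float(np.median(bone_roi))
print(features)
```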
982 Hardiness vs Alienation Personality Construct Essentially Explains Burnout Proclivity and Erroneous Computer Entry Problems in Rural Hellenic Hospital Labs
Authors: Angela–M. Paleologou, Aphrodite Dellaporta
Abstract:
Erroneous computer entry problems [here: 'e-errors'] in hospital labs threaten the patients–health carers relationship, undermining the health system's credibility. Are e-errors random, and do lab professionals make them accidentally, or may they be traced to meaningful determinants? Theories on the internal causality of mistakes compel us to seek specific causal ascriptions of hospital lab e-errors instead of accepting some inescapability. Undeniably, 'To Err is Human'. But in view of rapid global health organizational changes, e-errors are too expensive to lack in-depth consideration. Yet, whether that e-function might supposedly be entrenched in the health carers' job description remains under dispute – at least for Hellenic labs, where e-use falls behind generalized(able) appreciation and application. In this study: i) an empirical basis for a truly high annual cost of e-errors, at about €498,000.00 per rural Hellenic hospital, was established, hence interest in exploring the issue was sufficiently substantiated; ii) a sample of 270 lab-expert nurses, technicians and doctors were assessed on several personality, burnout and e-error measures, and iii) the hypothesis that the Hardiness vs Alienation personality construct disposition explains resistance vs proclivity to e-errors was tested and verified: Hardiness operates as a source of resilience in the encounter of high pressures experienced in the hospital lab, whereas its 'opposite', i.e., Alienation, functions as a predictor not only of making e-errors, but also of leading to burnout. Implications for apt interventions are discussed.
Keywords: Hospital lab, personality hardiness/alienation, e-errors' cost, burnout.
981 Rotation Invariant Fusion of Partial Image Parts in Vista Creation using Missing View Regeneration
Authors: H. B. Kekre, Sudeep D. Thepade
Abstract:
The automatic construction of large, high-resolution image vistas (mosaics) is an active area of research in the fields of photogrammetry [1,2], computer vision [1,4], medical image processing [4], computer graphics [3] and biometrics [8]. Image stitching is one of the possible options to obtain image mosaics. Vista creation in image processing is used to construct an image with a larger field of view than could be obtained with a single photograph. It refers to transforming and stitching multiple images into a new aggregate image without any visible seam or distortion in the overlapping areas. The vista creation process aligns two partial images over each other and blends them together. Image mosaics allow one to compensate for differences in viewing geometry. Thus, they can be used to simplify tasks by simulating the condition in which the scene is viewed from a fixed position with a single camera. While obtaining partial images, geometric anomalies like rotation and scaling are bound to happen. To nullify the effect of rotation of partial images on the process of vista creation, we propose a rotation-invariant vista creation algorithm in this paper. Rotation of partial image parts in the proposed method of vista creation may introduce some missing regions in the vista. To correct this error, that is, to fill the missing region, we have further used an image inpainting method on the created vista. This missing view regeneration method also overcomes the problem of missing views [31] in the vista due to cropping, irregular boundaries of partial image parts and errors in digitization [35]. The method of missing view regeneration generates the missing view of the vista using the information present in the vista itself.
Keywords: Vista, Overlap Estimation, Rotation Invariance, Missing View Regeneration.
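A rough sketch of the rotation-invariant alignment step is given below, assuming OpenCV 4: ORB keypoints are matched between two synthetic partial images, a homography absorbs the relative rotation, and the second part is warped onto the first. This is a generic feature-based stitcher, not the paper's method, and the subsequent missing view regeneration (for example via cv2.inpaint on the unfilled pixels) is not shown.

```python
# Align a rotated partial image to a reference part and compose a small vista.
import cv2
import numpy as np

# Build a synthetic scene and two overlapping partial views (second one rotated).
rng = np.random.default_rng(0)
scene = np.zeros((240, 400), np.uint8)
for _ in range(80):
    x, y = int(rng.integers(10, 390)), int(rng.integers(10, 230))
    cv2.circle(scene, (x, y), int(rng.integers(3, 9)), int(rng.integers(60, 255)), -1)
part1 = scene[:, :260]
rot = cv2.getRotationMatrix2D((200, 120), 3, 1.0)
part2 = cv2.warpAffine(scene, rot, (400, 240))[:, 140:]

# Match ORB features and estimate the homography that absorbs the rotation.
orb = cv2.ORB_create(1500)
k1, d1 = orb.detectAndCompute(part1, None)
k2, d2 = orb.detectAndCompute(part2, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
matches = sorted(matches, key=lambda m: m.distance)[:120]
src = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

# Warp the second part into the first part's frame and paste the first on top.
vista = cv2.warpPerspective(part2, H, (scene.shape[1] + 60, scene.shape[0]))
mask = part1 > 0
vista[:, :part1.shape[1]][mask] = part1[mask]
cv2.imwrite("vista.png", vista)
print("vista size:", vista.shape)
```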
980 A Programming Assessment Software Artefact Enhanced with the Help of Learners
Authors: Romeo A. Botes, Imelda Smit
Abstract:
The demands of an ever-changing and complex higher education environment, along with the profile of modern learners, challenge current approaches to assessment and feedback. More learners enter the education system every year. The younger generation expects immediate feedback. At the same time, feedback should be meaningful. The assessment of practical activities in programming poses a particular problem, since both lecturers and learners in the information and computer science discipline acknowledge that paper-based assessment for programming subjects lacks meaningful real-life testing. At the same time, feedback lacks promptness, consistency, comprehensiveness and individualisation. Most of these aspects may be addressed by modern, technology-assisted assessment. The focus of this paper is the continuous development of an artefact that is used to assist the lecturer in the assessment of and feedback on practical programming activities in a senior database programming class. The artefact was developed using three Design Science Research cycles. The first implementation allowed one programming activity submission per assessment intervention. This pilot provided valuable insight into the obstacles regarding the implementation of this type of assessment tool. A second implementation improved the initial version to allow multiple programming activity submissions per assessment. The focus of this version is on providing scaffolded feedback to the learner – allowing improvement with each subsequent submission. It also has a built-in capability to provide the lecturer with information regarding the key problem areas of each assessment intervention.
Keywords: Programming, computer-aided assessment, technology-assisted assessment, programming assessment software, design science research, mixed-method.
979 Central Finite Volume Methods Applied in Relativistic Magnetohydrodynamics: Applications in Disks and Jets
Authors: Raphael de Oliveira Garcia, Samuel Rocha de Oliveira
Abstract:
We have developed a new computer program in Fortran 90 in order to obtain numerical solutions of a system of relativistic magnetohydrodynamics partial differential equations with predetermined gravitation (GRMHD), capable of simulating the formation of relativistic jets from the accretion disk of matter up to its ejection. Initially, we carried out a study of one-dimensional finite volume numerical methods, namely the Lax-Friedrichs, Lax-Wendroff and Nessyahu-Tadmor methods and Godunov-type methods dependent on Riemann problems, applied to the Euler equations, in order to verify their main features and make comparisons among those methods. We then implemented the central finite volume method of Nessyahu-Tadmor, a numerical scheme whose formulation is free of Riemann problem solvers and of dimensional splitting, even in two or more spatial dimensions, and applied it to the GRMHD equations. Finally, with the Nessyahu-Tadmor method it was possible to obtain stable numerical solutions - without spurious oscillations or excessive dissipation - of the magnetized accretion disk rotating around a central Schwarzschild black hole (BH) immersed in a magnetosphere, with the ejection of matter in the form of a jet over a distance of fourteen times the radius of the BH, a record for astrophysical simulations of this kind. In our simulations we also managed to obtain jet substructures. A great advantage is that, with our code, we could simulate the GRMHD equations on a simple personal computer.
Keywords: Finite Volume Methods, Central Schemes, Fortran 90, Relativistic Astrophysics, Jet.
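A one-dimensional illustration of the Riemann-solver-free central scheme of Nessyahu and Tadmor is sketched below, applied to the inviscid Burgers equation rather than the GRMHD system and written in Python rather than Fortran 90. Two staggered half-steps return the solution to the original grid; the grid size, CFL number and minmod limiter follow common practice and are assumptions of this sketch.

```python
# Nessyahu-Tadmor staggered central scheme for u_t + (u^2/2)_x = 0 (periodic).
import numpy as np

def minmod(a, b):
    return np.where(a * b > 0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def flux(u):
    return 0.5 * u * u

def nt_step(u, lam):
    """One NT step from grid x_j to the staggered grid x_{j+1/2}."""
    du = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)        # limited slope of u
    f = flux(u)
    df = minmod(f - np.roll(f, 1), np.roll(f, -1) - f)        # limited slope of f
    u_mid = u - 0.5 * lam * df                                 # predictor at t^{n+1/2}
    f_mid = flux(u_mid)
    return (0.5 * (u + np.roll(u, -1)) + 0.125 * (du - np.roll(du, -1))
            - lam * (np.roll(f_mid, -1) - f_mid))

n, cfl = 400, 0.4
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
dx = x[1] - x[0]
u = 1.0 + 0.5 * np.sin(x)                   # smooth data steepening into a shock

t, t_end = 0.0, 1.5
while t < t_end:
    dt = cfl * dx / np.max(np.abs(u))
    lam = dt / dx
    u = nt_step(u, lam)                     # to the staggered grid
    u = np.roll(nt_step(u, lam), 1)         # back to the original grid
    t += 2 * dt

print("min/max of u at t=%.2f: %.3f / %.3f" % (t, u.min(), u.max()))
```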