Search results for: behavioral-physical and visual methods
15501 Lessons Learned from Interlaboratory Noise Modelling in Scope of Environmental Impact Assessments in Slovenia
Abstract:
Noise assessment methods are regularly used within Environmental Impact Assessments to predict the expected noise emissions of planned projects, and different assessment methods can be used. In recent years, we had the opportunity to collaborate in several noise assessment procedures in which noise assessments by different laboratories were performed simultaneously, and we identified significant differences between the results obtained by laboratories in Slovenia. We conclude that, although good georeferenced input data for setting up acoustic models exist in Slovenia, there is no clear consensus on predictive noise methods for planned projects. We analyzed the input data, methods and results of predictive noise modelling for two planned industrial projects, each carried out independently by two laboratories. We also analyzed the data, methods and results of two interlaboratory collaborative noise models for two existing noise sources (a railway and a motorway). In the predictive cases, the acoustic models were validated by noise measurements of existing sources in the surroundings, but the measurement durations varied, the acoustic characteristics of existing buildings were not described identically, and the planned noise sources were described and digitized differently. Differences in noise assessment results between laboratories ranged up to 10 dBA, which considerably exceeds the acceptable uncertainty range of 3 to 6 dBA. In contrast to predictive noise modelling, in the collaborative modelling of the two existing noise sources the possibility of performing validation noise measurements greatly increased the comparability of the modelling results: for both the existing motorway and the railway, the results of the different laboratories were comparable.
Differences in noise modelling results between laboratories were below 5 dBA, the acceptable uncertainty set by the interlaboratory noise modelling organizer. The lessons learned were: 1) predictive noise calculation using the formulae of the international standard SIST ISO 9613-2:1997 is not, on its own, an appropriate method for predicting the noise emissions of planned projects, since the complexity of the procedure means the formulae are rarely applied strictly; 2) noise measurements are an important tool for minimizing noise assessment errors for planned projects and, in predictive modelling, should be performed at least to validate the acoustic model; 3) national guidelines should be established on appropriate data, methods, noise source digitization, acoustic model validation, etc., in order to unify predictive noise models and their results within Environmental Impact Assessments for planned projects.
Keywords: environmental noise assessment, predictive noise modelling, spatial planning, noise measurements, national guidelines
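The divergence noted in lesson 1 is easier to see with the standard's basic propagation terms in hand. The sketch below is a minimal, illustrative reduction of the ISO 9613-2 point-source formula to geometric divergence and atmospheric absorption only (ground effect, barriers and reflections omitted); the absorption coefficient is a rough mid-frequency placeholder, not a value from the standard's tables.

```python
import math

def predicted_spl(lw_db, distance_m, alpha_db_per_km=3.7):
    """Reduced ISO 9613-2-style point-source prediction:
    Lp = Lw - A_div - A_atm, with
    A_div = 20*log10(d / d0) + 11 dB  (geometric divergence, d0 = 1 m) and
    A_atm = alpha * d / 1000          (atmospheric absorption, alpha in dB/km).
    Ground effect, barriers and reflections are omitted."""
    a_div = 20.0 * math.log10(distance_m / 1.0) + 11.0
    a_atm = alpha_db_per_km * distance_m / 1000.0
    return lw_db - a_div - a_atm

# Two laboratories digitizing the same source differently, e.g. placing the
# receiver at 100 m vs. 150 m, already diverge by several dB:
lp_100 = predicted_spl(100.0, 100.0)   # sound power level Lw = 100 dB
lp_150 = predicted_spl(100.0, 150.0)
```

Even this stripped-down form shows how differing source digitization alone moves results by more than the 3 dBA lower bound of the acceptable uncertainty range.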
Procedia PDF Downloads 234
15500 An Educational Electronic Health Record with a Configurable User Interface
Authors: Floriane Shala, Evangeline Wagner, Yichun Zhao
Abstract:
Background: Proper educational training and support are proven to be major components of EHR (Electronic Health Record) implementation and use. However, the majority of health providers are not sufficiently trained in EHR use, leading to adverse events, errors, and decreased quality of care. In response to this, students studying Health Information Science, Public Health, Nursing, and Medicine should all gain a thorough understanding of EHR use at different levels for different purposes. The design of a usable and safe EHR system that accommodates the needs and workflows of different users, user groups, and disciplines is required for EHR learning to be efficient and effective. Objectives: This project builds several artifacts which seek to address both the educational and usability aspects of an educational EHR. The artifacts proposed are models for and examples of such an EHR with a configurable UI to be learned by students who need a background in EHR use during their degrees. Methods: Review literature and gather professional opinions from domain experts on usability, the use of workflow patterns, UI configurability and design, and the educational aspect of EHR use. Conduct interviews in a semi-casual virtual setting with open discussion in order to gain a deeper understanding of the principal aspects of EHR use in educational settings. Select a specific task and user group to illustrate how the proposed solution will function based on the current research. Develop three artifacts based on the available research, professional opinions, and prior knowledge of the topic. The artifacts capture the user task and user’s interactions with the EHR for learning. The first generic model provides a general understanding of the EHR system process. The second model is a specific example of performing the task of MRI ordering with a configurable UI. The third artifact includes UI mock-ups showcasing the models in a practical and visual way. 
Significance: Due to the lack of educational EHRs, medical professionals do not receive sufficient EHR training. Implementing an educational EHR with a usable and configurable interface to suit the needs of different user groups and disciplines will help facilitate EHR learning and training and ultimately improve the quality of patient care.
Keywords: education, EHR, usability, configurable
Procedia PDF Downloads 158
15499 French Language Teaching in Nigeria and Future with Technology
Authors: Chidiebere Samuel Ijeoma
Abstract:
The impact and importance of technology in all domains of existence cannot be overemphasized; it is a double-edged sword that can be both constructive and destructive. This paper therefore evaluates the impact of technology so far on the teaching and learning of the French language in Nigeria. According to the study, the traditional methods of teaching French as a foreign language, recognized as our cultural methods of knowledge transfer, are rapidly being replaced by digitalized teaching; the research portrays this shift and suggests the best way forward. In the Nigerian primary education system, the use of some local and cultural instructional materials (teaching aids) is now almost history, which the paper laments. Consequently, the study asks: where are the chalks and blackboards? Where are the 'handworks' (local brooms) submitted by school children as part of their continuous assessment? Finally, the research is in no way against the application of technology in Nigerian French language teaching, but it tries to draw a clear distinction between technological methods of teaching French as a foreign language and the original Nigerian system of teaching the language before the arrival of technology.
Keywords: French language teaching, future, impact, importance of technology
Procedia PDF Downloads 357
15498 Application of Artificial Neural Network in Assessing Fill Slope Stability
Authors: An-Jui. Li, Kelvin Lim, Chien-Kuo Chiu, Benson Hsiung
Abstract:
This paper details the utilization of artificial intelligence (AI) in the field of slope stability, whereby quick and convenient solutions can be obtained using the developed tool. The AI tool used in this study is an artificial neural network (ANN), while the slope stability analyses use finite element limit analysis methods. The developed tool allows prompt prediction of the safety factors of fill slopes and their corresponding probabilities of failure (depending on the degree of variation of the soil parameters), which gives the practicing engineer a reasonable basis for decision making. In fact, the successful use of the Extreme Learning Machine (ELM) algorithm shows that slope stability analysis is no longer confined to conventional modeling methods, which can be tedious and repetitive during the preliminary design stage, where the focus is more on cost-saving options than on detailed design. Similar ANN-based tools can therefore be further developed to assist engineers in this respect.
Keywords: landslide, limit analysis, artificial neural network, soil properties
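As a rough illustration of the ELM idea mentioned above (hidden weights drawn at random and left fixed, only the output layer solved in closed form), the following sketch fits a hypothetical two-parameter "safety factor" relation; the data and the relation are invented for illustration, not taken from the paper.

```python
import numpy as np

def elm_train(X, y, n_hidden=50, seed=0):
    """Extreme Learning Machine sketch: input weights are random and stay
    fixed; only the output weights are solved, in closed form, by least
    squares -- which is what makes training fast."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                        # random hidden features
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # the only trained layer
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Hypothetical surrogate: two normalized "soil parameters" -> safety factor
rng = np.random.default_rng(1)
X = rng.uniform(size=(200, 2))
y = 1.2 + 0.8 * X[:, 0] - 0.5 * X[:, 1]           # invented relation
W, b, beta = elm_train(X, y)
mse = float(np.mean((elm_predict(X, W, b, beta) - y) ** 2))
```

Because only the output layer is fitted, retraining on fresh parameter samples is nearly instantaneous, which matches the "quick and convenient" use case described.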
Procedia PDF Downloads 209
15497 Regression Model Evaluation on Depth Camera Data for Gaze Estimation
Authors: James Purnama, Riri Fitri Sari
Abstract:
We investigate the machine learning algorithm selection problem for depth-image-based eye gaze estimation, with respect to the essential difficulty of reducing the number of required training samples and the training time. Statistics-based measures of prediction accuracy are increasingly used to assess and evaluate prediction or estimation in gaze estimation. This article uses Root Mean Squared Error (RMSE) and R-squared statistical analysis to assess machine learning methods on depth camera data for gaze estimation. Four machine learning methods were evaluated: Random Forest regression, regression trees, Support Vector Machine (SVM) regression, and linear regression. The experimental results show that Random Forest regression has the lowest RMSE and the highest R-squared, meaning that it is the best among the evaluated methods.
Keywords: gaze estimation, gaze tracking, eye tracking, Kinect, regression model, Orange Python
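The two evaluation statistics the paper relies on can be sketched directly; the gaze data below are synthetic stand-ins for two competing estimators, chosen only to show how RMSE and R-squared rank them.

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root Mean Squared Error: penalizes large gaze-angle errors."""
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def r_squared(y_true, y_pred):
    """Coefficient of determination: 1 minus residual over total variance."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    return float(1.0 - ss_res / ss_tot)

# Hypothetical gaze angles (degrees): ground truth plus two estimators,
# one with low noise and one with high noise
truth = np.linspace(-30.0, 30.0, 61)
est_low = truth + np.random.default_rng(0).normal(0.0, 1.0, truth.size)
est_high = truth + np.random.default_rng(1).normal(0.0, 4.0, truth.size)

# The better estimator scores lower on RMSE and higher on R^2 -- the same
# two criteria used above to rank Random Forest over the other three models.
```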
Procedia PDF Downloads 539
15496 Performance Evaluation of Various Segmentation Techniques on MRI of Brain Tissue
Authors: U.V. Suryawanshi, S.S. Chowhan, U.V Kulkarni
Abstract:
The accuracy of segmentation methods is of great importance in brain image analysis, and tissue classification in magnetic resonance imaging (MRI) of the brain is an important issue in the analysis of several brain dementias. This paper portrays the performance of segmentation techniques used on brain MRI, of which a large variety of algorithms has been developed. The objective of this paper is to perform a segmentation process on MR images of the human brain using fuzzy c-means (FCM), kernel-based fuzzy c-means (KFCM), spatial fuzzy c-means (SFCM), and improved fuzzy c-means (IFCM). The review covers imaging modalities, MRI, methods for noise reduction, and segmentation approaches. All methods are applied to MRI brain images degraded by salt-and-pepper noise, demonstrating that the IFCM algorithm is more robust to noise than the standard FCM algorithm. We conclude with a discussion of future research trends in brain segmentation and of modifications to IFCM for better results.
Keywords: image segmentation, preprocessing, MRI, FCM, KFCM, SFCM, IFCM
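A minimal fuzzy c-means loop, the baseline that the KFCM, SFCM, and IFCM variants above build on, might look as follows; the one-dimensional intensity data are synthetic, and no spatial or kernel term is included.

```python
import numpy as np

def fcm(X, c=2, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means on a 1-D array of pixel intensities.
    U[i, k] is the membership of pixel i in cluster k; m is the fuzzifier."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.size, c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)              # weighted means
        d = np.abs(X[:, None] - centers[None, :]) + 1e-12  # pixel-center dists
        w = d ** (-2.0 / (m - 1.0))
        U = w / w.sum(axis=1, keepdims=True)               # membership update
    return centers, U

# Synthetic "image": two intensity populations standing in for two tissues
rng = np.random.default_rng(1)
X = np.concatenate([rng.normal(0.2, 0.02, 50), rng.normal(0.8, 0.02, 50)])
centers, U = fcm(X)
```

The noise-robust variants modify this loop, e.g. by replacing the distance with a kernel-induced one (KFCM) or by mixing in neighborhood memberships (SFCM, IFCM).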
Procedia PDF Downloads 334
15495 Information Literacy: Concept and Importance
Authors: Gaurav Kumar
Abstract:
An information literate person is one who uses information effectively in all its forms. When presented with questions or problems, an information literate person knows what information to look for, how to search efficiently, and how to access relevant sources. In addition, an information literate person has the ability to evaluate and select appropriate information sources and to use information effectively and ethically to answer questions or solve problems. Information literacy has become an important element in higher education, and the information literacy movement has internationally recognized standards and learning outcomes. The step-by-step process of achieving information literacy is particularly crucial in an era when knowledge can be disseminated through a variety of media. What is the relationship between information literacy as we define it in higher education and information literacy among non-academic populations? What forces will change how we think about the definition of information literacy in the future, and how will we apply the definition in all environments?
Keywords: information literacy, human beings, visual media, computer networks
Procedia PDF Downloads 340
15494 Effects of High-Intensity Interval Training versus Traditional Rehabilitation Exercises on Functional Outcomes in Patients with Knee Osteoarthritis: A Randomized Controlled Trial
Authors: Ahmed Torad
Abstract:
Background: Knee osteoarthritis (OA) is a prevalent musculoskeletal condition characterized by pain and functional impairment. While various rehabilitation approaches have been employed, the effectiveness of high-intensity interval training (HIIT) compared to traditional rehabilitation exercises remains unclear. Objective: This randomized controlled trial aimed to compare the effects of HIIT and traditional rehabilitation exercises on pain reduction, functional improvement, and quality of life in individuals with knee OA. Methods: A total of 120 participants diagnosed with knee OA were randomly allocated into two groups: the HIIT group (n=60) and the traditional rehabilitation group (n=60). The HIIT group participated in a 12-week supervised program consisting of high-intensity interval exercises, while the traditional rehabilitation group followed a conventional physiotherapy regimen. Outcome measures included visual analog scale (VAS) pain scores, Western Ontario and McMaster Universities Osteoarthritis Index (WOMAC), and the Short Form-36 Health Survey (SF-36) at baseline and after the intervention period. Results: Both groups showed significant improvements in pain scores, functional outcomes (WOMAC), and quality of life (SF-36) after 12 weeks of intervention. However, the HIIT group demonstrated superior pain reduction (p<0.001), functional improvement (p<0.001), and physical health-related quality of life (p=0.002) compared to the traditional rehabilitation group. No significant differences were observed in mental health-related quality of life between the two groups. Conclusion: High-intensity interval training appears to be a more effective rehabilitation approach than traditional exercises for individuals with knee osteoarthritis, resulting in greater pain reduction, improved function, and enhanced physical health-related quality of life. 
These findings suggest that HIIT may represent a promising intervention strategy for managing knee OA and enhancing the overall well-being of affected individuals.
Keywords: knee osteoarthritis, high-intensity interval training, traditional rehabilitation exercises, randomized controlled trial, pain reduction, functional improvement, quality of life
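The between-arm comparisons reported above can be illustrated with an assumption-light permutation test on hypothetical VAS change scores; the group means and spreads below are invented and do not reproduce the trial's data (the abstract does not state which statistical test was used).

```python
import numpy as np

def permutation_test(a, b, n_perm=5000, seed=0):
    """Two-sample permutation test on the difference of means: an
    assumption-light way to compare change scores between trial arms.
    Returns the observed mean difference and a two-sided p-value."""
    rng = np.random.default_rng(seed)
    observed = a.mean() - b.mean()
    pooled = np.concatenate([a, b])
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)  # relabel participants at random
        diff = pooled[:a.size].mean() - pooled[a.size:].mean()
        if abs(diff) >= abs(observed):
            hits += 1
    return observed, (hits + 1) / (n_perm + 1)

# Invented VAS pain reductions (cm), 60 participants per arm
rng = np.random.default_rng(42)
hiit = rng.normal(4.5, 1.0, 60)   # hypothetically larger mean reduction
trad = rng.normal(3.2, 1.0, 60)
diff, p_value = permutation_test(hiit, trad)
```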
Procedia PDF Downloads 77
15493 Artificial Intelligence for Traffic Signal Control and Data Collection
Authors: Reggie Chandra
Abstract:
Traffic accidents and traffic signal optimization are correlated; however, 70-90% of the traffic signals across the USA are not synchronized, largely because of insufficient resources to create and implement timing plans. In this work, we discuss the use of a breakthrough Artificial Intelligence (AI) technology to optimize traffic flow and to collect accurate traffic data around the clock using a vehicle detection system. We discuss recent advances in AI technology, how AI works in vehicle, pedestrian, and bike data collection and in creating timing plans, and what the best workflow is. This paper also shows how AI makes signal timing affordable. We introduce a technology that uses Convolutional Neural Networks (CNNs) and deep learning algorithms to detect, collect data, develop timing plans, and deploy them in the field. Convolutional Neural Networks are a class of deep learning networks inspired by the biological processes of the visual cortex. A neural net is modeled after the human brain: it consists of millions of densely connected processing nodes and, through training, learns to recognize vehicles, a form of machine learning called deep learning. The well-trained algorithm overcomes most of the issues faced by other detection methods and provides nearly 100% traffic data accuracy. Through this continuous learning-based method, we can constantly update traffic patterns, generate an unlimited number of timing plans, and thus improve vehicle flow. Convolutional Neural Networks not only outperform other detection algorithms but, in cases such as classifying objects into fine-grained categories, can also outperform humans. Safety is of primary importance to traffic professionals, but they often lack the studies or data to support their decisions; currently, one-third of transportation agencies do not collect pedestrian and bike data.
We discuss how the use of AI for data collection can help reduce pedestrian fatalities and enhance the safety of all vulnerable road users. Moreover, it gives traffic engineers tools that let them unleash their potential, instead of dealing with constant complaints, snapshots of limited hand-picked data, and multiple systems requiring additional adaptation work. The methodology used and proposed in this research includes a camera-model identification method based on deep Convolutional Neural Networks. The proposed application was evaluated on our data sets, acquired across a variety of real-world daily road conditions, and compared with the performance of commonly used methods that require collecting data by counting, evaluating, and adapting it, running it through well-established algorithms, and then deploying it to the field. This work explores themes such as how technologies powered by AI can benefit a community and how to translate the complex, often overwhelming benefits into language accessible to elected officials, community leaders, and the public. Exploring such topics empowers citizens with insider knowledge about the potential of better traffic technology to save lives and improve communities. The synergies that AI brings to traffic signal control and data collection are unsurpassed.
Keywords: artificial intelligence, convolutional neural networks, data collection, signal control, traffic signal
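The convolution-activation-pooling pipeline at the heart of the CNN detector described above can be sketched in a few lines of plain NumPy; the "frame" and the edge kernel are toy stand-ins, not the production detector.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation, the core CNN operation."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Nonlinearity applied to the feature map."""
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Downsample by taking the max over non-overlapping blocks."""
    oh, ow = x.shape[0] // size, x.shape[1] // size
    return x[:oh * size, :ow * size].reshape(oh, size, ow, size).max(axis=(1, 3))

# Toy 6x6 "frame": a bright region on the right stands in for a vehicle;
# a vertical-edge kernel responds strongly at its boundary.
frame = np.zeros((6, 6))
frame[:, 3:] = 1.0
edge_kernel = np.array([[-1.0, 1.0], [-1.0, 1.0]])
features = max_pool(relu(conv2d(frame, edge_kernel)))
```

In a trained network the kernels are not hand-designed like this one; they are learned from labeled video, which is what the training stage described above provides.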
Procedia PDF Downloads 172
15492 Risk Assessment of Flood Defences by Utilising Condition Grade Based Probabilistic Approach
Authors: M. Bahari Mehrabani, Hua-Peng Chen
Abstract:
The management and maintenance of coastal defence structures over their expected life cycle have become a real challenge for decision makers and engineers. Accurate evaluation of the current condition and future performance of flood defence structures is essential for effective, practical maintenance strategies based on available field inspection data. Moreover, as coastal defence structures age, it becomes more challenging to implement maintenance and management plans that avoid structural failure. Condition inspection data are therefore essential for assessing damage and forecasting the deterioration of ageing flood defence structures in order to keep them in an acceptable condition. Inspection data for flood defence structures are often collected using discrete visual condition rating schemes, and evaluating the future condition of a structure requires a probabilistic deterioration model. However, existing deterioration models may not provide reliable long-term predictions of performance deterioration due to uncertainties. To tackle this limitation, a time-dependent, condition-based model associated with transition probabilities needs to be developed on the basis of a condition grade scheme for flood defences. This paper presents a probabilistic method for predicting the future performance deterioration of coastal flood defence structures based on condition grading inspection data and deterioration curves estimated by expert judgement. In condition-based deterioration modelling, the main task is to estimate the transition probability matrices. The deterioration process over the transition states is modelled as a Markov chain, and a reliability-based approach is used to estimate the probability of structural failure. Visual inspection data according to the United Kingdom Condition Assessment Manual are used to obtain the initial condition grade curve of the coastal flood defences.
The initial curves are then modified to develop transition probabilities through optimisation algorithms based on non-linear regression. Monte Carlo simulations are then used to evaluate the future performance of the structure on the basis of the estimated transition probabilities. Finally, a case study is given to demonstrate the applicability of the proposed method under no-maintenance and medium-maintenance scenarios. Results show that the proposed method can provide an effective predictive model for various situations in terms of the available condition grading data. The proposed model also provides useful information on the time-dependent probability of failure of coastal flood defences.
Keywords: condition grading, flood defence, performance assessment, stochastic deterioration modelling
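The Markov-chain core of the method, propagating a condition-grade distribution through a transition matrix, can be sketched as follows; the five-grade scheme and the transition probabilities are hypothetical placeholders, not the values estimated in the study.

```python
import numpy as np

# Hypothetical 5-grade condition scheme (grade 1 = as new, grade 5 = failed).
# P[i, j] is the annual probability of moving from grade i+1 to grade j+1;
# with no maintenance, condition only moves forward and failure is absorbing.
P = np.array([
    [0.90, 0.10, 0.00, 0.00, 0.00],
    [0.00, 0.85, 0.15, 0.00, 0.00],
    [0.00, 0.00, 0.80, 0.20, 0.00],
    [0.00, 0.00, 0.00, 0.75, 0.25],
    [0.00, 0.00, 0.00, 0.00, 1.00],
])

def condition_after(years, start_grade=1):
    """Distribution over condition grades after a number of years."""
    state = np.zeros(5)
    state[start_grade - 1] = 1.0
    return state @ np.linalg.matrix_power(P, years)

p_fail_50 = condition_after(50)[-1]   # time-dependent probability of failure
```

A maintenance scenario would be modelled by altering P (or by inserting occasional "repair" transitions back toward better grades), and Monte Carlo sampling of uncertain entries of P yields the spread around this mean prediction.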
Procedia PDF Downloads 235
15491 Spatial Deictics in Face-to-Face Communication: Findings in Baltic Languages
Authors: Gintare Judzentyte
Abstract:
The present research aims to discuss the semantics and pragmatics of spatial deictics (deictic adverbs of place and demonstrative pronouns) in the Baltic languages, namely spoken Lithuanian and spoken Latvian. The following objectives have been identified to achieve this aim: 1) to determine the usage of adverbs of place in spoken Lithuanian and Latvian and to verify their meanings in face-to-face communication; 2) to determine the usage of demonstrative pronouns in spoken Lithuanian and Latvian and to verify their meanings in face-to-face communication; 3) to compare the systems of the two spoken languages and to identify the main tendencies. As the meanings of demonstratives (adverbs of place and demonstrative pronouns) are context-bound, it is necessary to verify their usage in spontaneous interaction. Moreover, deictic gestures play a very important role in face-to-face communication, so an experimental method is necessary to collect the data. Video material representing spoken Lithuanian and spoken Latvian was recorded by means of a qualitative, semi-structured interview (empirical research is all about asking the right questions). The collected material was transcribed and evaluated from several perspectives: 1) physical distance (location of the referent, visual accessibility of the referent); 2) deictic gestures (the combination of language and gesture is especially characteristic of exophoric use); 3) representation of mental spaces in physical space (a speaker sometimes wishes to mark something physically close as psychologically distant, and vice versa). The analysis of the collected data revealed that in face-to-face communication the participants choose deictic adverbs of place instead of demonstrative pronouns to locate or identify entities in situations where demonstrative pronouns would be expected, in both spoken Lithuanian and spoken Latvian.
The analysis showed that visual accessibility of the referent is very important in face-to-face communication, but the main criterion in localizing objects and entities is the need for contrast: Lith. čia 'here', šis 'this', Latv. šeit 'here', šis 'this' usually identify distant entities and are used instead of distal demonstratives (Lith. ten 'there', tas 'that', Latv. tur 'there', tas 'that') because the referred objects or subjects contrast with entities farther away. Furthermore, in examples from spontaneously situated interaction the interlocutors usually extend their space and can refer to a 'distal' object or subject with a 'proximal' demonstrative as a psychological choice. As the research on the spoken Baltic languages confirmed, the choice of spatial deictics in face-to-face communication is strongly affected by a complex of criteria. Although there are some main tendencies, the exact meaning of spatial deictics in the spoken Baltic languages is revealed, and is relevant, only in a certain context.
Keywords: Baltic languages, face-to-face communication, pragmatics, semantics, spatial deictics
Procedia PDF Downloads 290
15490 From Myth to Screen: A Cultural Criticism of the Adaptation of Nordic Mythology in Marvel Cinematic Universe's Thor Trilogy
Authors: Vathya Anindita Putri, Henny Saptatia Drajati Nugrahani
Abstract:
This research aims to explore the representation of Nordic mythology in the commercial film "Thor", produced by the Marvel Cinematic Universe. It examines, first, the adaptation and representation of Nordic mythology in "Thor" compared with other media and, second, the importance of the mise en scène technique, the comprehensive portrayal of Nordic mythology, and the audience's experience of the film. The research is conducted using qualitative methods. The two research questions are analyzed using three theories: adaptation theory by Robert Stam, mise en scène theory by Jean-Luc Godard, and cultural criticism theory by Michel Foucault. Stam emphasizes the importance of social and historical context in understanding film adaptations: adaptations always occur in a specific cultural and historical context, so authors and producers must consider these factors when creating a successful adaptation. Godard uses the "politique des auteurs" approach to argue that films are not just cultural products made for entertainment but works of art by their authors and directors; it is therefore important to explore how authors and directors convey their ideas and emotions in their films, in this case a film set in Nordic mythology. Foucault's approach to analyzing power considers how power operates and influences social relationships in a specific context; his theory is used to analyze how the representation of Nordic mythology is employed as an instrument of power by the Marvel Cinematic Universe to influence how the audience views Nordic mythology. The initial finding of this research is that the fusion of Nordic mythology with modern superhero storytelling in Marvel's "Thor" is successful: the film contains conflicts of the modern world while representing the symbolism of Nordic mythology.
The rich and engaging atmosphere of Nordic mythology is presented through epic battle scenes, captivating characters, and visual effects that make the film more vivid and real.
Keywords: adaptation theory, cultural criticism theory, film criticism, Marvel Cinematic Universe, mise en scène theory, Nordic mythology
Procedia PDF Downloads 87
15489 Estimation and Comparison of Delay at Signalized Intersections Based on Existing Methods
Authors: Arpita Saha, Satish Chandra, Indrajit Ghosh
Abstract:
Delay represents the time lost by a traveler while crossing an intersection, and the efficiency of traffic operation at signalized intersections is assessed in terms of the delay experienced by an individual vehicle. The Highway Capacity Manual (HCM) method and Webster's method are the most widely used in India for delay estimation. However, traffic in India is highly heterogeneous, with extremely poor lane discipline. Therefore, to find the best delay estimation technique for Indian conditions, a comparison was made. In this study, seven signalized intersections from three different cities were chosen, and data were collected during both morning and evening peak hours. Only undersaturated cycles were considered. Delay was estimated from the field data: with the help of Simpson's one-third rule, the delay of undersaturated cycles was estimated by measuring the area under the curve of queue length versus cycle time. The field-observed delay was then compared with the delay estimated using the HCM, Webster, probabilistic, Taylor-expansion, and regression methods. The drawbacks of the existing delay estimation methods for use in Indian heterogeneous traffic conditions were identified, and the best method was proposed. It was observed that direct estimation of delay using field-measured data is more accurate than the existing conventional and modified methods.
Keywords: delay estimation technique, field delay, heterogeneous traffic, signalised intersection
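The field-delay computation via Simpson's one-third rule amounts to integrating queue length over the cycle; a sketch with a hypothetical queue profile:

```python
import numpy as np

def simpson_area(y, dx):
    """Composite Simpson's one-third rule on evenly spaced samples
    (requires an even number of intervals, i.e. an odd sample count)."""
    if (len(y) - 1) % 2 != 0:
        raise ValueError("Simpson's 1/3 rule needs an even number of intervals")
    return dx / 3.0 * (y[0] + y[-1] + 4 * np.sum(y[1:-1:2]) + 2 * np.sum(y[2:-1:2]))

# Hypothetical queue-length profile over one 60 s undersaturated cycle,
# sampled every 5 s; the area under this curve is total delay in
# vehicle-seconds (each queued vehicle contributes its waiting time).
queue = np.array([0, 2, 4, 6, 8, 9, 8, 6, 4, 2, 1, 0, 0], dtype=float)
total_delay = simpson_area(queue, 5.0)     # vehicle-seconds per cycle
avg_delay = total_delay / 10.0             # per vehicle, if 10 vehicles served
```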
Procedia PDF Downloads 303
15488 Enhancement of Mulberry Leaf Yield and Water Productivity in Eastern Dry Zone of Karnataka, India
Authors: Narayanappa Devakumar, Chengalappa Seenappa
Abstract:
Field experiments were conducted during rabi 2013 and summer 2014 at the College of Sericulture, Chintamani, Chickaballapur district, Karnataka, India, to determine the response of mulberry to different methods and levels of irrigation and to mulching. The results showed that the leaf yield and water productivity of mulberry were significantly influenced by the methods and levels of irrigation and by mulching. Subsurface drip at the lower irrigation level of 0.8 CPE (Cumulative Pan Evaporation) recorded higher leaf yield and water productivity (42857 kg ha-1 yr-1 and 364.41 kg ha-cm-1) than surface drip at the higher irrigation level of 1.0 CPE (38809 kg ha-1 yr-1 and 264.10 kg ha-cm-1) and micro spray jet (39931 kg ha-1 yr-1 and 271.83 kg ha-cm-1). Further, subsurface drip used the least water to produce one kg of leaf and to earn one rupee of profit (283 L and 113 L) compared with surface drip (390 L and 156 L) and micro spray jet (379 L and 152 L) irrigation. Mulberry leaf yield increased and water productivity decreased with increasing irrigation levels. Overall, these results indicate that irrigating mulberry with subsurface drip increased leaf yield and water productivity while saving 20% of irrigation water compared with surface drip and micro spray jet irrigation in the Eastern Dry Zone (EDZ) of Karnataka.
Keywords: cumulative pan evaporation, mulberry, subsurface drip irrigation, water productivity
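The water-productivity figures above follow from simple arithmetic; the sketch below back-calculates an illustrative seasonal water depth (~117.6 ha-cm, a hypothetical figure roughly consistent with the subsurface drip numbers, not a value reported in the study) just to show the relations.

```python
# 1 ha-cm of water = 100,000 L; water productivity = yield / water depth.
LITRES_PER_HA_CM = 100_000

def water_productivity(yield_kg_per_ha, water_ha_cm):
    """Leaf yield produced per ha-cm of irrigation water applied."""
    return yield_kg_per_ha / water_ha_cm

def litres_per_kg_leaf(yield_kg_per_ha, water_ha_cm):
    """Water consumed per kg of leaf, the inverse view of productivity."""
    return water_ha_cm * LITRES_PER_HA_CM / yield_kg_per_ha

# Subsurface drip: reported yield, hypothetical back-calculated water depth
wp = water_productivity(42857, 117.6)        # ~364 kg per ha-cm
l_per_kg = litres_per_kg_leaf(42857, 117.6)  # ~274 L of water per kg of leaf
```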
Procedia PDF Downloads 281
15487 A Visualization Classification Method for Identifying the Decayed Citrus Fruit Infected by Fungi Based on Hyperspectral Imaging
Authors: Jiangbo Li, Wenqian Huang
Abstract:
Early detection of fungal infection in citrus fruit is one of the major problems in the postharvest commercialization process, and automatic, nondestructive detection of infected fruit is still a challenge for the citrus industry. At present, visual inspection of rotten citrus fruit in packinghouses is commonly performed by workers, either under ultraviolet-induced fluorescence or by manual sorting, to remove fruit with fungal infection. However, the former is problematic because exposing people to this kind of lighting is potentially hazardous to human health, and the latter is very inefficient. Using orange as the research object, this study focuses on this problem and proposes an effective method based on Vis-NIR hyperspectral imaging in the 400-1000 nm wavelength range with a spectroscopic resolution of 2.8 nm. Three normalization approaches were applied prior to analysis to reduce the effect of sample curvature on spectral profiles, and mean normalization was found to be the most effective pretreatment for decreasing curvature-induced spectral variability. Principal component analysis (PCA) was then applied to a dataset composed of average spectra from decayed and normal tissue to reduce the dimensionality of the data and to observe the ability of Vis-NIR hyperspectra to discriminate the two classes; normal and decayed spectra were observed to be separable along the first principal component (PC1) axis. Subsequently, five wavelengths (bands) centered at 577, 702, 751, 808, and 923 nm were selected as characteristic wavelengths by analyzing the loadings of PC1, and a multispectral combination image was generated from the five selected characteristic wavelength images.
Based on the obtained multispectral combination image, the intensity slicing pseudocolor image processing method was used to generate a 2-D visual classification image that enhances the contrast between normal and decayed tissue. Finally, an image segmentation algorithm for the detection of decayed fruit was developed based on the pseudocolor image coupled with a simple thresholding method. For an independent test set of 238 samples comprising fruits infected by Penicillium digitatum and normal fruits, the success rates were 100% and 97.5%, respectively. The proposed algorithm was also used to identify oranges infected by Penicillium italicum, with a 100% identification accuracy, indicating that the proposed multispectral algorithm is effective and has the potential to be applied in the citrus industry. Keywords: citrus fruit, early rotten, fungal infection, hyperspectral imaging
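The band-selection step described in this abstract (mean normalization of the spectra, PCA, then picking the wavelengths with the largest PC1 loadings) can be sketched with plain NumPy. The data, band count and class contrast below are illustrative, not the study's actual spectra:

```python
import numpy as np

def mean_normalize(spectra):
    """Divide each spectrum (one row per sample) by its mean intensity to
    reduce curvature-induced brightness variation."""
    return spectra / spectra.mean(axis=1, keepdims=True)

def pc1_loadings(spectra):
    """Return the loadings of the first principal component of the
    mean-centered spectra (rows of Vt from the SVD are the principal axes)."""
    centered = spectra - spectra.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[0]

# Illustrative data: 6 "normal" and 6 "decayed" spectra over 10 bands;
# decayed tissue absorbs more in bands 3-5.
rng = np.random.default_rng(0)
normal = 1.0 + 0.05 * rng.standard_normal((6, 10))
decayed = normal.copy()
decayed[:, 3:6] *= 0.6

spectra = mean_normalize(np.vstack([normal, decayed]))
loadings = pc1_loadings(spectra)

# Bands with the largest |loading| separate the classes most strongly,
# analogous to selecting characteristic wavelengths from PC1 in the paper.
top_bands = np.argsort(np.abs(loadings))[::-1][:3]
print("most discriminative bands:", sorted(top_bands.tolist()))
```

In the study, the analogous analysis on real spectra yielded the five characteristic bands at 577, 702, 751, 808 and 923 nm.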
Procedia PDF Downloads 304
15486 Defining the Customers' Color Preference for the Apparel Industry in Terms of Chromaticity Coordinates
Authors: Banu Hatice Gürcüm, Pınar Arslan, Mahmut Yalçın
Abstract:
Fashion designers create lots of dresses, suits, shoes, and other clothing and accessories, which are purchased every year by consumers. Fashion trends, sketches of designs, and accessories affect apparel goods, but colors make the finishing touches to an outfit. In all fields of apparel (men's, women's, and children's wear, including casual wear, suits, sportswear, formal wear, outerwear, maternity, and intimate apparel), color sells. Thus, specialization in color in apparel is a basic concern each season. The perception of color is the key to sales for every sector of the textile business. The mechanism of color perception, cognition in the brain, and color emotion are unique subjects, which scientists have been investigating for many years. The parameters of color may not correspond to visual scales, since human emotions induced by color are completely subjective. However, with very few exceptions, each manufacturer tracks its top-selling colors for each season through the seasonal sales reports of apparel companies. This paper examines sensory and instrumental methods for quantifying the color of fabrics and investigates the relationship between fabric color and sales numbers. The 5 top-selling colors for each season from 10 leading apparel companies in the same segment are taken. The compilation is based on the sales of the companies over 5 to 10 years. The research's main concern is the correlation between the magnitude of seasonal color selling figures and the CIE chromaticity coordinates. The colors are chosen from the globally accepted Pantone Textile Color System, and the three-dimensional measurement system CIE L*a*b* (CIELAB) is used, with L* representing the degree of lightness of a color, a* the degree of color ranging from magenta to green, and b* the degree of color ranging from blue to yellow. The objective of this paper is to demonstrate the feasibility of relating color perception to a laboratory instrument yielding measurements in the CIELAB system.
Our approach is to obtain a total of a hundred reference fabrics to be measured on a laboratory spectrophotometer calibrated to the CIELAB color system. Relationships between the CIE tristimulus values (X, Y, Z) and CIELAB (L*, a*, b*) are examined and reported herein. Keywords: CIELAB, CIE tristimulus, color preference, fashion
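For reference, the CIE-defined conversion from tristimulus values (X, Y, Z) to CIELAB that such spectrophotometers report can be written as a short function. The D65 white point used below is an assumption for illustration, as the abstract does not state the illuminant:

```python
def xyz_to_lab(x, y, z, white=(95.047, 100.0, 108.883)):
    """Convert CIE XYZ tristimulus values to CIELAB using the standard CIE
    formulas. Default white point is D65 (an assumption for this sketch)."""
    def f(t):
        delta = 6 / 29
        # Cube root above the linear-segment threshold, linear below it
        return t ** (1 / 3) if t > delta ** 3 else t / (3 * delta ** 2) + 4 / 29

    fx, fy, fz = (f(v / n) for v, n in zip((x, y, z), white))
    l_star = 116 * fy - 16      # lightness: 0 (black) to 100 (white)
    a_star = 500 * (fx - fy)    # green (negative) to magenta (positive)
    b_star = 200 * (fy - fz)    # blue (negative) to yellow (positive)
    return l_star, a_star, b_star

# The D65 white point itself maps to L* = 100, a* = b* = 0
print(xyz_to_lab(95.047, 100.0, 108.883))
```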
Procedia PDF Downloads 336
15485 Video-Based System for Support of Robot-Enhanced Gait Rehabilitation of Stroke Patients
Authors: Matjaž Divjak, Simon Zelič, Aleš Holobar
Abstract:
We present a dedicated video-based monitoring system for quantification of patient’s attention to visual feedback during robot-assisted gait rehabilitation. Two different approaches for eye gaze and head pose tracking are tested and compared. Several metrics for assessment of patient’s attention are also presented. Experimental results with healthy volunteers demonstrate that unobtrusive video-based gaze tracking during the robot-assisted gait rehabilitation is possible and is sufficiently robust for quantification of patient’s attention and assessment of compliance with the rehabilitation therapy. Keywords: video-based attention monitoring, gaze estimation, stroke rehabilitation, user compliance
Procedia PDF Downloads 426
15484 Philippine Site Suitability Analysis for Biomass, Hydro, Solar, and Wind Renewable Energy Development Using Geographic Information System Tools
Authors: Jara Kaye S. Villanueva, M. Rosario Concepcion O. Ang
Abstract:
For the past few years, the Philippines has depended for most of its energy on oil, coal, and other fossil fuels. According to the Department of Energy (DOE), the dominance of coal in the energy mix will continue until the year 2020. The expanding energy needs of the country have led to increasing efforts to promote and develop renewable energy. This research is part of a government initiative in preparation for renewable energy development and expansion in the country. The Philippine Renewable Energy Resource Mapping from Light Detection and Ranging (LiDAR) Surveys is a three-year government project which aims to assess and quantify the renewable energy potential of the country and to put it into usable maps. This study focuses on the site suitability analysis of four renewable energy sources: biomass (coconut, corn, rice, and sugarcane), hydro, solar, and wind energy. The site assessment is a key component in determining and assessing the most suitable locations for the construction of renewable energy power plants. The method maximizes the use of technical methods in resource assessment, while also taking into account the environmental, social, and accessibility aspects of identifying potential sites, by utilizing and integrating two different methods: the Multi-Criteria Decision Analysis (MCDA) method and Geographic Information System (GIS) tools. For the MCDA, Analytical Hierarchy Processing (AHP) is employed to determine the parameters needed for the suitability analysis. To structure these site suitability parameters, various experts from different fields were consulted: scientists, policy makers, environmentalists, and industrialists. Having a well-represented group of people to consult is important to avoid bias in the resulting hierarchy levels and weight matrices. AHP pairwise matrix computation is utilized to derive weights per level from the experts' feedback.
The threshold values derived from related literature, international studies, and government laws were then discussed with energy specialists from the DOE. Geospatial analysis using GIS tools translates these decision support outputs into visual maps. In particular, this study uses Euclidean distance to compute the distance values for each parameter, the Fuzzy Membership algorithm, which normalizes the output of the Euclidean distance, and the Weighted Overlay tool for the aggregation of the layers. Using the Natural Breaks algorithm, the suitability ratings of each map are classified into 5 discrete categories of suitability index: (1) not suitable, (2) least suitable, (3) suitable, (4) moderately suitable, and (5) highly suitable. In this method, classes are formed by grouping similar values together, with class boundaries set where the differences between values are largest. Results show that over the entire Philippine area of responsibility, biomass has the highest suitability rating, with rice the most suitable at a 75.76% suitability percentage, whereas wind has the least suitability percentage with a score of 10.28%. Solar and hydro fall between the two, with suitability values of 28.77% and 21.27%. Keywords: site suitability, biomass energy, hydro energy, solar energy, wind energy, GIS
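The AHP step described above (deriving criterion weights from an expert pairwise comparison matrix) can be sketched as follows. The three criteria and their comparison values are hypothetical, not those used in the study:

```python
import numpy as np

def ahp_weights(pairwise):
    """Derive criterion weights from an AHP pairwise comparison matrix as
    the normalized principal eigenvector (Saaty's eigenvector method)."""
    a = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(a)
    principal = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, principal].real)
    return w / w.sum()

# Hypothetical 3-criterion example (resource, slope, road access):
# resource judged 3x as important as slope, 5x as important as access.
pairwise = [
    [1,   3,   5],
    [1/3, 1,   2],
    [1/5, 1/2, 1],
]
w = ahp_weights(pairwise)
print(np.round(w, 3))   # weights sum to 1, resource weighted highest
```

In a full AHP workflow one would also compute the consistency ratio of the matrix before accepting the weights.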
Procedia PDF Downloads 152
15483 Password Cracking on Graphics Processing Unit Based Systems
Authors: N. Gopalakrishna Kini, Ranjana Paleppady, Akshata K. Naik
Abstract:
Password authentication is one of the most widely used methods of authenticating legitimate users of computers and defending against attackers. There are many different ways to authenticate users of a system, and many password cracking methods have also been developed. This paper mainly proposes how password cracking can best be performed on a CPU-GPGPU based system. The main objective of this work is to show how quickly a password can be cracked, given some knowledge of computer security and password cracking, if sufficient security is not incorporated into the system. Keywords: GPGPU, password cracking, secret key, user authentication
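As a minimal illustration of the attack model (not the authors' GPGPU implementation), a dictionary attack against an unsalted hash reduces to the loop below; GPGPU tools parallelize exactly this loop across many thousands of candidates at once:

```python
import hashlib

def crack(target_hash, wordlist):
    """Naive dictionary attack: hash each candidate password and compare it
    with the target digest. Returns the match, or None if none is found."""
    for candidate in wordlist:
        if hashlib.sha256(candidate.encode()).hexdigest() == target_hash:
            return candidate
    return None

# Hypothetical weak password protected only by an unsalted SHA-256 hash
stored = hashlib.sha256(b"letmein").hexdigest()
print(crack(stored, ["123456", "password", "letmein", "qwerty"]))
```

Salting and slow key-derivation functions (bcrypt, scrypt, PBKDF2) are the standard defenses that make this attack far more expensive.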
Procedia PDF Downloads 292
15482 Application of Semantic Technologies in Rapid Reconfiguration of Factory Systems
Authors: J. Zhang, K. Agyapong-Kodua
Abstract:
The digital factory, based on visual design and simulation, has emerged as a mainstream approach to reducing the digital development life cycle. Some basic industrial systems are being integrated via semantic modelling, and products (P) matching process (P) and resource (R) requirements are designed to fulfil current customer demands. Nevertheless, product design is still limited to fixed product models and the known knowledge of product engineers. Therefore, this paper presents a rapid reconfiguration method based on semantic technologies with PPR ontologies to reuse both known and unknown knowledge. In order to limit the influence of big data, our system uses a cloud manufactory and a distributed database to improve the efficiency of querying for resources meeting PPR requirements. Keywords: semantic technologies, factory system, digital factory, cloud manufactory
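A toy sketch of PPR matching may help fix ideas. The resource names and capabilities below are hypothetical, and a real semantic implementation would answer the equivalent query over ontologies (e.g. with SPARQL) rather than over Python dictionaries:

```python
def match_resources(product_requirements, resources):
    """Toy product-process-resource (PPR) matching: return the resources
    whose capabilities cover every process capability the product requires."""
    required = set(product_requirements)
    return [name for name, caps in resources.items() if required <= set(caps)]

# Hypothetical factory knowledge base: resource -> process capabilities
resources = {
    "cell_A": {"milling", "drilling"},
    "cell_B": {"milling", "drilling", "welding"},
    "cell_C": {"assembly"},
}

# A product requiring milling and welding matches only cell_B
print(match_resources({"milling", "welding"}, resources))
```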
Procedia PDF Downloads 488
15481 BSYJ Promoting Homing and Differentiation of Mesenchymal Stem Cells at the Retina of Age-Related Macular Degeneration Model Mice Induced by Sodium Iodate
Authors: Lina Liang, Kai Xu, Jing Zhang
Abstract:
Purpose: Age-related macular degeneration (AMD) is a major leading cause of visual impairment and blindness with no cure currently established. Cell replacement is discussed as a potential therapy for AMD. Besides intravitreal injection and subretinal injection, intravenous administration has been explored as an alternative route. This study observes the effect of BSYJ, a traditional Chinese medicine, on the homing and differentiation of mesenchymal stem cells transplanted via tail vein injection in an age-related macular degeneration mouse model. Methods: Four-week-old C57BL/6J mice were injected with 40 mg/kg NaIO₃ to induce an age-related macular degeneration model. On the second day after NaIO₃ injection, 1×10⁷ GFP-labeled bone marrow-derived mesenchymal stem cells (GFP-MSCs) were transplanted via tail vein injection into the experimental mice. The mice were then randomly divided into two groups, gavaged with either BSYJ solution (BSYJ group, n=12) or distilled water (DW group, n=12). 12 age-matched healthy C57BL/6J mice were fed regularly as normal controls. At day 7, day 14, and day 28 after treatment, retina flat mounting was used to detect the homing of mesenchymal stem cells at the retina. Double-labeling immunofluorescence was used to determine the differentiation of mesenchymal stem cells. Results: At 7, 14, and 28 days after treatment, the numbers of GFP-MSCs detected by retina flat mount were 10.2 ± 2.5, 14.5 ± 3.4 and 18.7 ± 5.8, respectively, in the distilled water group, versus 15.7 ± 3.8, 32.3 ± 3.5 and 77.3 ± 6.4 in the BSYJ group; the differences between the two groups were significant (p < 0.05). At 28 days after treatment, double staining immunofluorescence showed that there were more GFP-positive cells in the retina of the BSYJ group than in the DW group, but none of the cells expressed RPE-specific genes such as RPE65 and CRALBP, or photoreceptor genes such as recoverin and rhodopsin, in either the BSYJ group or the DW group.
However, GFAP-positive cells were found among the cells labeled with GFP, and double-labeled cells were much more numerous in the BSYJ group than in the distilled water group. Conclusion: BSYJ could promote the homing of mesenchymal stem cells to the retina of age-related macular degeneration model mice induced by NaIO₃, as well as their differentiation towards glial cells. Acknowledgement: National Natural Science Foundation of China (No: 81473736, 81674033, 81973912). Keywords: BSYJ, differentiation, homing, mesenchymal stem cells
Procedia PDF Downloads 147
15480 Performance Evaluation of Fingerprint, Auto-Pin and Password-Based Security Systems in Cloud Computing Environment
Authors: Emmanuel Ogala
Abstract:
Cloud computing has been envisioned as the next-generation architecture of the Information Technology (IT) enterprise. In contrast to traditional solutions, where IT services are under physical, logical and personnel controls, cloud computing moves the application software and databases to large data centres, where the management of the data and services may not be fully trustworthy. This is due to the fact that the systems are open to the whole world, and as legitimate users access the system, many people are also trying day in, day out to gain unauthorized access to it. This research contributes to the improvement of cloud computing security for better operation. The work is motivated by two problems. First, the observed ease of access to cloud computing resources and the complexity of attacks on the vital cloud computing data system NIC require that dynamic security mechanisms evolve to stay capable of preventing illegitimate access. Second, there is a lack of a good methodology for the performance testing and evaluation of biometric security algorithms for securing records in a cloud computing environment. The aim of this research was to evaluate the performance of an integrated security system (ISS) for securing exam records in a cloud computing environment. In this research, we designed and implemented an ISS consisting of three security mechanisms, biometric (fingerprint), auto-PIN and password, combined into one stream of access control, and used it to secure examination records at Kogi State University, Anyigba. Conclusively, the system we built has been able to overcome the guessing abilities of hackers who guess people's passwords or PINs. We are certain about this because the added security mechanism (fingerprint) requires the presence of the user of the software before login access can be granted, based on the placement of the user's finger on the fingerprint biometric scanner for capture and verification of the user's authenticity.
The study adopted a quantitative design, with an object-oriented design methodology. In the analysis and design, PHP, HTML5, CSS, Visual Studio, JavaScript, and Web 2.0 technologies were used to implement the model of the ISS for the cloud computing environment. Note: PHP, HTML5 and CSS were used in conjunction with the Visual Studio front-end design tools, MySQL and Access 7.0 were used for the backend engine, and JavaScript was used for object arrangement and for validation of user input as a security check. Finally, the performance of the developed framework was evaluated by comparison with two other existing security systems (auto-PIN and password) within the school, and the results showed that the developed approach (fingerprint) overcomes the two main weaknesses of the existing systems and will work perfectly well if fully implemented. Keywords: performance evaluation, fingerprint, auto-pin, password-based, security systems, cloud computing environment
Procedia PDF Downloads 141
15479 Development of a Quick On-Site Pass/Fail Test for the Evaluation of Fresh Concrete Destined for Application as Exposed Concrete
Authors: Laura Kupers, Julie Piérard, Niki Cauberg
Abstract:
The use of exposed concrete (sometimes referred to as architectural concrete) keeps gaining popularity. Exposed concrete has the advantage of combining the structural properties of concrete with an aesthetic finish. However, for a successful aesthetic finish, much attention needs to be paid to the execution (formwork, release agent, curing, weather conditions…), the concrete composition (choice of the raw materials and mix proportions) as well as its fresh properties. For the latter, a simple on-site pass/fail test could halt the casting of concrete not suitable for architectural concrete and thus avoid expensive repairs later. When architects opt for exposed concrete, they usually want a smooth, uniform and nearly blemish-free surface. For this, a standard ‘construction’ concrete does not suffice. An aesthetic surface finish requires the concrete to contain a minimum content of fines to minimize the risk of segregation and to allow complete filling of more complex-shaped formworks. The concrete may not be too viscous either, as this makes it more difficult to compact and increases the risk of blow holes blemishing the surface. On the other hand, too much bleeding may cause color differences on the concrete surface. An easy pass/fail test, which can be performed on site just before casting, could avoid these problems. If the fresh concrete fails the test, it can be rejected; only if the fresh concrete passes the test would it be cast. The pass/fail tests are intended for concrete with consistency class S4. Five tests were selected as possible on-site pass/fail tests. Two of these tests already exist: the K-slump test (ASTM C1362) and the Bauer Filter Press Test.
The remaining three tests were developed by the BBRI in order to test the segregation resistance of fresh concrete on site: the ‘dynamic sieve stability test’, the ‘inverted cone test’ and an adapted ‘visual stability index’ (VSI) for the slump and flow test. These tests were inspired by existing tests for self-compacting concrete, for which the segregation resistance is of great importance. The suitability of the fresh concrete mixtures was also tested by means of a laboratory reference test (resistance to segregation) and by visual inspection (blow holes, structure…) of small test walls. More than fifteen concrete mixtures of different quality were tested. The results of the pass/fail tests were compared with the results of this laboratory reference test and the test walls. The preliminary laboratory results indicate that concrete mixtures ‘suitable’ for placing as exposed concrete (containing sufficient fines, a balanced grading curve etc.) can be distinguished from ‘inferior’ concrete mixtures. Additional laboratory tests, as well as tests on site, will be conducted to confirm these preliminary results and to set appropriate pass/fail values. Keywords: exposed concrete, testing fresh concrete, segregation resistance, bleeding, consistency
Procedia PDF Downloads 425
15478 Strengthening Service Delivery to Improving Cervical Cancer Screening in Southwestern Nigeria: A Pilot Project
Authors: Afolabi K. Esther, Kuye Tolulope, Babafemi, L. Olayemi, Omikunle Yemisi
Abstract:
Background: Cervical cancer is a potentially preventable disease of public health significance. All sexually active women are at risk of cervical cancer; however, uptake and coverage are low in low- and middle-resource countries. Hence, the programme explored the feasibility of demonstrating an innovative and low-cost system approach to cervical cancer screening service delivery among reproductive-aged women in low-resource settings in Southwestern Nigeria. This was to promote the uptake and quality improvement of cervical cancer screening services. Methods: This study was an intervention project in three senatorial districts in Osun State that have primary, secondary and tertiary health facilities. The project was in three phases: pre-intervention, intervention, and post-intervention. The study utilised the existing infrastructure, facilities and staff in the project settings. The study population was nurse-midwives, community health workers and reproductive-aged women (30-49 years). The intervention phase entailed using innovative, culturally appropriate strategies to create awareness of cervical cancer and preventive health-seeking behaviour among women in the reproductive-aged group (30-49 years). Also, the service providers (community health workers, nurses, and midwives) were trained in screening methods and the treatment of pre-cancerous lesions, and essential equipment and supplies for cervical cancer screening services were provided at health facilities. In addition, advocacy and engagement with relevant stakeholders sought to integrate the cervical cancer screening services into related reproductive health services and secure a greater allocation of resources. The expected results compared the pre- and post-intervention phases using the baseline and process indicators, and assessed the effect of the intervention phase on screening coverage using a plausibility assessment design.
The project lasted 12 months: visual inspection with acetic acid (VIA) screening for the women for six months, and follow-up over 6 months for women receiving treatment. Results: The pre-intervention phase assessed baseline service delivery statistics for the previous 12 months, drawn from retrospective data collected as part of the routine monitoring and reporting systems. The uptake of cervical cancer screening services was low, as the number of women screened in the previous 12 months was 156. The competency level of service personnel was fair (54%), and there was limited availability of essential equipment and supplies for cervical cancer screening services. In the post-intervention phase, uptake had increased: the number of women screened was 1586 within six months in the study settings, roughly a tenfold increase over the baseline assessment. Also, the post-intervention level of competency of service delivery personnel had increased to 86.3%, which indicates a quality improvement in cervical cancer screening service delivery. Conclusion: The findings from the study have shown an effective approach to strengthening and improving cervical cancer screening service delivery in Southwestern Nigeria. Hence, the intervention promoted a positive attitude and health-seeking behaviour among the target population, significantly influencing the uptake of cervical cancer screening services. Keywords: cervical cancer, screening, Nigeria, health system strengthening
Procedia PDF Downloads 106
15477 Finite Element and Split Bregman Methods for Solving a Family of Optimal Control Problem with Partial Differential Equation Constraint
Authors: Mahmoud Lot
Abstract:
In this article, we discuss the solution of the elliptic optimal control problem. First, by using the finite element method, we obtain the discrete form of the problem. The resulting discrete problem is in fact a large-scale constrained optimization problem. Solving this optimization problem with traditional methods is difficult and requires a lot of CPU time and memory, but the split Bregman method converts the constrained problem into an unconstrained one, and hence saves time and memory. We then use the split Bregman method to solve this problem, and examples show the speed and accuracy of the split Bregman method for solving these types of problems. We also use the SQP method to solve the examples and compare it with the split Bregman method. Keywords: split Bregman method, optimal control with elliptic partial differential equation constraint, finite element method
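The split Bregman iteration itself is compact enough to sketch. The example below applies it to a simpler L1-regularized least-squares problem rather than the PDE-constrained control problem of the article; parameter values and data are illustrative:

```python
import numpy as np

def shrink(x, t):
    """Soft-thresholding: the closed-form proximal step for the L1 term."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def split_bregman_l1(A, f, mu=10.0, lam=1.0, n_iter=200):
    """Split Bregman iteration for min_u |u|_1 + (mu/2)||Au - f||^2.
    The splitting d = u is enforced through the Bregman variable b, so each
    subproblem is either a linear solve or an elementwise shrinkage."""
    m, n = A.shape
    u = np.zeros(n)
    d = np.zeros(n)
    b = np.zeros(n)
    lhs = mu * A.T @ A + lam * np.eye(n)   # u-update matrix is fixed
    for _ in range(n_iter):
        u = np.linalg.solve(lhs, mu * A.T @ f + lam * (d - b))
        d = shrink(u + b, 1.0 / lam)
        b = b + u - d
    return u

# Sparse recovery toy problem: 2 active components out of 8
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 8))
u_true = np.zeros(8)
u_true[2], u_true[5] = 1.5, -2.0
u = split_bregman_l1(A, A @ u_true)
print(np.round(u, 2))
```

The same alternation (linear solve, shrinkage, Bregman update) is what makes the method attractive for the discretized optimal control problem, where the linear solve involves the finite element system matrix.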
Procedia PDF Downloads 152
15476 Understanding Tacit Knowledge and Its Role in Military Organizations: Methods of Managing Tacit Knowledge
Authors: M. Erhan Orhan, Onur Ozdemir
Abstract:
The expansion of the area of operations and the increasing diversity of threats have forced military organizations to change in many ways. However, tacit knowledge is still the most fundamental component of organizational knowledge, since it is human-oriented, and in warfare the human stands at the core of the organization. Therefore, military organizations should find effective ways of systematically utilizing tacit knowledge. In this context, this article suggests some methods for turning tacit knowledge into explicit knowledge in military organizations. Keywords: tacit knowledge, military, knowledge management, warfare, technology
Procedia PDF Downloads 488
15475 Studying Second Language Development from a Complex Dynamic Systems Perspective
Authors: L. Freeborn
Abstract:
This paper discusses the application of complex dynamic system theory (DST) to the study of individual differences in second language development. This transdisciplinary framework allows researchers to view the trajectory of language development as a dynamic, non-linear process. A DST approach views language as multi-componential, consisting of multiple complex systems and nested layers. These multiple components and systems continuously interact and influence each other at both the macro- and micro-level. Dynamic systems theory aims to explain and describe the development of the language system, rather than make predictions about its trajectory. Such a holistic and ecological approach to second language development allows researchers to include various research methods from neurological, cognitive, and social perspectives. A DST perspective would involve in-depth analyses as well as mixed methods research. To illustrate, a neurobiological approach to second language development could include non-invasive neuroimaging techniques such as electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) to investigate areas of brain activation during language-related tasks. A cognitive framework would further include behavioural research methods to assess the influence of intelligence and personality traits, as well as individual differences in foreign language aptitude, such as phonetic coding ability and working memory capacity. Exploring second language development from a DST approach would also benefit from including perspectives from the field of applied linguistics, regarding the teaching context, second language input, and the role of affective factors such as motivation. 
In this way, applying mixed research methods from neurobiological, cognitive, and social approaches would enable researchers to have a more holistic view of the dynamic and complex processes of second language development. Keywords: dynamic systems theory, mixed methods, research design, second language development
Procedia PDF Downloads 136
15474 A Comparative Study on Creep Modeling in Composites
Authors: Roham Rafiee, Behzad Mazhari
Abstract:
Composite structures, having remarkable properties, have gained considerable popularity in the last few decades. Among all types, polymer matrix composites are used extensively due to their unique characteristics, including low weight, a convenient fabrication process and low cost. With a polymer as the matrix, these types of composites show different creep behavior compared to metals and even other types of composites, since most polymers undergo creep even at room temperature. One of the most challenging topics in creep is introducing new techniques for predicting the long-term creep behavior of materials; the appropriate method depends on the material being studied. Methods already proposed for predicting the long-term creep behavior of polymer matrix composites can be divided into five categories: (1) analytical modeling, (2) empirical modeling, (3) superposition-based (semi-empirical) modeling, (4) rheological modeling, (5) finite element modeling. Each of these methods has individual characteristics. Studies have shown that none of the mentioned methods can predict the long-term creep behavior of all PMC composites in all circumstances (loading, temperature, etc.), but each of them has its own merits in particular situations. The reason for this lies in the theoretical basis of these methods. In this study, after a brief review of the background theory of each method, the methods are compared in terms of their applicability in predicting the long-term behavior of composite structures. Finally, the methods discussed are examined against experimental studies executed by other researchers. Keywords: creep, comparative study, modeling, composite materials
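As a small illustration of the empirical modeling category above, Findley's power law e(t) = e0 + A·t^n is commonly fitted to creep strain data by linear regression in log space. The data below are synthetic, generated from known parameters purely to demonstrate the fit:

```python
import numpy as np

def fit_findley(t, strain):
    """Fit Findley's power law e(t) = e0 + A*t**n by least squares on
    log(strain - e0) = log(A) + n*log(t), taking e0 as the instantaneous
    (first) strain reading."""
    e0 = strain[0]
    mask = (t > 0) & (strain > e0)          # keep points valid in log space
    slope, intercept = np.polyfit(np.log(t[mask]),
                                  np.log(strain[mask] - e0), 1)
    return e0, np.exp(intercept), slope      # e0, A, n

# Synthetic creep curve generated from e0=0.002, A=1e-4, n=0.3
t = np.array([0.0, 1.0, 10.0, 100.0, 1000.0, 10000.0])   # time, e.g. hours
strain = 0.002 + 1e-4 * t ** 0.3
e0, a, n = fit_findley(t, strain)
print(round(n, 3))
```

On noiseless data the regression recovers the generating parameters exactly; with real creep measurements the same fit gives the best power-law approximation.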
Procedia PDF Downloads 443
15473 Examination Scheduling System with Proposed Algorithm
Authors: Tabrej Khan
Abstract:
The Examination Scheduling System (ESS) is a scheduling system aimed at exam committees in academic institutes, to help them manage exams automatically. We present an algorithm for an Examination Scheduling System. Nowadays, many universities struggle to create examination schedules quickly and with fewer conflicts than manual methods produce. Our aim is to develop a computerized system that can be used for examination scheduling in an academic institute against the available resources (time, hall, invigilator and instructor) with no contradictions, while achieving fairness among students. ESS was developed using HTML, the C# language, Crystal Reports and ASP.NET through Microsoft Visual Studio 2010 as development tools, with an integrated SQL Server database. This application can produce benefits such as reducing the time spent creating an exam schedule and achieving fairness among students. Keywords: examination scheduling system (ESS), algorithm, ASP.NET, crystal report
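The core timetabling problem can be illustrated with a greedy graph-colouring heuristic, a common starting point for exam scheduling. This is a sketch, not the algorithm proposed in the paper, and the exam names and conflict graph are hypothetical:

```python
def schedule_exams(conflicts, n_slots):
    """Greedy graph-colouring heuristic: exams that share students (an edge
    in `conflicts`) must get different time slots. Returns exam -> slot, or
    None if n_slots does not suffice for this greedy ordering."""
    # Most-constrained-first: schedule high-degree exams before low-degree ones
    exams = sorted(conflicts, key=lambda e: len(conflicts[e]), reverse=True)
    slots = {}
    for exam in exams:
        used = {slots[other] for other in conflicts[exam] if other in slots}
        free = [s for s in range(n_slots) if s not in used]
        if not free:
            return None
        slots[exam] = free[0]
    return slots

# Hypothetical conflict graph: an edge means "some student takes both exams"
conflicts = {
    "math": {"physics", "chemistry"},
    "physics": {"math", "biology"},
    "chemistry": {"math"},
    "biology": {"physics"},
}
timetable = schedule_exams(conflicts, n_slots=2)
print(timetable)
```

A full ESS would extend the same idea with hall capacities, invigilator availability and fairness constraints on top of the conflict-free colouring.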
Procedia PDF Downloads 405
15472 Solid Waste Management through Mushroom Cultivation: An Eco Friendly Approach
Authors: Mary Josephine
Abstract:
The waste of one process can be the input of another sector, reducing environmental pollution. Today, more and more solid waste is generated, but only a very small amount of it is recycled, so the threat that this environmental pressure poses to public health is serious. The methods considered for the treatment of solid waste, such as biogas tanks or processing into animal feed and fertilizer, have not performed well. An alternative approach is growing mushrooms on waste residues. This is regarded as an environmentally friendly solution with potential economic benefit. Substrate producers do their best to produce quality substrate at low cost. Among other methods, this can be achieved by employing biologically degradable wastes as the resource material component of the substrate. Mushroom growing is a significant tool for the restoration, replenishment and remediation of Earth's overburdened ecosphere. One of the rational methods of waste utilization involves locally available wastes. The present study aims to find the yield of mushrooms grown on locally available waste obtained for free, and to conserve our environment by recycling wastes. Keywords: biodegradable, environment, mushroom, remediation
Procedia PDF Downloads 398