Search results for: hard and soft constraint
1827 Death Due to Ulnar Artery Injury by a Glass Door: A Case Report
Authors: Ashok Kumar Rastogi
Abstract:
Glass is a material commonly used for doors, bottles, cookware, and containers. As a hard, blunt material it can be harmful: glass has been associated with severe injury and is a common cause of injuries warranting visits to the emergency department (ED). These injuries can be accidental or intentionally inflicted, and broken glass injuries can be severe, even fatal, particularly when shards strike the arm. Case history: The body of a 20-year-old male was found beside the road; the police were informed, and a video recording was seized during the investigation. In the recording, the person was in a drunken state (unable to walk and disoriented), wandering along a residential road. He approached a barber shop whose door was made of glass and suddenly struck the glass door forcefully with his right hand. The door broke into multiple pieces, and multiple injuries were seen over the right hand. Observations: Multiple small and large lacerations were seen over the right anterior part of the elbow. The main injury resembled an incised wound caused by a hard and sharp object. It was noted as a bone-deep laceration of 13 x 06 cm placed obliquely over the anteromedial aspect of the right elbow joint, its medial end at the medial end of the elbow joint and its anterior end 04 cm below the joint, with laceration of the underlying brachialis muscle and complete transection of the ulnar artery and vein; the skin margins looked sharply cut but irregular, with tiny cuts at the medial lower border of the laceration. The injuries were antemortem and fresh, caused by a hard and blunt object yet resembling wounds from a hard and sharp object. All organs were pale, and the cause of death was shock and hemorrhage due to the ulnar vessel injury. Conclusion: The findings of this case report highlight the potentially lethal consequences of glass injuries, especially those involving glass doors. The study underscores the importance of accurate interpretation and identification of wounds caused by glass, as they may resemble injuries caused by other objects. It emphasizes the challenges faced by autopsy surgeons in determining the cause and manner of death when visual evidence of the injury is absent or the weapon is not recovered. Ultimately, this case report serves as a reminder of the potential dangers posed by glass and the importance of comprehensive forensic examination. Keywords: glass door, incised wound, laceration, autopsy
Procedia PDF Downloads 76
1826 Fabrication of Antimicrobial Dental Model Using Digital Light Processing (DLP) Integrated with 3D-Bioprinting Technology
Authors: Rana Mohamed, Ahmed E. Gomaa, Gehan Safwat, Ayman Diab
Abstract:
Background: Bio-fabrication is a multidisciplinary research field that combines several principles, fabrication techniques, and protocols from different fields. The open-source-software movement is a movement that supports the use of open-source licenses for some or all software as part of the broader notion of open collaboration. Additive manufacturing is the concept of 3D printing, where it is a manufacturing method through adding layer-by-layer using computer-aided designs (CAD). There are several types of AM system used, and they can be categorized by the type of process used. One of these AM technologies is Digital light processing (DLP) which is a 3D printing technology used to rapidly cure a photopolymer resin to create hard scaffolds. DLP uses a projected light source to cure (Harden or crosslinking) the entire layer at once. Current applications of DLP are focused on dental and medical applications. Other developments have been made in this field, leading to the revolutionary field 3D bioprinting. The open-source movement was started to spread the concept of open-source software to provide software or hardware that is cheaper, reliable, and has better quality. Objective: Modification of desktop 3D printer into 3D bio-printer and the integration of DLP technology and bio-fabrication to produce an antibacterial dental model. Method: Modification of a desktop 3D printer into a 3D bioprinter. Gelatin hydrogel and sodium alginate hydrogel were prepared with different concentrations. Rhizome of Zingiber officinale, Flower buds of Syzygium aromaticum, and Bulbs of Allium sativum were extracted, and extractions were selected on different levels (Powder, aqueous extracts, total oils, and Essential oils) prepared for antibacterial bioactivity. Agar well diffusion method along with the E. coli have been used to perform the sensitivity test for the antibacterial activity of the extracts acquired by Zingiber officinale, Syzygium aromaticum, and Allium sativum. Lastly, DLP printing was performed to produce several dental models with the natural extracted combined with hydrogel to represent and simulate the Hard and Soft tissues. Result: The desktop 3D printer was modified into 3D bioprinter using open-source software Marline and modified custom-made 3D printed parts. Sodium alginate hydrogel and gelatin hydrogel were prepared at 5% (w/v), 10% (w/v), and 15%(w/v). Resin integration with the natural extracts of Rhizome of Zingiber officinale, Flower buds of Syzygium aromaticum, and Bulbs of Allium sativum was done following the percentage 1- 3% for each extract. Finally, the Antimicrobial dental model was printed; exhibits the antimicrobial activity, followed by merging with sodium alginate hydrogel. Conclusion: The open-source movement was successful in modifying and producing a low-cost Desktop 3D Bioprinter showing the potential of further enhancement in such scope. Additionally, the potential of integrating the DLP technology with bioprinting is a promising step toward the usage of the antimicrobial activity using natural products.Keywords: 3D printing, 3D bio-printing, DLP, hydrogel, antibacterial activity, zingiber officinale, syzygium aromaticum, allium sativum, panax ginseng, dental applications
Procedia PDF Downloads 94
1825 Digital Forensics Compute Cluster: A High Speed Distributed Computing Capability for Digital Forensics
Authors: Daniel Gonzales, Zev Winkelman, Trung Tran, Ricardo Sanchez, Dulani Woods, John Hollywood
Abstract:
We have developed a distributed computing capability, Digital Forensics Compute Cluster (DFORC2) to speed up the ingestion and processing of digital evidence that is resident on computer hard drives. DFORC2 parallelizes evidence ingestion and file processing steps. It can be run on a standalone computer cluster or in the Amazon Web Services (AWS) cloud. When running in a virtualized computing environment, its cluster resources can be dynamically scaled up or down using Kubernetes. DFORC2 is an open source project that uses Autopsy, Apache Spark and Kafka, and other open source software packages. It extends the proven open source digital forensics capabilities of Autopsy to compute clusters and cloud architectures, so digital forensics tasks can be accomplished efficiently by a scalable array of cluster compute nodes. In this paper, we describe DFORC2 and compare it with a standalone version of Autopsy when both are used to process evidence from hard drives of different sizes.Keywords: digital forensics, cloud computing, cyber security, spark, Kubernetes, Kafka
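As an illustration of the kind of file-processing step that DFORC2 parallelizes, the following minimal PySpark sketch hashes files extracted from a hard drive image across a cluster. It is an assumption-laden illustration, not the DFORC2 source: the file-list name and the choice of SHA-256 are placeholders, and the actual pipeline also involves Autopsy and Kafka stages not shown here.

```python
# Illustrative only: parallel hashing of extracted files with Apache Spark,
# standing in for one of the file-processing steps DFORC2 distributes.
import hashlib
from pyspark import SparkContext

def sha256_of(path):
    """Return (path, SHA-256 digest) for one extracted file (path must be reachable from workers)."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return path, h.hexdigest()

if __name__ == "__main__":
    sc = SparkContext(appName="evidence-hashing-sketch")
    paths = sc.textFile("file_list.txt")          # hypothetical list: one file path per line
    digests = paths.map(sha256_of).collect()      # work is spread over the cluster nodes
    for path, digest in digests:
        print(path, digest)
    sc.stop()
```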
Procedia PDF Downloads 393
1824 A Radiographic Survey of Eggshell Powder Effect on Tibial Bone Defect Repair Tested in Dog
Authors: M. Yadegari, M. Nourbakhsh, N. Arbabzadeh
Abstract:
Injuries to the skeletal system are of major importance, and the use of materials that support hard tissue repair is recommended in both open and closed fractures. Complex minerals with a beneficial effect on hard tissue repair, stimulating cell growth in the bone, are of particular interest, and materials that help avoid the inflammatory reaction to a bone fracture and speed up its repair are of utmost importance in fracture treatment. In addition to minerals, the inner eggshell membrane contains carbohydrates, lipids, and proteins, and eggshell has a high pH, a high calcium absorptive capacity, and an ability to accelerate bone fracture repair. In the present radiographic survey, eggshell-derived bone graft substitutes were used for bone defect repair in 8 dog tibiae, and bone density was measured on the day of implant placement and 30 and 60 days after placement. The results of this study show differences in bone growth and in the occurrence of misshapen bone between treatment and control sites: cell growth was adequate at treatment sites, and misshapen bone was less frequent there than at control sites. Keywords: bone repair, eggshell powder, implant, radiography
Procedia PDF Downloads 322
1823 Infrastructure Investment Law Formulation to Ensure Low Transaction Cost at Policy Level: Case Study of Public Private Partnership Project at the Ministry of Public Works and Housing of the Republic of Indonesia
Authors: Yolanda Indah Permatasari, Sudarsono Hardjosoekarto
Abstract:
The public-private partnership (PPP) scheme is considered an alternative source of funding for infrastructure provision. However, the performance of the PPP scheme and the interest of the private sector in participating in infrastructure provision remain low in practice. This phenomenon motivated the research to reconstruct the form of collaborative governance at the policy level from the perspective of the transaction costs of the PPP scheme. Soft systems methodology (SSM)-based action research was used as the research methodology. The study concludes that transaction cost sources emerge at the policy level because of the absence of a law governing infrastructure investment, especially the implementation of the PPP scheme. This absence causes an imbalance in risk allocation and risk mitigation between the public and private sectors. The research therefore recommends the formulation of an infrastructure investment law that aims to minimize information asymmetry, anticipate principal-principal problems, and provide a legal basis that ensures risk certainty and guarantees fair risk allocation between the public and private sectors. Keywords: public governance, public private partnership, soft system methodology, transaction cost
Procedia PDF Downloads 140
1822 Current Status of Ir-192 Brachytherapy in Bangladesh
Authors: M. Safiqul Islam, Md Arafat Hossain Sarkar
Abstract:
Brachytherapy is one of the most important cancer treatment modalities in a radiotherapy department. Brachytherapy practice has moved from low dose rate (LDR) afterloaders to high dose rate (HDR) afterloaders because of the radiation protection advantage. HDR brachytherapy is a highly versatile system for enhancing cure and achieving palliation in many cancers that are common in developing countries. It is a type of internal radiation therapy that delivers radiation from implants placed close to, or inside, the tumor(s) in the body. The procedure is very effective at providing localized radiation to the tumor site while minimizing the patient's whole-body dose. Brachytherapy has proven to be a highly successful treatment for cancers of the prostate, cervix, endometrium, breast, skin, bronchus, esophagus, and head and neck, as well as soft tissue sarcomas and several other types of cancer. At present, our country has ten new HDR remote afterloading brachytherapy units, of which four have already been installed and are running for patient treatment. The Ir-192 source is more convenient than Co-60, and expert personnel therefore prefer Ir-192 for treating different kinds of cancer patients; Ir-192 is also more economical, flexible, and familiar in our country. Keywords: Ir-192, brachytherapy, cancer treatment, prostate, cervix, endometrium, breast, skin, bronchus, esophagus, soft tissue sarcomas
Procedia PDF Downloads 431
1821 Theoretical Method for Full Ab-Initio Calculation of Rhenium Carbide Compound
Abstract:
First-principles calculations are carried out to investigate the structural, electronic, and elastic properties of an ultra-incompressible material, the noble metal carbide rhenium carbide (ReC), in four phases: rocksalt (NaCl, B1), zinc blende (ZB, B3), tungsten carbide (WC, Bh), and nickel arsenide (NiAs, B8). Ground-state properties such as the equilibrium lattice constant, elastic constants, bulk modulus and its pressure derivative, and the hardness of ReC in these phases are systematically predicted from first principles. The calculated bulk modulus is comparable with that of diamond; in particular, for B8-type ReC the incompressibility along the c axis is demonstrated to exceed the linear incompressibility of diamond. Our calculations confirm that ReC is stable in the nickel arsenide (B8) structure with a large bulk modulus B = 440 GPa, and that the tungsten carbide (WC) structure is more favourable than the B3 and B1 structures, so that WC-type ReC is metastable. Furthermore, the high bulk modulus values in the zinc blende (B3), rocksalt (B1), tungsten carbide (WC), and nickel arsenide (B8) structures (294 GPa, 401 GPa, 415 GPa and 447 GPa, respectively) indicate that ReC is a hard material and a superhard compound, with H(B8) = 36 GPa compared with H(diamond) = 96 GPa and H(c-BN) = 63.10 GPa. Keywords: DFT, FP-LMTO, mechanical properties, elasticity, high pressure, thermodynamic properties, hard material
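The abstract does not state how the bulk modulus and its pressure derivative were extracted; a common post-processing step in such first-principles studies is to fit the calculated energy-volume curve to the third-order Birch-Murnaghan equation of state, as in the hedged sketch below (the E-V points are illustrative placeholders, not the paper's data).

```python
# Sketch (assumption, not from the paper): extracting the bulk modulus B0 and its
# pressure derivative B0' by fitting total energies E(V) to the third-order
# Birch-Murnaghan equation of state.
import numpy as np
from scipy.optimize import curve_fit

def birch_murnaghan(V, E0, V0, B0, B0p):
    eta = (V0 / V) ** (2.0 / 3.0)
    return E0 + 9.0 * V0 * B0 / 16.0 * (
        (eta - 1.0) ** 3 * B0p + (eta - 1.0) ** 2 * (6.0 - 4.0 * eta)
    )

# Hypothetical volume (A^3) / energy (eV) pairs from a series of static calculations.
V = np.array([13.0, 13.5, 14.0, 14.5, 15.0, 15.5, 16.0])
E = np.array([-19.99, -20.14, -20.23, -20.26, -20.24, -20.17, -20.06])

p0 = [E.min(), V[np.argmin(E)], 2.5, 4.0]            # initial guess (B0 in eV/A^3)
(E0, V0, B0, B0p), _ = curve_fit(birch_murnaghan, V, E, p0=p0)
print(f"V0 = {V0:.2f} A^3, B0 = {B0 * 160.2177:.0f} GPa, B0' = {B0p:.2f}")  # eV/A^3 -> GPa
```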
Procedia PDF Downloads 441
1820 Determination of the Effects of Physico-Chemical Parameters on Groundwater Status by Water Quality Index
Authors: Samaneh Abolli, Mahdi Ahmadi Nasab, Kamyar Yaghmaeian, Mahmood Alimohammadi
Abstract:
The quality of drinking water depends not only on its physicochemical parameters but also on the type and geographical location of the water source. In this study, groundwater quality was investigated by sampling total dissolved solids (TDS), electrical conductivity (EC), total hardness (TH), Cl, Ca²⁺, and Mg²⁺ at 13 sites; 40 water samples were sent to the laboratory, and electrometric, titration, and spectrophotometric methods were used. The water quality index (WQI) was then used to investigate the impact and weight of each parameter in the groundwater. The results showed that only the mean magnesium concentration (40.88 mg/l) was below the World Health Organization (WHO) guideline. Interpreting the WQI against the WHO guidelines showed that 21, 11, and 7 samples were of very poor, poor, and average quality, respectively, while one sample was of excellent quality. Among the studied parameters, the means of EC (2,087.49 mS/cm) and Cl (1,015.87 mg/l) exceeded the global and national limits. In terms of TH, the water was classified as very hard (87.5%), hard (7.5%), and moderately hard (5%). Based on the geographical distribution, the drinking water index at sites 4 and 11 did not show acceptable quality. Chloride was identified as the responsible pollutant and the ion contributing most to raising the index. Statistical tests and Spearman correlation showed significant, direct correlations (p < 0.05, r > 0.7) between TDS, EC, and chloride; between EC and chloride; and between TH, Ca²⁺, and Mg²⁺. Keywords: water quality index, groundwater, chloride, GIS, Garmsar
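The abstract does not reproduce the WQI formulation used; a widely applied variant is the weighted arithmetic index, sketched below with placeholder standards (the actual WHO/national limits and weights applied in the study may differ).

```python
# Illustrative sketch of a weighted arithmetic Water Quality Index. The limits below
# are placeholder values, not necessarily those used in the study.
def weighted_arithmetic_wqi(measured, standards):
    """measured, standards: dicts keyed by parameter name (same units)."""
    k = 1.0 / sum(1.0 / s for s in standards.values())      # proportionality constant
    weights = {p: k / s for p, s in standards.items()}       # relative weight per parameter
    ratings = {p: 100.0 * measured[p] / standards[p] for p in standards}  # quality rating
    return sum(weights[p] * ratings[p] for p in standards) / sum(weights.values())

sample = {"TDS": 1200.0, "EC": 2087.0, "TH": 560.0, "Cl": 1015.0, "Ca": 150.0, "Mg": 41.0}
limits = {"TDS": 1000.0, "EC": 1500.0, "TH": 500.0, "Cl": 250.0, "Ca": 200.0, "Mg": 50.0}
print(f"WQI = {weighted_arithmetic_wqi(sample, limits):.1f}")  # values above ~100 indicate poor quality
```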
Procedia PDF Downloads 102
1819 Microfluidized Fiber Based Oleogels for Encapsulation of Lycopene
Authors: Behic Mert
Abstract:
This study reports a facile approach to structuring soft solids from microfluidized, lycopene-rich plant material and oil. First, carotenoid-rich plant material (pumpkin in this study) was processed with a high-pressure microfluidizer to release lycopene molecules, and an emulsion was then formed by mixing the processed plant material with oil. While the lipid-soluble carotenoid molecules dissolved in the oil phase of the emulsion, the fiber fraction of the plant material provided the network required for emulsion stabilization. Additional hydrocolloids (gelatin, xanthan, and pectin) at up to 0.5% were also used to reinforce emulsion stability, and their impact on the final product properties was evaluated via rheological, textural, and oxidation studies. Finally, water was removed from the emulsion phase by drying in a tray dryer at 40°C for 36 hours, and subsequent shearing resulted in soft solid (oleogel) structures. The microstructure of these systems was revealed by cryo-scanning electron microscopy. The effect of the hydrocolloids on total and surface lycopene contents was also evaluated: surface lycopene was lowest in gelatin-containing oleogels and highest in pectin-containing oleogels. This study outlines a novel emulsion-based structuring method that can be used to encapsulate lycopene without the need for its separate extraction. Keywords: lycopene, encapsulation, fiber, oleogel
Procedia PDF Downloads 266
1818 Neural Networks-based Acoustic Annoyance Model for Laptop Hard Disk Drive
Authors: Yichao Ma, Chengsiong Chin, Wailok Woo
Abstract:
Over the last decade, there has been rapid growth in digital multimedia, such as high-resolution media files and three-dimensional movies. Hence, there is a need for large digital storage such as the hard disk drive (HDD), and users expect a quieter HDD in their laptops. In this paper, a jury test was conducted on a group of 34 people, of whom 17 were students representing potential consumers and the remainder were engineers familiar with HDDs. A total of 13 HDD sound samples were selected from over a hundred HDD noise recordings; these samples were selected based on an agreed subjective impression. The samples were played to the participants using a head acoustics playback system, which enabled them to experience an environment as close as possible to the one in which the recordings were made. The analysis indicated that the two groups perceive the noises differently. Two acoustic annoyance models were then established based on back-propagation neural networks. Four psychoacoustic metrics, loudness, sharpness, roughness, and fluctuation strength, are used as the inputs of the model, and the subjective evaluation results are taken as the output. The developed models are reasonably accurate in simulating both the training and test samples. Keywords: HDD noise, jury test, neural network model, psychoacoustic annoyance
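A minimal sketch of such a back-propagation model, mapping the four psychoacoustic inputs to an annoyance rating, is shown below using scikit-learn. The architecture, scaling, and the randomly generated training data are assumptions for illustration; they do not reproduce the authors' network or the 13 jury-rated samples.

```python
# Sketch only: a small back-propagation network mapping the four psychoacoustic
# metrics (loudness, sharpness, roughness, fluctuation strength) to a mean jury
# annoyance rating. Training data below are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.uniform([1.0, 0.5, 0.5, 0.1], [6.0, 2.5, 3.0, 1.5], size=(40, 4))   # metric values
y = 0.8 * X[:, 0] + 0.5 * X[:, 1] + 0.9 * X[:, 2] + 0.3 * X[:, 3] + rng.normal(0, 0.1, 40)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0))
model.fit(X, y)                        # back-propagation training
new_sample = [[3.2, 1.1, 1.8, 0.6]]    # loudness, sharpness, roughness, fluctuation strength
print("predicted annoyance:", model.predict(new_sample)[0])
```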
Procedia PDF Downloads 438
1817 Application of Chemical Tests for the Inhibition of Scaling From Hamma Hard Waters
Authors: Samira Ghizellaoui, Manel Boumagoura
Abstract:
Calcium carbonate precipitation is a widespread problem, especially in hard water systems. The main supply of drinking water to the city of Constantine is groundwater known as Hamma water. This water has a very high hardness of around 590 mg/L CaCO₃, which leads to the formation of scale, consisting mainly of calcium carbonate, that can be responsible for the clogging of valves and the deterioration of equipment (water heaters, washing machines) as well as encrustation in the pipes. Plant extracts used as scale inhibitors have attracted the attention of several researchers; in recent years, green inhibitors have attracted great interest because they are biodegradable, non-toxic, and do not affect the environment. The aim of our work is to evaluate the effectiveness of a chemical antiscale treatment in the presence of three green inhibitors, gallic acid, quercetin, and alginate, and three mixtures: (gallic acid-quercetin), (quercetin-alginate), and (gallic acid-alginate). The results show that the inhibitory effect appears from an addition of 1 mg/L of gallic acid, 10 mg/L of quercetin, 0.2 mg/L of alginate, 0.4 mg/L of (gallic acid-quercetin), 2 mg/L of (quercetin-alginate), and 0.4 mg/L of (gallic acid-alginate). On the other hand, 100 mg/L of Ca²⁺ (the drinking water standard) is reached for partial softening at 4 mg/L of gallic acid, 40 mg/L of quercetin, 0.6 mg/L of alginate, 4 mg/L of (gallic acid-quercetin), 10 mg/L of (quercetin-alginate), and 1.6 mg/L of (gallic acid-alginate). Keywords: water, scaling, calcium carbonate, green inhibitor
Procedia PDF Downloads 68
1816 Development of Wear Resistant Ceramic Coating on Steel Using High Velocity Oxygen Flame Thermal Spray
Authors: Abhijit Pattnayak, Abhijith N.V, Deepak Kumar, Jayant Jain, Vijay Chaudhry
Abstract:
Hard and dense ceramic coatings deposited on the surface provide the ideal solution to the poor tribological properties exhibited by some popular stainless steels like EN-36, 17-4PH, etc. These steels are widely used in nuclear, fertilizer, food processing, and marine industries under extreme environmental conditions. The present study focuses on the development of Al₂O₃-CeO₂-rGO-based coatings on the surface of 17-4PH steel using High-Velocity Oxygen Flame (HVOF) thermal spray process. The coating is developed using an oxyacetylene flame. Further, we report the physical (Density, Surface roughness, Surface energetics), Metallurgical (Scanning electron microscopy, X-ray diffraction, Raman), Mechanical (Hardness(Vickers and Nano Hard-ness)), Tribological (Wear, Scratch hardness) and Chemical (corrosion) characterization of both As-sprayed coating and the Substrate (17-4 PH steel). The comparison of the properties will help us to understand the microstructure-property relationship of the coating and reveal the necessity and challenges of such coatings.Keywords: thermal spray process, HVOF, ceramic coating, hardness, wear, corrosion
Procedia PDF Downloads 94
1815 Internet Optimization by Negotiating Traffic Times
Authors: Carlos Gonzalez
Abstract:
This paper describes a system to optimize the use of the internet by clients requiring downloading of videos at peak hours. The system consists of a web server belonging to a provider of video contents, a provider of internet communications and a software application running on a client’s computer. The client using the application software will communicate to the video provider a list of the client’s future video demands. The video provider calculates which videos are going to be more in demand for download in the immediate future, and proceeds to request the internet provider the most optimal hours to do the downloading. The times of the downloading will be sent to the application software, which will use the information of pre-established hours negotiated between the video provider and the internet provider to download those videos. The videos will be saved in a special protected section of the user’s hard disk, which will only be accessed by the application software in the client’s computer. When the client is ready to see a video, the application will search the list of current existent videos in the area of the hard disk; if it does exist, it will use this video directly without the need for internet access. We found that the best way to optimize the download traffic of videos is by negotiation between the internet communication provider and the video content provider.Keywords: internet optimization, video download, future demands, secure storage
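A toy sketch of the negotiation step is given below: the video provider ranks its predicted demand and assigns each title to one of the off-peak windows returned by the internet provider, and the client application later downloads at the assigned time. All names, data structures, and times are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the authors' code): match the provider's predicted demand
# to the ISP's negotiated off-peak slots; the client downloads each title at its slot
# and stores it in the protected area of the hard disk.
from datetime import datetime

def negotiate_schedule(predicted_demand, off_peak_slots):
    """predicted_demand: {video_id: expected_requests}; off_peak_slots: list of datetimes."""
    ranked = sorted(predicted_demand, key=predicted_demand.get, reverse=True)
    return {video: slot for video, slot in zip(ranked, off_peak_slots)}

demand = {"movie_a": 1200, "movie_b": 450, "series_c_ep1": 2300}
slots = [datetime(2024, 5, 1, 2, 0), datetime(2024, 5, 1, 3, 0), datetime(2024, 5, 1, 4, 0)]

for video, when in negotiate_schedule(demand, slots).items():
    print(f"download {video} at {when:%H:%M} and store it in the protected disk area")
```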
Procedia PDF Downloads 136
1814 Using Industrial Service Quality to Assess Service Quality Perception in Television Advertisement: A Case Study
Authors: Ana L. Martins, Rita S. Saraiva, João C. Ferreira
Abstract:
Much effort has been placed on the assessment of perceived service quality. Several models can be found in literature, but these are mainly focused on business-to-consumer (B2C) relationships. Literature on how to assess perceived quality in business-to-business (B2B) contexts is scarce both conceptually and in terms of its application. This research aims at filling this gap in literature by applying INDSERV to a case study situation. Under this scope, this research aims at analyzing the adequacy of the proposed assessment tool to other context besides the one where it was developed and by doing so analyzing the perceive quality of the advertisement service provided by a specific television network to its B2B customers. The INDSERV scale was adopted and applied to a sample of 33 clients, via questionnaires adapted to interviews. Data was collected in person or phone. Both quantitative and qualitative data collection was performed. Qualitative data analysis followed content analysis protocol. Quantitative analysis used hypotheses testing. Findings allowed to conclude that the perceived quality of the television service provided by television network is very positive, being the Soft Process Quality the parameter that reveals the highest perceived quality of the service as opposed to Potential Quality. To this end, some comments and suggestions were made by the clients regarding each one of these service quality parameters. Based on the hypotheses testing, it was noticed that only advertisement clients that maintain a connection to the television network from 5 to 10 years do show a significant different perception of the TV advertisement service provided by the company in what the Hard Process Quality parameter is concerned. Through the collected data content analysis, it was possible to obtain the percentage of clients which share the same opinions and suggestions for improvement. Finally, based on one of the four service quality parameter in a B2B context, managerial suggestions were developed aiming at improving the television network advertisement perceived quality service.Keywords: B2B, case study, INDSERV, perceived service quality
Procedia PDF Downloads 206
1813 Geotechnical and Mineralogical Properties of Clay Soils in the Second Organized Industrial Region, Konya, Turkey
Authors: Mustafa Yıldız, Ali Ulvi Uzer, Murat Olgun
Abstract:
In this study, the geotechnical and mineralogical properties of the gypsum-bearing clay deposits that form the ground of the Second Organized Industrial Zone in Konya province were investigated through comprehensive field and laboratory experiments. Although sufficient geotechnical research has not yet been performed, intensive construction in the region is continuing. The study area consists of mid-lake sediments formed by gypsum-bearing soft silt-clay deposits extending over a large area. To determine the soil profile and geotechnical properties, 18 boreholes were drilled, and disturbed and undisturbed soil samples were taken with Shelby tubes at 1.5 m intervals. Tests were performed on these samples to determine the index and strength properties of the soil. In addition, Standard Penetration Tests were carried out at 1.5 m intervals in all boreholes. To determine the mineralogical characteristics of the soil, whole-rock and XRD analyses were carried out on 6 samples taken from various depths through the soil profile. The strength and compressibility characteristics of the soil were defined through correlations using laboratory and field test results. The unconfined compressive strength, undrained cohesion, and compression index vary between 16 kN/m² and 405.4 kN/m², 6.5 kN/m² and 72 kN/m², and 0.066 and 0.864, respectively. Keywords: Konya second organized industrial region, strength, compressibility, soft clay
Procedia PDF Downloads 309
1812 Prediction of Temperature Distribution during Drilling Process Using Artificial Neural Network
Authors: Ali Reza Tahavvor, Saeed Hosseini, Nazli Jowkar, Afshin Karimzadeh Fard
Abstract:
Experimental and numerical study of the temperature distribution during the milling process is important for milling quality and tool life. In the present study, the milling cross-section temperature is determined using artificial neural networks (ANN) from the temperature of certain points of the workpiece, the specifications of those points, and the rotational speed of the milling blade. First, a three-dimensional model of the workpiece is built, and then, using computational heat transfer (CHT) simulations, the temperature at different nodes of the workpiece is obtained under steady-state conditions. The results obtained from CHT are used for training and testing the ANN. Setting the desired x, y, z coordinates and the milling rotational speed of the blade as input data to the network, the milling surface temperature determined by the neural network is produced as output. The temperatures at the desired points for different blade rotational speeds are obtained experimentally, the milling surface temperature is obtained by extrapolation, and a comparison is performed among the ANN predictions, the CHT results, and the experimental data. It is observed that the ANN code can be used efficiently to determine the temperature in a milling process. Keywords: artificial neural networks, milling process, rotational speed, temperature
Procedia PDF Downloads 405
1811 Ni-W-P Alloy Coating as an Alternate to Electroplated Hard Cr Coating
Authors: S. K. Ghosh, C. Srivastava, P. K. Limaye, V. Kain
Abstract:
Electroplated hard chromium is widely known in coatings and surface finishing, automobile and aerospace industries because of its excellent hardness, wear resistance and corrosion properties. However, its precursor, Cr+6 is highly carcinogenic in nature and a consensus has been adopted internationally to eradicate this coating technology with an alternative one. The search for alternate coatings to electroplated hard chrome is continuing worldwide. Various alloys and nanocomposites like Co-W alloys, Ni-Graphene, Ni-diamond nanocomposites etc. have already shown promising results in this regard. Basically, in this study, electroless Ni-P alloys with excellent corrosion resistance was taken as the base matrix and incorporation of tungsten as third alloying element was considered to improve the hardness and wear resistance of the resultant alloy coating. The present work is focused on the preparation of Ni–W–P coatings by electrodeposition with different content of phosphorous and its effect on the electrochemical, mechanical and tribological performances. The results were also compared with Ni-W alloys. Composition analysis by EDS showed deposition of Ni-32.85 wt% W-3.84 wt% P (designated as Ni-W-LP) and Ni-18.55 wt% W-8.73 wt% P (designated as Ni-W-HP) alloy coatings from electrolytes containing of 0.006 and 0.01M sodium hypophosphite respectively. Inhibition of tungsten deposition in the presence of phosphorous was noted. SEM investigation showed cauliflower like growth along with few microcracks. The as-deposited Ni-W-P alloy coating was amorphous in nature as confirmed by XRD investigation and step-wise crystallization was noticed upon annealing at higher temperatures. For all the coatings, the nanohardness was found to increase after heat-treatment and typical nanonahardness values obtained for 400°C annealed samples were 18.65±0.20 GPa, 20.03±0.25 GPa, and 19.17±0.25 for alloy coatings Ni-W, Ni-W-LP and Ni-W-HP respectively. Therefore, the nanohardness data show very promising results. Wear and coefficient of friction data were recorded by applying a different normal load in reciprocating motion using a ball on plate geometry. Post experiment, the wear mechanism was established by detail investigation of wear-scar morphology. Potentiodynamic measurements showed coating with a high content of phosphorous was most corrosion resistant in 3.5wt% NaCl solution.Keywords: corrosion, electrodeposition, nanohardness, Ni-W-P alloy coating
Procedia PDF Downloads 348
1810 Characterization of Mg/Sc System for X-Ray Spectroscopy in the Water Window Range
Authors: Hina Verma, Karine Le Guen, Mohammed H. Modi, Rajnish Dhawan, Philippe Jonnard
Abstract:
Periodic multilayer mirrors have potential application as optical components in X-ray microscopy, particularly working in the water window region. The water window range, located between the absorption edges of carbon (285 eV) and oxygen (530eV), along with the presence of nitrogen K absorption edge (395 eV), makes it a powerful method for imaging biological samples due to the natural optical contrast between water and carbon. We characterized bilayer, trilayer, quadrilayer, and multilayer systems of Mg/Sc with ZrC thin layers introduced as a barrier layer and capping layer prepared by ion beam sputtering. The introduction of ZrC as a barrier layer is expected to improve the structure of the Mg/Sc system. The ZrC capping layer also prevents the stack from oxidation. The structural analysis of the Mg/Sc systems was carried out by using grazing incidence X-ray reflectivity (GIXRR) to obtain non-destructively a first description of the structural parameters, thickness, roughness, and density of the layers. Resonant soft X-ray reflectivity measurements in the vicinity of Sc L-absorption edge were performed to investigate and quantify the atomic distribution of deposited layers. Near absorption edge, the atomic scattering factor of an element changes sharply depending on its chemical environment inside the structure.Keywords: buried interfaces, resonant soft X-ray reflectivity, X-ray optics, X-ray reflectivity
Procedia PDF Downloads 177
1809 Identity and Mental Adaptation of Deaf and Hard-of-Hearing Students
Authors: N. F. Mikhailova, M. E. Fattakhova, M. A. Mironova, E. V. Vyacheslavova
Abstract:
For the mental and social adaptation of the deaf and hard-of-hearing people, cultural and social aspects - the formation of identity (acculturation) and educational conditions – are highly significant. We studied 137 deaf and hard-of-hearing students in different educational situations. We used these methods: Big Five (Costa & McCrae, 1997), TRF (Becker, 1989), WCQ (Lazarus & Folkman, 1988), self-esteem, and coping strategies (Jambor & Elliott, 2005), self-stigma scale (Mikhailov, 2008). Type of self-identification of students depended on the degree of deafness, type of education, method of communication in the family: large hearing loss, education in schools for deaf, and gesture communication increased the likelihood of a 'deaf' acculturation. Less hearing loss, inclusive education in public school or school for the hearing-impaired, mixed communication in the family contributed to the formation of 'hearing' acculturation. The choice of specific coping depended on the degree of deafness: a large hearing loss increased coping 'withdrawal into the deaf world' and decreased 'bicultural skills' coping. People with mild hearing loss tended to cover-up it. In the context of ongoing discussion, we researched personality characteristics in deaf and hard on-hearing students, coping and other deafness associated factors depending on their acculturation type. Students who identified themselves with the 'hearing world' had a high self-esteem, a higher level of extraversion, self-awareness, personal resources, willingness to cooperate, better psychological health, emotional stability, higher ability to empathy, a greater satiety of life with feelings and sense and high sense of self-worth. They also actively used strategies, problem-solving, acceptance of responsibility, positive revaluation. Student who limited themselves within the culture of deaf people had more severe hearing loss and accordingly had more communication barriers. Lack of use or seldom use of coping strategies by these students point at decreased level of stress in their life. Their self-esteem have not been challenged in the specific social environment of the students with the same severity of defect, and thus this environment provided sense of comfort (we can assume that from the high scores on psychological health, personality resources, and emotional stability). Students with bicultural acculturation had higher level of psychological resources - they used Positive Reappraisal coping more often and had a higher level of psychological health. Lack of belonging to certain culture (marginality) leads to personality disintegration, social and psychological disadaptation: deaf and hard-of-hearing students with marginal identification had a lower self-estimation level, worse psychological health and personal resources, lower level of extroversion, self-confidence and life satisfaction. They, in fact, become 'risk group' (many of them dropped out of universities, divorced, and one even ended up in the ranks of ISIS). All these data argue the importance of cultural 'anchor' for people with hearing deprivation. Supported by the RFBR No 19-013-00406.Keywords: acculturation, coping, deafness, marginality
Procedia PDF Downloads 204
1808 Analysis of Secondary Peak in Hα Emission Profile during Gas Puffing in Aditya Tokamak
Authors: Harshita Raj, Joydeep Ghosh, Rakesh L. Tanna, Prabal K. Chattopadhyay, K. A. Jadeja, Sharvil Patel, Kaushal M. Patel, Narendra C. Patel, S. B. Bhatt, V. K. Panchal, Chhaya Chavda, C. N. Gupta, D. Raju, S. K. Jha, J. Raval, S. Joisa, S. Purohit, C. V. S. Rao, P. K. Atrey, Umesh Nagora, R. Manchanda, M. B. Chowdhuri, Nilam Ramaiya, S. Banerjee, Y. C. Saxena
Abstract:
Efficient gas fueling is a critical aspect that needs to be mastered in order to maintain plasma density, to carry out fusion. This requires a fair understanding of fuel recycling in order to optimize the gas fueling. In Aditya tokamak, multiple gas puffs are used in a precise and controlled manner, for hydrogen fueling during the flat top of plasma discharge which has been instrumental in achieving discharges with enhanced density as well as energy confinement time. Following each gas puff, we observe peaks in temporal profile of Hα emission, Soft X-ray (SXR) and chord averaged electron density in a number of discharges, indicating efficient gas fueling. Interestingly, Hα temporal profile exhibited an additional peak following the peak corresponding to each gas puff. These additional peak Hα appeared in between the two gas puffs, indicating the presence of a secondary hydrogen source apart from the gas puffs. A thorough investigation revealed that these secondary Hα peaks coincide with Hard X- ray bursts which come from the interaction of runaway electrons with vessel limiters. This leads to consider that the runaway electrons (REs), which hit the wall, in turn, bring out the absorbed hydrogen and oxygen from the wall and makes the interaction of REs with limiter a secondary hydrogen source. These observations suggest that runaway electron induced recycling should also be included in recycling particle source in the particle balance calculations in tokamaks. Observation of two Hα peaks associated with one gas puff and their roles in enhancing and maintaining plasma density in Aditya tokamak will be discussed in this paper.Keywords: fusion, gas fueling, recycling, Tokamak, Aditya
Procedia PDF Downloads 402
1807 Comfort Sensor Using Fuzzy Logic and Arduino
Authors: Samuel John, S. Sharanya
Abstract:
Automation has become an important part of our lives. It has been used to control home entertainment systems, to change the ambience of rooms for different events, and so on. One of the main parameters to control in a smart home is atmospheric comfort, which mainly includes temperature and relative humidity. In homes, the desired temperature of different rooms varies from 20 °C to 25 °C, and the relative humidity is around 50%; however, both vary widely. Hence, automated measurement of these parameters to ensure comfort assumes significance. To achieve this, a fuzzy logic controller using Arduino was developed with MATLAB. Arduino is an open-source hardware platform built around the ATmega328 microcontroller, with 14 digital input/output pins and an inbuilt ADC. It runs on 5 V and 3.3 V supplies supported by an on-board voltage regulator. Some of the digital pins in Arduino provide PWM (pulse width modulation) signals, which can be used in different applications. The Arduino platform provides an integrated development environment with support for the C, C++, and Java programming languages. In the present work, a soft sensor was introduced that can indirectly measure temperature and humidity and process these measurements to ensure comfort. The Sugeno method, in which the output variables are functions or singletons/constants and which is more suitable for implementation on microcontrollers, was used in the soft sensor in MATLAB and then interfaced to the Arduino, which in turn is interfaced to the temperature-humidity sensor DHT11. The DHT11 acts as the sensing element in this system. A capacitive humidity sensor and a thermistor were also used to support the measurement of the temperature and relative humidity of the surroundings and to provide a digital signal on the data pin. The comfort sensor developed was able to measure temperature and relative humidity correctly. The comfort percentage was calculated, and the temperature in the room was controlled accordingly. The system was placed in different rooms of the house to ensure that it adjusts the comfort values depending on the temperature and relative humidity of the environment. Compared to existing comfort control sensors, this system was found to provide an accurate comfort percentage. Depending on the comfort percentage, the air conditioners and coolers in the room were controlled. The main highlight of the project is its cost efficiency. Keywords: Arduino, DHT11, soft sensor, Sugeno
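To make the Sugeno idea concrete, the following minimal zero-order Sugeno sketch computes a comfort percentage from temperature and relative humidity. The membership functions and rule consequents are assumptions chosen around the stated 20-25 °C / 50% RH comfort range; they are not the MATLAB design used in the paper.

```python
# Minimal zero-order Sugeno sketch: comfort (%) from temperature (°C) and relative
# humidity (%). Membership functions and rule consequents are illustrative assumptions.
def tri(x, a, b, c):
    """Triangular membership function."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def comfort(temp_c, rh):
    temp_ok = tri(temp_c, 18.0, 22.5, 27.0)      # "temperature comfortable"
    rh_ok = tri(rh, 30.0, 50.0, 70.0)            # "humidity comfortable"
    rules = [
        (min(temp_ok, rh_ok), 100.0),            # both comfortable  -> comfort 100
        (min(temp_ok, 1 - rh_ok), 60.0),         # humidity off      -> comfort 60
        (min(1 - temp_ok, rh_ok), 50.0),         # temperature off   -> comfort 50
        (min(1 - temp_ok, 1 - rh_ok), 10.0),     # both off          -> comfort 10
    ]
    num = sum(w * z for w, z in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0             # weighted average (Sugeno defuzzification)

print(f"comfort at 24 °C / 55 % RH: {comfort(24.0, 55.0):.0f} %")
```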
Procedia PDF Downloads 312
1806 The Beam Expansion Method, A Simplified and Efficient Approach of Field Propagation and Resonators Modes Study
Authors: Zaia Derrar Kaddour
Abstract:
The study of a beam throughout an optical path is generally achieved by means of diffraction integral. Unfortunately, in some problems, this tool turns out to be not very friendly and hard to implement. Instead, the beam expansion method for computing field profiles appears to be an interesting alternative. The beam expansion method consists of expanding the field pattern as a series expansion in a set of orthogonal functions. Propagating each individual component through a circuit and adding up the derived elements leads easily to the result. The problem is then reduced to finding how the expansion coefficients change in a circuit. The beam expansion method requires a systematic study of each type of optical element that can be met in the considered optical path. In this work, we analyze the following fundamental elements: first order optical systems, hard apertures and waveguides. We show that the former element type is completely defined thanks to the Gouy phase shift expression we provide and the latters require a suitable mode conversion. For endorsing the usefulness and relevance of the beam expansion approach, we show here some of its applications such as the treatment of the thermal lens effect and the study of unstable resonators.Keywords: gouy phase shift, modes, optical resonators, unstable resonators
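The abstract refers to a Gouy phase shift expression without reproducing it. For reference, the standard free-space result for a Hermite-Gaussian mode HG_mn is shown below; the paper's expression for a general first-order system is presumably a generalization of this and is not given here.

```latex
% Standard Gouy phase of a Hermite-Gaussian mode HG_{mn} propagated a distance z
% from its waist (Rayleigh range z_R); each term of the modal expansion of the
% field simply accumulates this mode-dependent phase.
\psi_{mn}(z) = (m + n + 1)\,\arctan\!\left(\frac{z}{z_R}\right)
```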
Procedia PDF Downloads 62
1805 Quantitative Assessment of Soft Tissues by Statistical Analysis of Ultrasound Backscattered Signals
Authors: Da-Ming Huang, Ya-Ting Tsai, Shyh-Hau Wang
Abstract:
Ultrasound signals backscattered from soft tissues depend mainly on the size, density, distribution, and other elastic properties of the scatterers in the interrogated sample volume. Quantitative analysis of ultrasonic backscattering is frequently implemented with a statistical approach because backscattered signals tend to behave as random variables. Statistical analyses such as Nakagami statistics have therefore been applied to characterize the density and distribution of scatterers in a sample. Yet the accuracy of the statistical analysis can be readily affected by the received signals, which depend on the nature of the incident ultrasound wave and the acoustical properties of the sample. In the present study, efforts were therefore made to explore the effects of the ultrasound operational modes and of biological tissue attenuation on the estimation of the corresponding Nakagami statistical parameter (m parameter). In vitro measurements were performed on healthy and fibrotic porcine livers using different single-element ultrasound transducers and duty cycles of the incident tone burst, ranging from 3.5 to 7.5 MHz and from 10 to 50%, respectively. The results demonstrated that the estimated m parameter tends to be sensitively affected by the ultrasound operational mode as well as by the tissue attenuation. Healthy and pathological tissues may be characterized quantitatively by the m parameter under fixed measurement conditions and with proper calibration. Keywords: ultrasound backscattering, statistical analysis, operational mode, attenuation
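A common way to obtain the Nakagami m parameter from backscattered data is the moment estimator m = [E(R²)]² / Var(R²) applied to the echo envelope R; the short sketch below demonstrates it on a synthetic envelope (the acquisition and calibration steps of the study are not modelled).

```python
# Sketch: moment-based estimate of the Nakagami m parameter from a backscattered
# envelope. The synthetic envelope below stands in for demodulated RF data.
import numpy as np

def nakagami_m(envelope):
    r2 = np.asarray(envelope, dtype=float) ** 2
    return r2.mean() ** 2 / r2.var()          # m = [E(R^2)]^2 / Var(R^2)

rng = np.random.default_rng(1)
true_m, omega = 0.8, 1.0                                            # pre-Rayleigh case (m < 1)
envelope = rng.gamma(true_m, omega / true_m, size=20000) ** 0.5     # R^2 ~ Gamma(m, Omega/m)
print(f"estimated m = {nakagami_m(envelope):.2f}")                  # close to 0.8
```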
Procedia PDF Downloads 323
1804 Balancing a Rotary Inverted Pendulum System Using Robust Generalized Dynamic Inverse: Design and Experiment
Authors: Ibrahim M. Mehedi, Uzair Ansari, Ubaid M. Al-Saggaf, Abdulrahman H. Bajodah
Abstract:
This paper presents a methodology for balancing a rotary inverted pendulum system using Robust Generalized Dynamic Inversion (RGDI) under influence of parametric variations and external disturbances. In GDI control, dynamic constraints are formulated in the form of asymptotically stable differential equation which encapsulates the control objectives. The constraint differential equations are based on the deviation function of the angular position and its rates from their reference values. The constraint dynamics are inverted using Moore-Penrose Generalized Inverse (MPGI) to realize the control expression. The GDI singularity problem is addressed by augmenting a dynamic scale factor in the interpretation of MPGI which guarantee asymptotically stable position tracking. An additional term based on Sliding Mode Control is appended within GDI control to make it robust against parametric variations, disturbances and tracking performance deterioration due to generalized inversion scaling. The stability of the closed loop system is ensured by using positive definite Lyapunov energy function that guarantees semi-global practically stable position tracking. Numerical simulations are conducted on the dynamic model of rotary inverted pendulum system to analyze the efficiency of proposed RGDI control law. The comparative study is also presented, in which the performance of RGDI control is compared with Linear Quadratic Regulator (LQR) and is verified through experiments. Numerical simulations and real-time experiments demonstrate better tracking performance abilities and robustness features of RGDI control in the presence of parametric uncertainties and disturbances.Keywords: generalized dynamic inversion, lyapunov stability, rotary inverted pendulum system, sliding mode control
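In generic terms (a sketch of the idea, not the paper's exact derivation), the GDI step can be summarized as follows: the tracking deviation is forced to obey an asymptotically stable constraint, substitution of the plant dynamics turns that constraint into a relation that is linear in the control, and the Moore-Penrose generalized inverse (MPGI) realizes the control while leaving a null-space term free for the robust sliding-mode augmentation.

```latex
% Generic constrained-inversion step (illustrative form; coefficients c_1, c_2,
% the mappings A, B and the null-space vector y are placeholders):
\phi = \theta - \theta_{\mathrm{ref}}, \qquad
\ddot{\phi} + c_1 \dot{\phi} + c_2 \phi = 0
\;\Longrightarrow\;
\mathcal{A}(x)\,u = \mathcal{B}(x), \qquad
u = \mathcal{A}^{+}(x)\,\mathcal{B}(x) + \left(I - \mathcal{A}^{+}(x)\,\mathcal{A}(x)\right)y
```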
Procedia PDF Downloads 172
1803 Stabilisation of a Soft Soil by Alkaline Activation
Authors: Mohammadjavad Yaghoubi, Arul Arulrajah, Mahdi M. Disfani, Suksun Horpibulsuk, Myint W. Bo, Stephen P. Darmawan
Abstract:
This paper investigates the changes in the strength development of a high water content soft soil stabilised with alkaline activation of fly ash (FA) to use in deep soil mixing (DSM) technology. The content of FA was 20% by dry mass of soil, and the alkaline activator was sodium silicate (Na2SiO3). Samples were cured for 3, 7, 14, 28 and 56 days to evaluate the effect of curing time on strength development. To study the effect of adding slag (S) to the mixture on the strength development, 5% S was replaced with FA. In addition, the effect of the initial unit weight of samples on strength development was studied by preparing specimens with two different static compaction stresses. This was to replicate the field conditions where during implementing the DSM technique, the pressure on the soil while being mixed, increases with depth. Unconfined compression strength (UCS), scanning electron microscopy (SEM) and energy-dispersive X-ray spectroscopy (EDS) tests were conducted on the specimens. The results show that adding S to the FA based geopolymer activated by Na2SiO3 decreases the strength. Furthermore, samples prepared at a higher unit weight demonstrate greater strengths. Moreover, samples prepared at lower unit weight reached their final strength at about 14 days of curing, whereas the strength development continues to 56 days for specimens prepared at a higher unit weight.Keywords: alkaline activation, curing time, fly ash, geopolymer, slag
Procedia PDF Downloads 338
1802 A Structural and Magnetic Investigation of the Inversion Degree in Spinel NiFe2O4, ZnFe2O4 and Ni0.5Zn0.5Fe2O4 Ferrites Prepared by Soft Mechanochemical Synthesis
Authors: Z. Ž. Lazarević, D. L. Sekulić, V. N. Ivanovski, N. Ž. Romčević
Abstract:
NiFe2O4 (nickel ferrite), ZnFe2O4 (zinc ferrite) and Ni0.5Zn0.5Fe2O4 (nickel-zinc ferrite) were prepared by mechanochemical route in a planetary ball mill starting from mixture of the appropriate quantities of the Ni(OH)2/Fe(OH)3, Zn(OH)2/Fe(OH)3 and Ni(OH)2/Zn(OH)2/Fe(OH)3 hydroxide powders. In order to monitor the progress of chemical reaction and confirm phase formation, powder samples obtained after 25 h, 18 h and 10 h of milling were characterized by X-ray diffraction (XRD), transmission electron microscopy (TEM), IR, Raman and Mössbauer spectroscopy. It is shown that the soft mechanochemical method, i.e. mechanochemical activation of hydroxides, produces high quality single phase ferrite samples in much more efficient way. From the IR spectroscopy of single phase samples it is obvious that energy of modes depends on the ratio of cations. It is obvious that all samples have more than 5 Raman active modes predicted by group theory in the normal spinel structure. Deconvolution of measured spectra allows one to conclude that all complex bands in the spectra are made of individual peaks with the intensities that vary from spectrum to spectrum. The deconvolution of Raman spectra allows to separate contributions of different cations to a particular type of vibration and to estimate the degree of inversion.Keywords: ferrites, Raman spectroscopy, IR spectroscopy, Mössbauer measurements
Procedia PDF Downloads 454
1801 Validation of Mapping Historical Linked Data to International Committee for Documentation (CIDOC) Conceptual Reference Model Using Shapes Constraint Language
Authors: Ghazal Faraj, András Micsik
Abstract:
Shapes Constraint Language (SHACL), a World Wide Web Consortium (W3C) language, provides well-defined shapes and RDF graphs, named "shape graphs". These shape graphs validate other resource description framework (RDF) graphs which are called "data graphs". The structural features of SHACL permit generating a variety of conditions to evaluate string matching patterns, value type, and other constraints. Moreover, the framework of SHACL supports high-level validation by expressing more complex conditions in languages such as SPARQL protocol and RDF Query Language (SPARQL). SHACL includes two parts: SHACL Core and SHACL-SPARQL. SHACL Core includes all shapes that cover the most frequent constraint components. While SHACL-SPARQL is an extension that allows SHACL to express more complex customized constraints. Validating the efficacy of dataset mapping is an essential component of reconciled data mechanisms, as the enhancement of different datasets linking is a sustainable process. The conventional validation methods are the semantic reasoner and SPARQL queries. The former checks formalization errors and data type inconsistency, while the latter validates the data contradiction. After executing SPARQL queries, the retrieved information needs to be checked manually by an expert. However, this methodology is time-consuming and inaccurate as it does not test the mapping model comprehensively. Therefore, there is a serious need to expose a new methodology that covers the entire validation aspects for linking and mapping diverse datasets. Our goal is to conduct a new approach to achieve optimal validation outcomes. The first step towards this goal is implementing SHACL to validate the mapping between the International Committee for Documentation (CIDOC) conceptual reference model (CRM) and one of its ontologies. To initiate this project successfully, a thorough understanding of both source and target ontologies was required. Subsequently, the proper environment to run SHACL and its shape graphs were determined. As a case study, we performed SHACL over a CIDOC-CRM dataset after running a Pellet reasoner via the Protégé program. The applied validation falls under multiple categories: a) data type validation which constrains whether the source data is mapped to the correct data type. For instance, checking whether a birthdate is assigned to xsd:datetime and linked to Person entity via crm:P82a_begin_of_the_begin property. b) Data integrity validation which detects inconsistent data. For instance, inspecting whether a person's birthdate occurred before any of the linked event creation dates. The expected results of our work are: 1) highlighting validation techniques and categories, 2) selecting the most suitable techniques for those various categories of validation tasks. The next plan is to establish a comprehensive validation model and generate SHACL shapes automatically.Keywords: SHACL, CIDOC-CRM, SPARQL, validation of ontology mapping
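A minimal sketch of the data-type check described in point (a), written with the rdflib and pySHACL packages, is given below. The shape, the CIDOC-CRM property usage, and the tiny data graph are illustrative assumptions, not the project's actual shapes or mapped data.

```python
# Sketch of a SHACL data-type check with pySHACL; shapes and data are illustrative.
from rdflib import Graph
from pyshacl import validate

shapes_ttl = """
@prefix sh:  <http://www.w3.org/ns/shacl#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix crm: <http://www.cidoc-crm.org/cidoc-crm/> .

crm:BirthDateShape a sh:NodeShape ;
    sh:targetClass crm:E21_Person ;
    sh:property [
        sh:path crm:P82a_begin_of_the_begin ;
        sh:datatype xsd:dateTime ;
        sh:maxCount 1 ;
    ] .
"""

data_ttl = """
@prefix crm: <http://www.cidoc-crm.org/cidoc-crm/> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

<http://example.org/person/1> a crm:E21_Person ;
    crm:P82a_begin_of_the_begin "1843-05-17"^^xsd:date .   # wrong datatype on purpose
"""

shapes = Graph().parse(data=shapes_ttl, format="turtle")
data = Graph().parse(data=data_ttl, format="turtle")
conforms, _, report = validate(data, shacl_graph=shapes)
print(conforms)      # False: the birthdate is xsd:date, not the required xsd:dateTime
print(report)
```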
Procedia PDF Downloads 253
1800 Soft Computing Approach for Diagnosis of Lassa Fever
Authors: Roseline Oghogho Osaseri, Osaseri E. I.
Abstract:
Lassa fever is an epidemic hemorrhagic fever caused by the Lassa virus, an extremely virulent arenavirus. This highly fatal disorder kills 10% to 50% of its victims, but those who survive its early stages usually recover and acquire immunity to secondary attacks. One of the major challenges in giving proper treatment is the lack of fast and accurate diagnosis: the multiplicity of symptoms associated with the disease, which can resemble other clinical conditions, makes it difficult to diagnose early. This paper proposes an Adaptive Neuro-Fuzzy Inference System (ANFIS) for the prediction of Lassa fever. In the design of the diagnostic system, four main attributes were considered as input parameters, with one output parameter. The input parameters are temperature on admission (TA), white blood count (WBC), proteinuria (P), and abdominal pain (AP). Sixty-one percent of the datasets were used in training the system, while fifty-nine were used in testing. Experimental results from this study gave a reliable and accurate prediction of Lassa fever when compared with clinically confirmed cases. The proposed Lassa fever diagnostic system is intended to aid clinicians and healthcare practitioners in facilities that do not have ready access to polymerase chain reaction (PCR) diagnostics in predicting possible Lassa fever infection. Keywords: ANFIS, Lassa fever, medical diagnosis, soft computing
Procedia PDF Downloads 269
1799 Rd-PLS Regression: From the Analysis of Two Blocks of Variables to Path Modeling
Authors: E. Tchandao Mangamana, V. Cariou, E. Vigneau, R. Glele Kakai, E. M. Qannari
Abstract:
A new definition of a latent variable associated with a dataset makes it possible to propose variants of the PLS2 regression and the multi-block PLS (MB-PLS). We shall refer to these variants as Rd-PLS regression and Rd-MB-PLS respectively because they are inspired by both Redundancy analysis and PLS regression. Usually, a latent variable t associated with a dataset Z is defined as a linear combination of the variables of Z with the constraint that the length of the loading weights vector equals 1. Formally, t=Zw with ‖w‖=1. Denoting by Z' the transpose of Z, we define herein, a latent variable by t=ZZ’q with the constraint that the auxiliary variable q has a norm equal to 1. This new definition of a latent variable entails that, as previously, t is a linear combination of the variables in Z and, in addition, the loading vector w=Z’q is constrained to be a linear combination of the rows of Z. More importantly, t could be interpreted as a kind of projection of the auxiliary variable q onto the space generated by the variables in Z, since it is collinear to the first PLS1 component of q onto Z. Consider the situation in which we aim to predict a dataset Y from another dataset X. These two datasets relate to the same individuals and are assumed to be centered. Let us consider a latent variable u=YY’q to which we associate the variable t= XX’YY’q. Rd-PLS consists in seeking q (and therefore u and t) so that the covariance between t and u is maximum. The solution to this problem is straightforward and consists in setting q to the eigenvector of YY’XX’YY’ associated with the largest eigenvalue. For the determination of higher order components, we deflate X and Y with respect to the latent variable t. Extending Rd-PLS to the context of multi-block data is relatively easy. Starting from a latent variable u=YY’q, we consider its ‘projection’ on the space generated by the variables of each block Xk (k=1, ..., K) namely, tk= XkXk'YY’q. Thereafter, Rd-MB-PLS seeks q in order to maximize the average of the covariances of u with tk (k=1, ..., K). The solution to this problem is given by q, eigenvector of YY’XX’YY’, where X is the dataset obtained by horizontally merging datasets Xk (k=1, ..., K). For the determination of latent variables of order higher than 1, we use a deflation of Y and Xk with respect to the variable t= XX’YY’q. In the same vein, extending Rd-MB-PLS to the path modeling setting is straightforward. Methods are illustrated on the basis of case studies and performance of Rd-PLS and Rd-MB-PLS in terms of prediction is compared to that of PLS2 and MB-PLS.Keywords: multiblock data analysis, partial least squares regression, path modeling, redundancy analysis
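A short numerical sketch of the first Rd-PLS component as defined above: q is taken as the leading eigenvector of YY'XX'YY', from which u = YY'q, t = XX'u, and the loading vector follow. The two data blocks below are random centered placeholders.

```python
# Numerical sketch of the first Rd-PLS component described in the abstract.
import numpy as np

rng = np.random.default_rng(0)
n, p, r = 30, 6, 4                        # individuals, X-variables, Y-variables
X = rng.normal(size=(n, p))
Y = rng.normal(size=(n, r))
X -= X.mean(axis=0)                       # both blocks are assumed centered
Y -= Y.mean(axis=0)

M = Y @ Y.T @ X @ X.T @ Y @ Y.T           # symmetric n x n matrix
eigvals, eigvecs = np.linalg.eigh(M)
q = eigvecs[:, -1]                        # eigenvector of the largest eigenvalue, ||q|| = 1

u = Y @ Y.T @ q                           # latent variable associated with the Y block
t = X @ X.T @ u                           # t = X X' Y Y' q, its 'projection' onto X
w = X.T @ u                               # loading vector: a linear combination of rows of X
print("cov(t, u) =", float(t @ u) / (n - 1))
```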
Procedia PDF Downloads 147
1798 Assessing an Instrument Usability: Response Interpolation and Scale Sensitivity
Authors: Betsy Ng, Seng Chee Tan, Choon Lang Quek, Peter Looker, Jaime Koh
Abstract:
The purpose of the present study was to determine the particular scale rating that stands out for an instrument. The instrument was designed to assess student perceptions of various learning environments, namely face-to-face, online and blended. The original instrument had a 5-point Likert items (1 = strongly disagree and 5 = strongly agree). Alternate versions were modified with a 6-point Likert scale and a bar scale rating. Participants consisted of undergraduates in a local university were involved in the usability testing of the instrument in an electronic setting. They were presented with the 5-point, 6-point and percentage-bar (100-point) scale ratings, in response to their perceptions of learning environments. The 5-point and 6-point Likert scales were presented in the form of radio button controls for each number, while the percentage-bar scale was presented with a sliding selection. Among these responses, 6-point Likert scale emerged to be the best overall. When participants were confronted with the 5-point items, they either chose 3 or 4, suggesting that data loss could occur due to the insensitivity of instrument. The insensitivity of instrument could be due to the discreet options, as evidenced by response interpolation. To avoid the constraint of discreet options, the percentage-bar scale rating was tested, but the participant responses were not well-interpolated. The bar scale might have provided a variety of responses without a constraint of a set of categorical options, but it seemed to reflect a lack of perceived and objective accuracy. The 6-point Likert scale was more likely to reflect a respondent’s perceived and objective accuracy as well as higher sensitivity. This finding supported the conclusion that 6-point Likert items provided a more accurate measure of the participant’s evaluation. The 5-point and bar scale ratings might not be accurately measuring the participants’ responses. This study highlighted the importance of the respondent’s perception of accuracy, respondent’s true evaluation, and the scale’s ease of use. Implications and limitations of this study were also discussed.Keywords: usability, interpolation, sensitivity, Likert scales, accuracy
Procedia PDF Downloads 406