Search results for: office computer users
1043 Evaluation of Housing Quality in the Urban Fringes of Ibadan, Nigeria
Authors: Amao Funmilayo Lanrewaju
Abstract:
The study examined the socio-economic characteristics of the residents in selected urban fringes of Ibadan; identified and examined the housing and neighbourhood characteristics; and evaluated housing quality in the study area. It analysed the relationship between the socio-economic characteristics of the residents, the housing and neighbourhood characteristics, and housing quality in the study area, with a view to providing information that would enhance housing quality in the urban fringes of Ibadan. Primary and secondary data were used for the study. A survey of eleven purposively selected communities from Oluyole and Egbeda local government areas in the urban fringes was conducted through questionnaire administration and expert rating by five independent assessors (qualified architects) using penalty scoring within similar time-frames. The study employed a random sampling method to select a sample size of 480 houses, representing 5% of the sampling frame of 9,600 houses. The respondent in the first house was selected randomly, and every 20th house in the streets involved was subsequently selected systematically for questionnaire administration, usually one household head per building. The structured questionnaire elicited information on the socio-economic characteristics of the residents, housing and neighbourhood characteristics, factors affecting housing quality, and housing quality in the study area. Secondary data obtained for the study included the land-use plan of Ibadan from previous publications, housing demographics, population figures from relevant institutions, and other published materials. The data collected were analysed using descriptive and inferential statistics such as frequency distribution, cross-tabulation, correlation analysis, Analysis of Variance (ANOVA) and the Relative Importance Index (RII).
The result of the survey revealed that respondents from the Yoruba ethnic group constituted the majority, comprising 439 (91.5%) of the 480 respondents from the two local government areas selected. It also revealed that the tenure status of the majority of the respondents in the two local government areas was self-ownership (234, 48.8%), while 44.0% of the respondents acquired their houses through personal savings. Cross-tabulation indicated that the majority (67.1%, 322 out of 480) of the respondents were low-income earners. The study showed that both housing and neighbourhood services were not adequately provided across neighbourhoods in the study area. Correlation analysis indicated a significant relationship between respondents’ socio-economic status and their general housing quality (r = 0.46; p = 0.01 < 0.05). The ANOVA indicated that the relationship between the socio-economic characteristics of the residents and the housing and neighbourhood characteristics in the study area was significant (F = 18.289, p = 0.00; coefficient of determination R² = 0.192). The findings, however, revealed no significant difference between the results obtained from the user-based evaluation and the expert rating. The study concluded that housing quality in the urban fringes of Ibadan is generally poor and that the socio-economic status of the residents significantly influenced housing quality.
Keywords: housing quality, urban fringes, economic status, poverty
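The abstract names the Relative Importance Index (RII) among its analysis tools. A minimal sketch of one common RII definition, RII = ΣW / (A·N), where W are the Likert ratings, A the highest scale point and N the number of respondents (the ratings below are hypothetical, not the study's data):

```python
def relative_importance_index(ratings, max_scale=5):
    """Relative Importance Index: RII = sum(weights) / (A * N),
    where A is the highest scale point and N the number of respondents."""
    return sum(ratings) / (max_scale * len(ratings))

# Hypothetical Likert ratings (1-5) for one housing-quality factor:
scores = [5, 4, 3, 5, 3]
rii = relative_importance_index(scores)
print(round(rii, 2))  # → 0.8
```

Factors can then be ranked by their RII values, with values closer to 1 indicating higher relative importance.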
Procedia PDF Downloads 443
1042 The Participation of Experts in the Criminal Policy on Drugs: The Proposal of a Cannabis Regulation Model in Spain by the Cannabis Policy Studies Group
Authors: Antonio Martín-Pardo
Abstract:
Regarding the context of this paper, it is noteworthy that the current criminal policy model, labelled by some sectors of the doctrine as the citizen security model, is characterized by a marked tendency to discredit expert knowledge. This type of technical knowledge has been displaced, at the time of legislative drafting, by common sense and the everyday experience of the people, as well as by excessive attention to the short-term political effects of the law. Despite this adverse criminal-policy scene, we still find valuable efforts on the side of experts to bring some rationality to legislative development. This is the case of the proposal for a new cannabis regulation model in Spain carried out by the Cannabis Policy Studies Group (hereinafter ‘GEPCA’). The GEPCA is a multidisciplinary group composed of authors with different orientations, trajectories and interests, but with a common minimum objective: the conviction that the current situation regarding cannabis is unsustainable and that a rational legislative solution must be given to the growing social pressure for the regulation of its consumption and production. This paper details the main lines along which this technical proposal is developed, with the purpose of its dissemination and discussion at the Congress. The basic methodology of the proposal is inductive-expository. First, we offer a brief but solid contextualization of the situation of cannabis in Spain. This contextualization touches on issues such as the national regulatory situation and its relationship with the international context; the criminal, judicial and penitentiary impact of the supply and consumption of cannabis; and the therapeutic use of the substance, among others. Second, we detail the three main cannabis access channels that are proposed.
Namely: the regulated market, associations of cannabis users, and personal self-cultivation. In each of these options, especially the first two, special attention is paid both to the production and processing of the substance and to the necessary administrative control of the activity. Finally, in a third block, some notes are given on a series of subjects that surround the different access options mentioned above and that give fullness and coherence to the proposal: consumption and possession of the substance; advertising and promotion of cannabis; consumption in areas of special risk (e.g., work or driving); the tax regime; the need to articulate evaluation instruments for the entire process; etc. The main conclusion drawn from the analysis of the proposal is the unsustainability of the current repressive system, clearly unsuccessful, and the need to develop new access routes to cannabis that guarantee both public health and the rights of people who have freely chosen to consume it.
Keywords: cannabis regulation proposal, cannabis policies studies group, criminal policy, expertise participation
Procedia PDF Downloads 120
1041 Combination of Artificial Neural Network Model and Geographic Information System for Prediction of Water Quality
Authors: Sirilak Areerachakul
Abstract:
Water quality has initiated serious management efforts in many countries. Artificial Neural Network (ANN) models are developed as forecasting tools for predicting water quality trends based on historical data. This study endeavors to automatically classify water quality. The water quality classes are evaluated using six factor indices: pH value (pH), Dissolved Oxygen (DO), Biochemical Oxygen Demand (BOD), Nitrate Nitrogen (NO3N), Ammonia Nitrogen (NH3N) and Total Coliform (T-Coliform). The methodology involves applying data mining techniques using multilayer perceptron (MLP) neural network models. The data consisted of 11 sites along the Saen Saep canal in Bangkok, Thailand, obtained from the Department of Drainage and Sewerage, Bangkok Metropolitan Administration, during 2007-2011. The multilayer perceptron neural network exhibits a high classification accuracy of 94.23% for the water quality of the Saen Saep canal in Bangkok. This encouraging result suggests that combining the model with GIS data could improve the classification accuracy significantly.
Keywords: artificial neural network, geographic information system, water quality, computer science
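As a minimal sketch of the kind of MLP classifier described, the forward pass below maps the six indices to class probabilities. The weights are random placeholders, not the trained network from the study; in practice the weights would come from training on the 2007-2011 monitoring data.

```python
import numpy as np

def mlp_forward(x, W1, b1, W2, b2):
    """One-hidden-layer MLP: ReLU hidden units, softmax output."""
    h = np.maximum(0, W1 @ x + b1)   # hidden layer (ReLU)
    z = W2 @ h + b2                  # output logits
    e = np.exp(z - z.max())
    return e / e.sum()               # class probabilities

rng = np.random.default_rng(0)
# Six inputs: pH, DO, BOD, NO3N, NH3N, T-Coliform (standardized); weights illustrative.
x = np.array([0.2, 1.1, -0.5, 0.3, -0.2, 0.7])
W1, b1 = rng.normal(size=(8, 6)), np.zeros(8)
W2, b2 = rng.normal(size=(4, 8)), np.zeros(4)  # four water-quality classes assumed
p = mlp_forward(x, W1, b1, W2, b2)
print(p.argmax())  # index of the predicted water-quality class
```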
Procedia PDF Downloads 344
1040 Cone Contrast Sensitivity of Normal Trichromats and Those with Red-Green Dichromacy
Authors: Tatsuya Iizuka, Takushi Kawamorita, Tomoya Handa, Hitoshi Ishikawa
Abstract:
We report normative cone contrast sensitivity values and sensitivity and specificity values for a computer-based color vision test, the cone contrast test-HD (CCT-HD). The participants included 50 phakic eyes with normal color vision (NCV) and 20 dichromatic eyes (ten with protanopia and ten with deuteranopia). The CCT-HD was used to measure L-, M-, and S-CCT-HD scores (color vision deficiency: L-, M-cone logCS ≤ 1.65, S-cone logCS ≤ 0.425) to investigate the sensitivity and specificity of the CCT-HD against anomalous-type diagnosis with an anomaloscope. The mean ± standard error L-, M-, and S-cone logCS for protanopia were 0.90 ± 0.04, 1.65 ± 0.03, and 0.63 ± 0.02, respectively; for deuteranopia, 1.74 ± 0.03, 1.31 ± 0.03, and 0.61 ± 0.06, respectively; and for age-matched NCV, 1.89 ± 0.04, 1.84 ± 0.04, and 0.60 ± 0.03, respectively, with significant differences for each group except for S-CCT-HD (Bonferroni corrected α = 0.0167, p < 0.0167). The sensitivity and specificity of the CCT-HD were 100% for protan and deutan in diagnosing abnormal types from 20 to 64 years of age, but the specificity decreased to 65% for protan and 55% for deutan in older persons (> 65). The CCT-HD is comparable to the anomaloscope in diagnosing the anomalous type in the 20-64-year-old age group. However, the results should be interpreted cautiously in those ≥ 65 years, who are more susceptible to acquired color vision deficiencies due to the yellowing of the crystalline lens and other factors.
Keywords: cone contrast test HD, color vision test, congenital color vision deficiency, red-green dichromacy, cone contrast sensitivity
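The logCS scores above follow the standard definition of log contrast sensitivity as the base-10 logarithm of the reciprocal of the detection threshold. A small sketch applying that definition together with the deficiency cut-offs quoted in the abstract (the 2% threshold example is illustrative):

```python
import math

def log_cs(threshold_contrast):
    """Contrast sensitivity is the reciprocal of the detection threshold;
    logCS = log10(1 / threshold)."""
    return math.log10(1.0 / threshold_contrast)

def flag_deficiency(l_cs, m_cs, s_cs):
    """Cut-offs taken from the abstract: L-/M-cone logCS <= 1.65, S-cone <= 0.425."""
    return l_cs <= 1.65 or m_cs <= 1.65 or s_cs <= 0.425

# A detection threshold of 2% contrast corresponds to logCS = log10(50) ≈ 1.70
print(round(log_cs(0.02), 2))                 # → 1.7
print(flag_deficiency(0.90, 1.65, 0.63))      # protan-like mean profile → True
```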
Procedia PDF Downloads 102
1039 Prime Mover Sizing for Base-Loaded Combined Heating and Power Systems
Authors: Djalal Boualili
Abstract:
This article considers the problem of sizing prime movers for combined heating and power (CHP) systems operating at full load to satisfy a fraction of a facility's electric load, i.e., a base load. Prime mover sizing is examined using three criteria: operational cost, carbon dioxide emissions (CDE), and primary energy consumption (PEC). The sizing process leads to ratios of the conversion factors applied to imported electricity to the conversion factors applied to fuel consumed. These ratios are labelled R_Cost, R_CDE, and R_PEC, depending on whether the conversion factors are associated with operational cost, CDE, or PEC, respectively. Analytical results show that in order to achieve savings in operational cost, CDE, or PEC, the ratios must be larger than a unique constant R_Min that depends only on the efficiencies of the CHP components. Savings in operational cost, CDE, or PEC due to CHP operation are explicitly formulated using simple equations. This facilitates the process of comparing the tradeoffs of optimizing the savings of one criterion over the other two, a task that has traditionally been accomplished through computer simulations. A hospital building located in Chlef, Algeria, was used as an example to apply the methodology presented in this article.
Keywords: sizing, heating and power, ratios, energy consumption, carbon dioxide emissions
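One plausible form of the break-even constant (an assumption for illustration, not the paper's exact derivation): a CHP unit with electrical efficiency η_e and thermal efficiency η_t, displacing a boiler of efficiency η_b, yields savings only when the grid-to-fuel conversion-factor ratio exceeds (1 − η_t/η_b)/η_e. This follows from equating the fuel burned by the CHP unit with the electricity and boiler fuel it displaces:

```python
def r_min(eta_e, eta_t, eta_b):
    """Break-even ratio of grid-electricity to fuel conversion factors
    (assumed form): savings occur when the ratio exceeds this value.
    eta_e, eta_t: CHP electrical and thermal efficiencies;
    eta_b: efficiency of the displaced boiler."""
    return (1 - eta_t / eta_b) / eta_e

# Illustrative efficiencies, not the Chlef hospital's data:
r = r_min(eta_e=0.30, eta_t=0.45, eta_b=0.85)
print(round(r, 3))  # → 1.569
```

With these example efficiencies, CHP operation saves primary energy only if each unit of imported electricity carries more than about 1.57 times the conversion factor of a unit of fuel.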
Procedia PDF Downloads 232
1038 University Building: Discussion about the Effect of Numerical Modelling Assumptions for Occupant Behavior
Authors: Fabrizio Ascione, Martina Borrelli, Rosa Francesca De Masi, Silvia Ruggiero, Giuseppe Peter Vanoli
Abstract:
The refurbishment of public buildings is one of the key factors of the energy efficiency policy of European states. Educational buildings account for the largest share of the oldest building stock, with interesting potential for demonstrating best practice with regard to high-performance, low- and zero-carbon design, and for becoming exemplar cases within the community. In this context, this paper discusses the critical issue of the energy refurbishment of a university building in the heating-dominated climate of South Italy. More in detail, the importance of using validated models is examined exhaustively by proposing an analysis of the uncertainties due to modelling assumptions, mainly referring to the adoption of stochastic schedules for occupant behavior and equipment or lighting usage. Indeed, today, most commercial tools provide designers with a library of predefined schedules with which thermal zones can be described. Very often, users do not pay close attention to differentiating thermal zones or to modifying and adapting predefined profiles, and the results of the design are affected positively or negatively without any warning. Data such as occupancy schedules, internal loads, and the interaction between people and windows or plant systems represent some of the largest variables during energy modelling and in understanding calibration results. This is mainly due to the adoption of discrete, standardized and conventional schedules, with important consequences for the prediction of energy consumption. The problem is certainly difficult to examine and to solve. In this paper, a sensitivity analysis is presented to understand the order of magnitude of the error committed by varying the deterministic schedules used for occupation, internal loads, and the lighting system. This could be a typical uncertainty for a case study such as the presented one, where there is no regulation system for the HVAC system and thus the occupant cannot interact with it.
More in detail, starting from the adopted schedules, created according to questionnaire responses, which allowed a good calibration of the energy simulation model, several different scenarios are tested. Two types of analysis are presented: the reference building is compared with these scenarios in terms of the percentage difference in the projected total electric energy need and natural gas request. Then the different entries of consumption are analyzed, and for the more interesting cases, the calibration indexes are also compared. Moreover, the same simulations are carried out for the optimal refurbishment solution, and the variation in the predicted energy saving and global cost reduction is evidenced. This parametric study aims to underline the effect of modelling assumptions on the evaluation of performance indexes during the description of thermal zones.
Keywords: energy simulation, modelling calibration, occupant behavior, university building
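The scenario comparison described above reduces to a percentage difference against the calibrated reference model. A minimal sketch with hypothetical annual demand figures (the kWh values are illustrative, not the case study's results):

```python
def pct_difference(scenario, reference):
    """Percentage difference of a scenario's simulated annual demand
    relative to the calibrated reference model."""
    return 100.0 * (scenario - reference) / reference

# Hypothetical annual electric demands (kWh): reference vs. two schedule scenarios
reference_kwh = 120_000
for label, kwh in [("dense occupancy", 131_400), ("sparse occupancy", 111_600)]:
    print(label, round(pct_difference(kwh, reference_kwh), 1))
# → dense occupancy 9.5
# → sparse occupancy -7.0
```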
Procedia PDF Downloads 141
1037 The Development of XML Resume System in Thailand
Authors: Jarumon Nookhong, Thanakorn Uiphanit
Abstract:
This study is a research and development project which aims to develop an XML Resume System to be the standard system in Thailand, as well as to measure its efficiency. The research proceeds in two stages: 1) developing the XML document system to be the standard in Thailand, and 2) testing the system's performance. The sample in this research comprises 50 purposively selected specialists in the field of human resources. The tools used in this research are the XML Resume System in Thailand and a system performance evaluation form, while the data are analysed using the mean and standard deviation. The research found that the development of the XML Resume System intended as the standard system in Thailand yielded an average score of 2.67, which is a good level. The performance evaluation was carried out by the human resources specialists who used the XML Resume System. Analysing each part, the ability to meet user requirements identified by the human resources specialists, the convenience and ease of use, and the functional competency are each at a good level, with averages of 2.92, 2.56 and 2.53, respectively. These can be used as the standard effectively.
Keywords: resume, XML, XML schema, computer science
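To make the idea of an XML resume concrete, here is a minimal, hypothetical document and how it would be consumed programmatically. The element names are illustrative only; the abstract does not reproduce the actual Thai standard schema.

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical resume document (element names are assumptions):
XML_RESUME = """<resume>
  <personal>
    <name>Somchai Jaidee</name>
    <email>somchai@example.com</email>
  </personal>
  <experience>
    <job years="3">Software Developer</job>
  </experience>
</resume>"""

root = ET.fromstring(XML_RESUME)
print(root.findtext("personal/name"))            # → Somchai Jaidee
print(root.find("experience/job").get("years"))  # → 3
```

A shared schema (XML Schema, as in the keywords) would let every employer and job board validate and parse such documents uniformly.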
Procedia PDF Downloads 410
1036 Numerical Solution of Transient Natural Convection in Vertical Heated Rectangular Channel between Two Vertical Parallel MTR-Type Fuel Plates
Authors: Djalal Hamed
Abstract:
The aim of this paper is to perform, by means of the finite volume method, a numerical solution of the transient natural convection in a narrow rectangular channel between two vertical parallel Material Testing Reactor (MTR)-type fuel plates subjected to a heat flux with a cosine shape, in order to determine the margin of nuclear core power at which the natural convection cooling mode can ensure safe core cooling, where the cladding temperature should not reach a specific safety limit (90 °C). For this purpose, a computer program is developed to determine the principal parameters related to nuclear core safety, such as the temperature distribution in the fuel plate and in the coolant (light water) as a function of the reactor core power. From the obtained results, we note that the core power should not reach 400 kW in order to ensure safe passive removal of residual heat from the nuclear core by the upward natural convection cooling mode.
Keywords: buoyancy force, friction force, finite volume method, transient natural convection
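A greatly simplified sketch of the kind of explicit finite-volume time-marching the paper describes: 1-D transient conduction in the coolant with a cosine-shaped volumetric source. Geometry, properties and the source amplitude below are illustrative placeholders, not the paper's reactor data, and the full model would also include buoyancy-driven flow.

```python
import math

N, L = 20, 0.6            # control volumes, channel height (m) — illustrative
dx = L / N
alpha = 1.5e-7            # thermal diffusivity of water (m^2/s)
rho_c = 4.18e6            # volumetric heat capacity (J/m^3.K)
q0 = 4.18e4               # peak volumetric source (W/m^3), cosine axial shape
dt, steps = 1.0, 100      # explicit stability requires dt < dx^2 / (2*alpha)

T = [20.0] * N            # initial coolant temperature (deg C); ends held at 20
for _ in range(steps):
    Tn = T[:]
    for i in range(1, N - 1):
        lap = (T[i - 1] - 2 * T[i] + T[i + 1]) / dx**2   # discrete Laplacian
        q = q0 * math.cos(math.pi * ((i + 0.5) * dx / L - 0.5))  # cosine source
        Tn[i] = T[i] + dt * (alpha * lap + q / rho_c)
    T = Tn

print(round(max(T), 2))   # peak coolant temperature after 100 s
```

In the actual safety analysis, the analogous loop would be run at increasing core power until the cladding temperature approaches the 90 °C limit.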
Procedia PDF Downloads 197
1035 Construction of QSAR Models to Predict Potency on a Series of Substituted Imidazole Derivatives as Anti-fungal Agents
Authors: Sara El Mansouria Beghdadi
Abstract:
Quantitative structure–activity relationship (QSAR) modelling is one of the main computational tools used in medicinal chemistry. Over the past two decades, the incidence of fungal infections has increased due to the development of resistance. In this study, QSAR was performed on a series of esters of 2-carboxamido-3-(1H-imidazole-1-yl) propanoic acid derivatives. These compounds have shown moderate to very good antifungal activity. Multiple linear regression (MLR) was used to generate the linear 2D-QSAR models. The dataset consists of 115 compounds with their antifungal activity (log MIC) against Candida albicans (ATCC SC5314). Descriptors were calculated, and different models were generated using ChemOffice, Avogadro and GaussView software. The selected model was validated. The study suggests that an increase in lipophilicity and a reduction in the electronic character of the substituent at R1, as well as a reduction in the steric hindrance of the substituent at R2 and its aromatic character, support the potentiation of the antifungal effect. The results of the QSAR could help scientists propose new compounds with higher antifungal activities intended for immunocompromised patients susceptible to multi-resistant nosocomial infections.
Keywords: quantitative structure–activity relationship, imidazole, antifungal, candida albicans (ATCC SC5314)
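The MLR step of a linear 2D-QSAR fits activity as a weighted sum of descriptors. A minimal sketch on a synthetic, noise-free dataset (the descriptor values and "true" coefficients are invented for illustration; the real model uses the 115-compound dataset):

```python
import numpy as np

# Hypothetical descriptor matrix (lipophilicity, electronic term, steric term)
# and log MIC activities for five compounds; values are illustrative only.
X = np.array([[1.2, 0.3, 0.8],
              [2.0, 0.1, 0.5],
              [0.8, 0.6, 1.1],
              [1.5, 0.2, 0.7],
              [2.4, 0.0, 0.4]])
y = 0.9 * X[:, 0] - 1.2 * X[:, 1] - 0.5 * X[:, 2] + 0.3   # known "true" model

A = np.column_stack([X, np.ones(len(X))])     # append intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # ordinary least squares
print(np.round(coef, 3))  # recovers [0.9, -1.2, -0.5, 0.3]
```

With noise-free synthetic data the least-squares fit recovers the generating coefficients exactly; real QSAR fits then require validation (e.g., r², cross-validation), as the abstract notes.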
Procedia PDF Downloads 86
1034 Model Canvas and Process for Educational Game Design in Outcome-Based Education
Authors: Ratima Damkham, Natasha Dejdumrong, Priyakorn Pusawiro
Abstract:
This paper explores a solution to help game designers design educational games using the digital educational game model canvas (DEGMC) and the digital educational game form (DEGF), based on an Outcome-based Education program. DEGMC and DEGF help designers develop an overview of the game while designing and planning it. The tools allow designers to clearly assess players' abilities against learning outcomes and to support their game learning design. Designers can balance educational content and entertainment by using the strategies of the Business Model Canvas, and can design the gameplay and the assessment of players' abilities from the learning outcomes they need by referring to Constructive Alignment. Furthermore, they can use the design plan from this research to write their Game Design Document (GDD). The success of the research was evaluated from the perspectives of four experts in the education and computer fields. From the experiments, the canvas and form helped game designers model their games according to the learning outcomes and analyse their own game elements. This method can be a path for research on educational game design in the future.
Keywords: constructive alignment, constructivist theory, educational game, outcome-based education
Procedia PDF Downloads 355
1033 Minimum Vertices Dominating Set Algorithm for Secret Sharing Scheme
Authors: N. M. G. Al-Saidi, K. A. Kadhim, N. A. Rajab
Abstract:
Over the past decades, computer networks and data communication systems have been developing fast, so the necessity to protect transmitted data is a challenging issue, and data security has become a serious problem. A secret sharing scheme is a method by which a master key is distributed among a finite set of participants in such a way that only certain authorized subsets of participants can reconstruct the original master key. To create a secret sharing scheme, many mathematical structures have been used; the most widely used structure is the one based on graph theory (graph access structure). Subsequently, many researchers have tried to find efficient schemes based on graph access structures. In this paper, we propose a novel, efficient construction of a perfect secret sharing scheme for a uniform access structure. The dominating set of vertices in a regular graph is used for this construction in the following way: each vertex represents a participant, and each minimum independent dominating subset represents a minimal qualified subset. Some relations between the dominating set, graph order and regularity are derived and can be used to demonstrate the possibility of using a dominating set to construct a secret sharing scheme. The information rate, used as a measure of the efficiency of such systems, is calculated to show that the proposed method achieves improved values.
Keywords: secret sharing scheme, dominating set, information rate, access structure, rank
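An independent dominating set, the combinatorial object at the heart of the construction, can be found greedily. The sketch below is a generic greedy heuristic (not the paper's minimization algorithm): picking only uncovered vertices keeps the chosen set independent, and covering every vertex makes it dominating. The 6-cycle example is a 2-regular graph.

```python
def greedy_independent_dominating_set(adj):
    """Greedy sketch: repeatedly pick an uncovered vertex, add it to the set,
    and mark it and its neighbours as covered. Picked vertices are never
    adjacent to earlier picks (those neighbours are already covered), so the
    result is independent; covering all vertices makes it dominating."""
    covered, chosen = set(), []
    for v in sorted(adj):
        if v not in covered:
            chosen.append(v)
            covered.add(v)
            covered.update(adj[v])
    return chosen

# 6-cycle C6, a 2-regular graph: vertices stand for participants, and an
# independent dominating subset stands for a minimal qualified subset.
c6 = {0: {1, 5}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3, 5}, 5: {4, 0}}
print(greedy_independent_dominating_set(c6))  # → [0, 2, 4]
```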
Procedia PDF Downloads 394
1032 The Cost of Beauty: Insecurity and Profit
Authors: D. Cole, S. Mahootian, P. Medlock
Abstract:
This research contributes to existing knowledge of the complexities surrounding women’s relationship to beauty standards by examining their lived experiences. While there is much academic work on the effects of culturally imposed and largely unattainable beauty standards, the arguments tend to fall into two paradigms. On the one hand is the radical feminist perspective, which argues that women are subjected to absolute oppression within the patriarchal system in which beauty standards have been constructed. This position advocates a complete restructuring of social institutions to liberate women from all types of oppression. On the other hand, there are liberal feminist arguments that focus on choice, arguing that women’s agency in how to present themselves is empowerment. These arguments center on what women do within the patriarchal system in order to liberate themselves. However, there is very little research on the lived experiences of women negotiating these two realms: the complex negotiation between the pressure to adhere to cultural beauty standards and the agency of self-expression and empowerment. By exploring beauty standards through the intersection of societal messages (including macro-level processes such as social media and advertising as well as smaller-scale interactions such as families and peers) and lived experiences, this study seeks to provide a nuanced understanding of how women navigate and negotiate their own presentation and sense of self-identity. Current research sees a rise in incidents of body dysmorphia, depression and anxiety since the advent of social media. Approximately 91% of women are unhappy with their bodies and resort to dieting to achieve their ideal body shape, but only 5% of women naturally possess the body type often portrayed in American movies and media. It is, therefore, crucial that we begin talking about the processes that are affecting self-image and mental health.
A question that arises is: given these negative effects, why do companies continue to target women with standards that very few could possibly attain? One obvious answer is that keeping beauty standards largely unattainable enables the beauty and fashion industries to make large profits by promising products and procedures that will bring one up to “standard”. The creation of dissatisfaction for some is profit for others. This research utilizes qualitative methods: interviews, questionnaires, and focus groups to investigate women’s relationships to beauty standards and empowerment. To this end, we reached out to potential participants through a video campaign on social media: short clips on Instagram, Facebook, and TikTok and a longer clip on YouTube inviting users to take part in the study. Participants are asked to react to images, videos, and other beauty-related texts. The findings of this research have implications for policy development, advocacy and interventions aimed at promoting healthy inclusivity and the empowerment of women.
Keywords: women, beauty, consumerism, social media
Procedia PDF Downloads 64
1031 Evaluation of Condyle Alterations after Orthognathic Surgery with a Digital Image Processing Technique
Authors: Livia Eisler, Cristiane C. B. Alves, Cristina L. F. Ortolani, Kurt Faltin Jr.
Abstract:
Purpose: This paper proposes a technically simple diagnostic method for orthodontists and maxillofacial surgeons to evaluate discrete bone alterations. The methodology consists of a protocol to optimize the diagnosis and minimize the possibility of orthodontic and ortho-surgical retreatment. Materials and Methods: A protocol of image processing and analysis, using the ImageJ software and its plugins, was applied to 20 pairs of lateral cephalometric images obtained from cone-beam computed tomographies, before and 1 year after orthognathic surgery. The optical density of the images was analyzed in the condylar region to determine possible bone alteration after surgical correction. Results: Image density was shown to be altered in all image pairs, especially regarding the condyle contours. According to the measures, the condyle had a gender-related density reduction at p = 0.05, and alterations of the condylar contours were registered in mm. Conclusion: A simple, viable and cost-effective technique can be applied to achieve a more detailed image-based diagnosis that does not depend on the human eye and therefore offers more reliable, quantitative results.
Keywords: bone resorption, computer-assisted image processing, orthodontics, orthognathic surgery
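The optical density analysis above can be sketched with the usual uncalibrated definition OD = −log10(I/I₀), where I₀ is the background (maximum) grey level. The grey levels below are invented to illustrate a pre/post comparison, not the study's measurements:

```python
import numpy as np

def mean_optical_density(roi, i0=255.0):
    """Mean uncalibrated optical density of a region of interest:
    OD = -log10(I / I0), with I0 the background (maximum) grey level."""
    roi = np.clip(np.asarray(roi, dtype=float), 1, i0)  # avoid log of zero
    return float(np.mean(-np.log10(roi / i0)))

# Hypothetical grey levels in a condylar ROI, before vs. after surgery;
# darker pixels (lower grey level) give higher optical density.
pre  = [[200, 190], [210, 205]]
post = [[150, 160], [155, 145]]
print(mean_optical_density(pre) < mean_optical_density(post))  # → True
```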
Procedia PDF Downloads 161
1030 Document Analysis for Modelling iTV Advertising towards Impulse Purchase
Authors: Azizah Che Omar
Abstract:
The study provides a systematic literature review that analyzed the literature for concepts, theories, approaches and guidelines in order to propose a conceptual design model of interactive television advertising toward impulse purchase (iTVAdIP). An extensive review of the literature was carried out to understand the concepts of interactive television (iTV). Accordingly, iTV guidelines, advertising theories, persuasive approaches, and impulse purchase elements were analyzed to set the scope of this work. The extensive review was also necessary to achieve the objective of this study, which was to determine the concept of the iTVAdIP design model. Through the systematic review analysis, this study discovered that none of the previous models emphasized a conceptual design model of interactive television advertising. As a result, the findings showed that the proposed model should contain iTV guidelines, advertising theory, persuasive approaches and impulse purchase elements. In addition, a summary diagram for the development of the proposed model is depicted to provide a clearer understanding of the concepts of the iTVAdIP conceptual design model.
Keywords: impulse purchase, interactive television advertising, human computer interaction, advertising theories
Procedia PDF Downloads 372
1029 Attention Based Fully Convolutional Neural Network for Simultaneous Detection and Segmentation of Optic Disc in Retinal Fundus Images
Authors: Sandip Sadhukhan, Arpita Sarkar, Debprasad Sinha, Goutam Kumar Ghorai, Gautam Sarkar, Ashis K. Dhara
Abstract:
Accurate segmentation of the optic disc is very important for computer-aided diagnosis of several ocular diseases such as glaucoma, diabetic retinopathy, and hypertensive retinopathy. This paper presents an accurate and fast optic disc detection and segmentation method using an attention-based fully convolutional network. The network is trained from scratch using the fundus images of the extended MESSIDOR database, and the trained model is used for segmentation of the optic disc. False positives are removed based on morphological operations and shape features. The result is evaluated using three-fold cross-validation on six public fundus image databases: DIARETDB0, DIARETDB1, DRIVE, AV-INSPIRE, CHASE DB1 and MESSIDOR. The attention-based fully convolutional network is robust and effective for detection and segmentation of the optic disc in images affected by diabetic retinopathy, and it outperforms existing techniques.
Keywords: attention-based fully convolutional network, optic disc detection and segmentation, retinal fundus image, screening of ocular diseases
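Segmentation quality in studies like this is typically scored with an overlap metric such as the Dice coefficient (the abstract does not name its metric, so this is an assumption). A minimal sketch on toy masks standing in for optic-disc segmentations:

```python
import numpy as np

def dice(pred, truth):
    """Dice overlap between a binary segmentation and ground truth:
    2 * |A ∩ B| / (|A| + |B|)."""
    pred, truth = np.asarray(pred, bool), np.asarray(truth, bool)
    inter = np.logical_and(pred, truth).sum()
    return 2.0 * inter / (pred.sum() + truth.sum())

# Toy 4x4 masks: ground truth has 4 pixels, prediction has 6, 4 overlap.
truth = np.zeros((4, 4), int); truth[1:3, 1:3] = 1
pred  = np.zeros((4, 4), int); pred[1:3, 1:4] = 1
print(round(dice(pred, truth), 3))  # → 0.8
```

In a three-fold cross-validation, such a score would be averaged over the held-out folds of each database.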
Procedia PDF Downloads 143
1028 The Inclusive Human Trafficking Checklist: A Dialectical Measurement Methodology
Authors: Maria C. Almario, Pam Remer, Jeff Resse, Kathy Moran, Linda Theander Adam
Abstract:
The identification of victims of human trafficking and the consequent provision of services are characterized by a significant disconnection between the estimated prevalence of this issue and the number of cases identified. This poses a tremendous problem for human rights advocates, as it prevents data collection, information sharing, allocation of resources and opportunities for international dialogue. The current paper introduces the Inclusive Human Trafficking Checklist (IHTC) as a measurement methodology with theoretical underpinnings derived from dialectic theory. The presence of human trafficking in a person’s life is conceptualized as a dynamic and dialectic interaction between vulnerability and exploitation. The current paper explores the operationalization of exploitation and vulnerability, evaluates the metric qualities of the instrument, evaluates whether there are differences in assessment based on the participant’s profession, level of knowledge, and training, and assesses whether users of the instrument perceive it as useful. A total of 201 participants were asked to rate three vignettes predetermined by experts to qualify as either human trafficking cases or not. The participants were placed in three conditions: business as usual, and utilization of the IHTC with and without training. The results revealed a statistically significant level of agreement between the experts’ diagnosis and the application of the IHTC, with an improvement of 40% in identification when compared with the business-as-usual condition. While there was an improvement in identification in the group with training, the difference was found to have a small effect size. Participants who utilized the IHTC showed an increased ability to identify elements of identity-based vulnerabilities as well as elements of fraud, which, according to the results, are distinctive variables in cases of human trafficking.
In terms of perceived utility, the results revealed higher mean scores for the groups utilizing the IHTC when compared to the business-as-usual condition. These findings suggest that the IHTC improves appropriate identification of cases and that it is perceived as a useful instrument. The application of the IHTC as a multidisciplinary instrument that can be utilized in legal and human-services settings is discussed as a pivotal piece of helping victims restore their sense of dignity and advocate for legal, physical, and psychological reparations. It is noteworthy that this study was conducted with a sample in the United States and later re-tested in Colombia. The implications of the instrument for treatment conceptualization and intervention in human trafficking cases are discussed as opportunities for enhancing victim well-being, restoration, engagement, and activism. With the idea that what is personal is also political, we believe that careful observation and data collection in specific cases can inform new areas of human rights activism.
Keywords: exploitation, human trafficking, measurement, vulnerability, screening
Procedia PDF Downloads 331
1027 Leakage Current Analysis of FinFET Based 7T SRAM at 32nm Technology
Authors: Chhavi Saxena
Abstract:
FinFETs can be a replacement for bulk-CMOS transistors in many different designs. Their low leakage/standby power makes FinFETs a desirable option for memory sub-systems. Memory modules are widely used in most digital and computer systems. Leakage power is very important in memory cells, since most memory applications access only one or very few memory rows at a given time. As technology scales down, the importance of leakage current and power analysis for memory design is increasing. In this paper, we explore an option for low-power interconnect synthesis at the 32nm node and beyond, using Fin-type Field-Effect Transistors (FinFETs), which are a promising substitute for bulk CMOS at the considered gate lengths. We consider a mechanism for improving FinFET efficiency, called variable supply voltage schemes. We illustrate the design and implementation of a FinFET-based 4x4 SRAM cell array by means of a one-bit 7T SRAM. The FinFET-based 7T SRAM has been designed, and analyses have been carried out for leakage current, dynamic power, and delay. For the validation of our design approach, the outputs of the FinFET SRAM array have been compared with a standard CMOS SRAM, and significant improvements are obtained in the proposed model.
Keywords: FinFET, 7T SRAM cell, leakage current, delay
Procedia PDF Downloads 455
1026 Scalable Performance Testing: Facilitating The Assessment Of Application Performance Under Substantial Loads And Mitigating The Risk Of System Failures
Authors: Solanki Ravirajsinh
Abstract:
In the software testing life cycle, failing to conduct thorough performance testing can result in significant losses for an organization due to application crashes and improper behavior under high user loads in production. Simulating large volumes of requests, such as 5 million within 5-10 minutes, is challenging without a scalable performance testing framework. Leveraging cloud services to implement a performance testing framework makes it feasible to handle 5-10 million requests in just 5-10 minutes, helping organizations ensure their applications perform reliably under peak conditions. Implementing a scalable performance testing framework using cloud services and tools like JMeter, EC2 instances (virtual machines), CloudWatch logs (for monitoring errors and logs), EFS (a file storage system), and security groups offers several key benefits for organizations. Creating a performance testing framework with this approach helps optimize resource utilization, enables effective benchmarking, increases reliability, and saves costs by resolving performance issues before the application is released. In performance testing, a master-slave framework facilitates distributed testing across multiple EC2 instances to emulate many concurrent users and efficiently handle high loads. The master node orchestrates the test execution by coordinating with multiple slave nodes to distribute the workload. Slave nodes execute the test scripts provided by the master node, with each node handling a portion of the overall user load and generating requests to the target application or service. By leveraging JMeter's master-slave framework in conjunction with cloud services like EC2 instances, EFS, CloudWatch logs, security groups, and command-line tools, organizations can achieve superior scalability and flexibility in their performance testing efforts. In this master-slave framework, JMeter must be installed on both the master and each slave EC2 instance.
The master EC2 instance functions as the "brain," while the slave instances operate as the "body parts." The master directs each slave to execute a specified number of requests. Upon completion of the execution, the slave instances transmit their results back to the master. The master then consolidates these results into a comprehensive report detailing metrics such as the number of requests sent, encountered errors, network latency, response times, server capacity, throughput, and bandwidth. Leveraging cloud services, the framework benefits from automatic scaling based on the volume of requests. Notably, integrating cloud services allows organizations to handle more than 5-10 million requests within 5 minutes, depending on the server capacity of the hosted website or application.
Keywords: identifying application crashes under heavy load, JMeter with cloud services, scalable performance testing, JMeter master-slave using cloud services
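The master-slave flow described above can be sketched in a few lines: the master splits a total request count across slave nodes, each slave executes its share, and the master consolidates the per-node metrics into one report. This is a minimal illustrative sketch, not JMeter itself; all names (`SlaveResult`, `run_slave`, `consolidate`) are hypothetical stand-ins for the real distributed machinery.

```python
# Minimal sketch of master-slave load distribution and result consolidation.
# This models the coordination logic only; a real setup would invoke JMeter
# on each slave EC2 instance.
from dataclasses import dataclass

@dataclass
class SlaveResult:
    requests_sent: int
    errors: int
    avg_response_ms: float

def partition_load(total_requests: int, num_slaves: int) -> list[int]:
    """Master step: split total_requests as evenly as possible across slaves."""
    base, rem = divmod(total_requests, num_slaves)
    return [base + (1 if i < rem else 0) for i in range(num_slaves)]

def run_slave(share: int) -> SlaveResult:
    # Stand-in for a slave node executing its portion of the test plan.
    return SlaveResult(requests_sent=share, errors=0, avg_response_ms=120.0)

def consolidate(results: list[SlaveResult]) -> dict:
    """Master step: merge per-slave metrics into one report."""
    total = sum(r.requests_sent for r in results)
    errors = sum(r.errors for r in results)
    # Weighted average so busier slaves count proportionally more.
    avg_ms = sum(r.avg_response_ms * r.requests_sent for r in results) / total
    return {"requests": total, "errors": errors, "avg_response_ms": avg_ms}

shares = partition_load(5_000_000, 8)          # 5M requests over 8 slaves
report = consolidate([run_slave(s) for s in shares])
```

The same partition-then-aggregate shape underlies the automatic scaling benefit: adding slave instances only changes `num_slaves`, not the orchestration logic.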
Procedia PDF Downloads 30
1025 In Response to Worldwide Disaster: Academic Libraries’ Functioning During COVID-19 Pandemic Without a Policy
Authors: Dalal Albudaiwi, Mike Allen, Talal Alhaji, Shahnaz Khadimehzadah
Abstract:
As a pandemic, COVID-19 has impacted the whole world since November 2019. In other words, every organization, industry, and institution has been negatively affected by the coronavirus. The uncertainty of how long the pandemic would last caused chaos at all levels. As with any other institution, public libraries were affected and transitioned to online services and resources. Internationally, it has been observed that some public libraries were well prepared for such disasters as the pandemic; nevertheless, collections, users, services, technologies, staff, and budgets were all influenced. Public libraries' policies did not mention any plan for such a pandemic; instead, the guidelines contain several rules about disasters in general, such as natural disasters. In this pandemic situation, libraries have found themselves in various uneasy circumstances. However, it has always been apparent to public libraries what role they play in serving their communities, in excellent and critical times alike. This dwells on the traditional role public libraries play in providing information services and sources to satisfy their communities' information needs, remarkably increasing people's awareness of the importance of informational enrichment and enhancing society's skills in dealing with information and information sources. Under critical circumstances, libraries play a different role: it goes beyond the traditional part of information provider to the untraditional role of a social institution that serves the community with whatever capabilities it has. This study takes two significant directions. The first focuses on investigating how libraries have responded to COVID-19 and how they manage disasters within their organization. The second direction focuses on how libraries help their communities act during disasters and recover from the consequences.
The current study examines how libraries prepare for disasters and the role of public libraries during disasters. We also propose "measures" as a model that libraries can use to evaluate the effectiveness of their response to disasters. We intend to focus on how libraries responded to this new disaster. Therefore, this study aims to develop a comprehensive policy that includes responding to a crisis such as COVID-19. An analytical lens inside the libraries as organizations and outside the organizations' walls will be documented, based on analyzing disaster-related literature published in LIS publications. The study employs content analysis (CA) methodology, which is widely used in library and information science. The critical contribution of this work lies in the solutions it provides to libraries and planners for preparing crisis-management plans and policies, specifically for facing a new global disaster such as the COVID-19 pandemic. Moreover, the study will help library directors evaluate their strategies and improve them properly. The significance of this study lies in guiding libraries' directors to enhance the goals of their libraries with respect to crucial issues such as saving time, avoiding losses, saving budget, acting quickly during a crisis, maintaining the libraries' role during pandemics, finding the best response to disasters, and creating a plan/policy as a sample for all libraries.
Keywords: COVID-19, policy, preparedness, public libraries
Procedia PDF Downloads 83
1024 Calculation of Orbital Elements for Sending Interplanetary Probes
Authors: Jorge Lus Nisperuza Toledo, Juan Pablo Rubio Ospina, Daniel Santiago Umana, Hector Alejandro Alvarez
Abstract:
This work develops and implements computational codes to calculate optimal launch trajectories for sending a probe from the Earth to different planets of the Solar System, making use of Hohmann and non-Hohmann trajectories and gravitational assistance at intermediate steps. Specifically, the orbital elements, graphs, and dynamic simulations of the trajectories for sending a probe from the Earth toward the planets Mercury, Venus, Mars, Jupiter, and Saturn are obtained. A detailed study was made of the state vectors of position and orbital velocity of the considered planets in order to determine the optimal trajectories of the probe. For this purpose, computer codes were developed and implemented to obtain the orbital elements of the Mariner 10 (Mercury), Magellan (Venus), Mars Global Surveyor (Mars), and Voyager 1 (Jupiter and Saturn) missions, as an exercise in corroborating the algorithms. This exercise validates the computational codes, allowing the orbital elements and trajectory simulations of three future interplanetary missions with specific launch windows to be found.
Keywords: gravitational assistance, Hohmann trajectories, interplanetary mission, orbital elements
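The Hohmann-type transfer at the core of the abstract reduces to two impulsive burns on an ellipse tangent to both orbits. A minimal sketch, assuming circular coplanar heliocentric orbits with mean radii, is below; the values are illustrative and this is not the authors' actual code.

```python
# Hohmann transfer sketch: two burns and flight time, circular coplanar orbits.
import math

MU_SUN = 1.32712440018e11  # km^3/s^2, solar gravitational parameter

def hohmann(r1_km: float, r2_km: float):
    """Return (dv1, dv2, transfer_days) for a Hohmann transfer r1 -> r2."""
    a_t = (r1_km + r2_km) / 2.0  # semi-major axis of the transfer ellipse
    # Burn 1: leave the inner circular orbit onto the transfer ellipse.
    dv1 = math.sqrt(MU_SUN / r1_km) * (math.sqrt(2 * r2_km / (r1_km + r2_km)) - 1)
    # Burn 2: circularize at the outer orbit.
    dv2 = math.sqrt(MU_SUN / r2_km) * (1 - math.sqrt(2 * r1_km / (r1_km + r2_km)))
    # Time of flight is half the transfer-ellipse period.
    t_days = math.pi * math.sqrt(a_t ** 3 / MU_SUN) / 86400.0
    return dv1, dv2, t_days

# Earth -> Mars with mean orbit radii (km): roughly 2.9 + 2.6 km/s and ~259 days.
dv1, dv2, days = hohmann(1.496e8, 2.279e8)
```

The non-Hohmann and gravity-assist cases in the paper generalize this by patching conic segments, but the two-burn computation above is the baseline against which those trajectories are compared.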
Procedia PDF Downloads 183
1023 Analysis of Genomics Big Data in Cloud Computing Using Fuzzy Logic
Authors: Mohammad Vahed, Ana Sadeghitohidi, Majid Vahed, Hiroki Takahashi
Abstract:
In the genomics field, huge amounts of data have been produced by next-generation sequencers (NGS). Data volumes are growing very rapidly; it is postulated that more than one billion bases will be produced per year in 2020. The growth rate of produced data is much faster than Moore's law in computer technology. This makes it more difficult to deal with genomics data, such as storing data, searching for information, and finding hidden information. It is necessary to develop an analysis platform for genomics big data. Newly developed cloud computing enables us to deal with big data more efficiently. Hadoop is one of the distributed computing frameworks and lies at the core of Big Data as a Service (BDaaS). Although many services have adopted this technology, e.g., Amazon, there are few applications in the biology field. Here, we propose a new algorithm to deal more efficiently with genomics big data, e.g., sequencing data. Our algorithm consists of two parts: first, BDaaS is applied to handle the data more efficiently; second, a hybrid method of MapReduce and fuzzy logic is applied for data processing. This step can be parallelized in implementation. Our algorithm has great potential in the computational analysis of genomics big data, e.g., de novo genome assembly and sequence similarity search. We will discuss our algorithm and its feasibility.
Keywords: big data, fuzzy logic, MapReduce, Hadoop, cloud computing
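The two-part idea (a MapReduce pass over sequencing data followed by fuzzy-logic processing) can be illustrated with a toy k-mer count: the map step emits k-mers per read, the reduce step sums them, and a fuzzy membership function grades "high-abundance" k-mers instead of applying a hard threshold. This is a purely illustrative sketch, not the authors' Hadoop/BDaaS implementation; the cutoffs `low` and `high` are assumed values.

```python
# Toy MapReduce-style k-mer counting with a fuzzy "high abundance" grade.
from collections import Counter
from itertools import chain

def map_kmers(read: str, k: int = 3):
    """Map step: emit every k-mer in a read."""
    return [read[i:i + k] for i in range(len(read) - k + 1)]

def reduce_counts(mapped) -> Counter:
    """Reduce step: sum k-mer occurrences across all reads."""
    return Counter(chain.from_iterable(mapped))

def high_abundance_grade(count: int, low: int = 2, high: int = 6) -> float:
    """Fuzzy membership in 'high abundance': 0 at or below low, 1 at or
    above high, linear ramp in between (a standard ramp membership)."""
    if count <= low:
        return 0.0
    if count >= high:
        return 1.0
    return (count - low) / (high - low)

reads = ["ATGATGATG", "GATGA", "TTTT"]
counts = reduce_counts(map_kmers(r) for r in reads)   # ATG:4, TGA:3, GAT:3, TTT:2
grades = {kmer: high_abundance_grade(c) for kmer, c in counts.items()}
```

Because each read is mapped independently, the map step parallelizes trivially across Hadoop workers, which is exactly what makes the cloud deployment attractive for NGS-scale inputs.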
Procedia PDF Downloads 300
1022 An Empirical Study on Switching Activation Functions in Shallow and Deep Neural Networks
Authors: Apoorva Vinod, Archana Mathur, Snehanshu Saha
Abstract:
Though there exists a plethora of Activation Functions (AFs) used in single- and multiple-hidden-layer Neural Networks (NN), their behavior has always raised curiosity, whether used in combination or singly. The popular AFs (Sigmoid, ReLU, and Tanh) have performed prominently well for shallow and deep architectures. Most of the time, AFs are used singly in multi-layered NNs, and, to the best of our knowledge, their performance has never been studied and analyzed deeply when used in combination. In this manuscript, we experiment with multi-layered NN architectures (both shallow and deep: Convolutional NN and VGG16) and investigate how well the network responds to two different AFs (Sigmoid-Tanh, Tanh-ReLU, ReLU-Sigmoid) used alternately, against a traditional single-AF combination (Sigmoid-Sigmoid, Tanh-Tanh, ReLU-ReLU). Our results show that using two different AFs, the network achieves better accuracy, substantially lower loss, and faster convergence on 4 computer vision (CV) and 15 non-CV (NCV) datasets. When using different AFs, not only was the accuracy greater by 6-7%, but we also accomplished convergence twice as fast. We present a case study to investigate the probability of networks suffering vanishing and exploding gradients when using two different AFs. Additionally, we theoretically show that a composition of two or more AFs satisfies the Universal Approximation Theorem (UAT).
Keywords: activation function, universal approximation function, neural networks, convergence
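The alternation scheme is simply a forward pass that cycles through a list of AFs instead of reusing one. A minimal pure-Python sketch with fixed toy weights is below; it only illustrates the composition (Tanh-ReLU alternation versus a plain ReLU-ReLU stack), not the paper's training setup or datasets.

```python
# Forward pass through a small dense stack with alternating activations.
import math

def relu(x): return [max(0.0, v) for v in x]
def tanh(x): return [math.tanh(v) for v in x]

def layer(x, w, b):
    """One dense layer: out_j = sum_i x_i * w[i][j] + b[j]."""
    return [sum(xi * w[i][j] for i, xi in enumerate(x)) + b[j]
            for j in range(len(b))]

def forward(x, ws, bs, activations):
    """Apply len(ws) layers, cycling through the given activation list."""
    for t, (w, b) in enumerate(zip(ws, bs)):
        x = activations[t % len(activations)](layer(x, w, b))
    return x

w = [[[0.5, -0.3], [0.2, 0.8]]] * 4   # four identical toy 2x2 layers
b = [[0.1, -0.1]] * 4
x = [1.0, -1.0]

out_alt = forward(x, w, b, [tanh, relu])   # Tanh-ReLU alternation
out_single = forward(x, w, b, [relu])      # traditional ReLU-ReLU stack
```

Even with identical weights, the two schemes produce different outputs, which is the degree of freedom the paper exploits: the alternating composition changes the function class the network realizes while keeping the layer count fixed.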
Procedia PDF Downloads 160
1021 Modeling of Hydraulic Networking of Water Supply Subsystem Case of Addis Ababa
Authors: Solomon Weldegebriel Gebrelibanos
Abstract:
Water is one of the most important substances in human life, and its cost and availability directly affect human welfare. Water comes from rainfall and runoff; part of it is stored in rivers, ponds, and large water bodies, including seas and oceans, while the remaining portion infiltrates into the ground and is stored in aquifers. Water serves human beings in various ways, including irrigation, water supply, hydropower, and so on. Water supply is the main pillar of water service to human beings. The water supply distribution in Addis Ababa originates from Legedadi, Akakai, and Gefersa. The objective of the study is to measure the performance of water supply distribution in Addis Ababa city. The water supply distribution model is developed with computer-aided design software. The model can analyze operational changes, water losses, and the performance of the network. The two design criteria employed to analyze the network system are velocity and pressure. The results show that customers are using water at high pressure with low demand, and that the water distribution system is older than its expected service life, with considerable leakage. Hence, the study recommends that fitting pressure valves and adding new distribution facilities can improve the performance of the water supply system.
Keywords: distribution, model, pressure, velocity
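The two design checks named in the abstract (velocity and pressure) can be sketched directly: pipe velocity follows from v = Q / A, and each node is tested against acceptable ranges. The limit values below (0.6-2.0 m/s, 15-60 m head) are common textbook design ranges, assumed for illustration; they are not the thresholds used in the authors' model.

```python
# Velocity and pressure design checks for a water distribution node.
import math

def pipe_velocity(flow_m3s: float, diameter_m: float) -> float:
    """v = Q / A for a circular pipe of the given internal diameter."""
    area = math.pi * diameter_m ** 2 / 4.0
    return flow_m3s / area

def check_node(velocity_ms: float, pressure_head_m: float,
               v_range=(0.6, 2.0), p_range=(15.0, 60.0)) -> list:
    """Return the list of violated criteria (empty list = node passes)."""
    issues = []
    if not v_range[0] <= velocity_ms <= v_range[1]:
        issues.append("velocity")
    if not p_range[0] <= pressure_head_m <= p_range[1]:
        issues.append("pressure")
    return issues

v = pipe_velocity(0.05, 0.3)                  # 50 L/s through a 300 mm pipe
issues = check_node(v, pressure_head_m=75.0)  # over-pressure -> leakage risk
```

A node flagged for "pressure" is exactly the high-pressure/low-demand condition the study reports, which motivates its recommendation of pressure valves.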
Procedia PDF Downloads 138
1020 Analysis of Stress and Strain in Head Based Control of Cooperative Robots through Tetraplegics
Authors: Jochen Nelles, Susanne Kohns, Julia Spies, Friederike Schmitz-Buhl, Roland Thietje, Christopher Brandl, Alexander Mertens, Christopher M. Schlick
Abstract:
Industrial robots, as part of highly automated manufacturing, have recently been developed into cooperative (lightweight) robots. This offers the opportunity to use them as assistance robots and to improve the participation in professional life of disabled or handicapped people such as tetraplegics. Robots under development are located within a cooperation area together with the working person at the same workplace. This cooperation area is an area where the robot and the working person can perform tasks at the same time; thus, working people and robots operate in immediate proximity. Considering the physical restrictions and limited mobility of tetraplegics, hands-free robot control could be an appropriate approach for a cooperative assistance robot. To meet these requirements, the research project MeRoSy (human-robot synergy) develops methods for cooperative assistance robots based on measuring the head movements of the working person. One research objective is to improve the participation in professional life of people with disabilities, in particular mobility-impaired persons (e.g., wheelchair users or tetraplegics), who are denied participation in a self-determined working life. This raises the research question of how a human-robot cooperation workplace can be designed for hands-free robot control. Here, the example of a library scenario is demonstrated. In this paper, an empirical study that focuses on the impact of head-movement-related stress is presented. 12 test subjects with tetraplegia participated in the study. Tetraplegia, also known as quadriplegia, is the worst type of spinal cord injury. In the experiment, three different basic head movements were examined. Data on head posture were collected by a motion capture system; muscle activity was measured via surface electromyography, and subjective mental stress was assessed via a mental effort questionnaire.
The muscle activity was measured for the sternocleidomastoid (SCM), the upper trapezius (UT, or trapezius pars descendens), and the splenius capitis (SPL) muscles. For this purpose, six non-invasive surface electromyography sensors were mounted on the head and neck area. An analysis of variance shows differentiated muscular strains depending on the type of head movement. Systematically investigating the influence of different basic head movements on the resulting strain is an important issue for relating the research results to other scenarios. At the end of this paper, a conclusion is drawn and an outlook on future work is presented.
Keywords: assistance robot, human-robot interaction, motion capture, stress-strain-concept, surface electromyography, tetraplegia
Procedia PDF Downloads 316
1019 The Interplay between Technology and Culture in Inbound Call Center Industry
Authors: Joseph Reylan Viray, Kriztine R. Viray
Abstract:
Call center conversations, more than the business dimensions they normally manifest, are interactions between human beings. These are communication exchanges packed with psychological, cultural, and social dimensions that affect the specific experience of the parties. The increasing development of information and communication technology over the past decades has brought about important advantages and corresponding disadvantages in the process of communicational transactions in the call center industry. It has been established that technology is so powerful that it strongly affects, among others, the call center business. In the present study, the authors explore the interplay between the technology utilized by the industry and the cultural orientations of both the call center agents and their customers in the process of communication exchanges. Specifically, the paper seeks to (1) describe the interplay between culture and technology in the inbound call center industry as it affects the communication exchange of agents and customers; (2) understand the nature and dynamics of the call center industry with regard to the cultural dimensions of Hofstede; and (3) come up with a simple study in which the cross-cultural aspect of the call center industry is highlighted and which provides necessary knowledge to the stakeholders. Cognizant of the complexity of the topic, the researchers employed Hofstede's cultural dimensions; another theory used in this study is Computer-Mediated Communication Theory.
Keywords: call center industry, culture, Hofstede, CMT, technology
Procedia PDF Downloads 354
1018 Design of Speed Bump Recognition System Integrated with Adjustable Shock Absorber Control
Authors: Ming-Yen Chang, Sheng-Hung Ke
Abstract:
This research focuses on the development of a speed bump identification system for real-time control of adjustable shock absorbers in vehicular suspension systems. The study initially involved the collection of images of various speed bumps and rubber speed bump profiles found on roadways. These images were utilized for training and recognition through the deep learning object detection algorithm YOLOv5. Subsequently, the trained speed bump identification program was integrated with an in-vehicle camera system for live image capture during driving. These images were instantly transmitted to a computer for processing. Using the principles of monocular vision ranging, the distance between the vehicle and an approaching speed bump was determined. The appropriate control distance was established through both practical vehicle measurements and theoretical calculations. In conjunction with the electronically adjustable shock absorbers equipped in the vehicle, a shock absorber control system was devised to dynamically adapt the damping force just prior to encountering a speed bump. This system effectively mitigates passenger discomfort and enhances ride quality.
Keywords: adjustable shock absorbers, image recognition, monocular vision ranging, ride
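The monocular-ranging step above reduces, under a pinhole-camera model, to a single similarity relation: an object of known real-world size that spans h pixels in the image lies at distance d = f_px * H_real / h. A minimal sketch follows; the focal length and bump width used are illustrative assumptions, not the paper's calibration values.

```python
# Pinhole-model distance estimate from a known-size object in the image.
def monocular_distance(focal_px: float, real_size_m: float,
                       pixel_size: float) -> float:
    """Distance to an object of known physical size:
    d = focal_length_in_pixels * real_size / observed_size_in_pixels."""
    return focal_px * real_size_m / pixel_size

# Example: assumed 800 px focal length, a speed bump 0.35 m wide spanning
# 70 px in the current frame -> the bump is roughly 4 m ahead, so the
# controller can begin softening the dampers if this is inside the
# established control distance.
d = monocular_distance(focal_px=800.0, real_size_m=0.35, pixel_size=70.0)
```

As the vehicle approaches, the bump grows in the image, the estimated distance shrinks, and the damping-force adjustment is triggered once d falls below the control distance.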
Procedia PDF Downloads 67
1017 In vitro Effects of Viscum album on the Functionality of Rabbit Spermatozoa
Authors: Marek Halenár, Eva Tvrdá, Simona Baldovská, Ľubomír Ondruška, Peter Massányi, Adriana Kolesárová
Abstract:
This study aimed to assess the in vitro effects of different concentrations of Viscum album extract on the motility, viability, and reactive oxygen species (ROS) production of rabbit spermatozoa over different time periods (0, 2, and 8 h). Spermatozoa motility was assessed using the CASA (Computer Aided Sperm Analysis) system. Cell viability was evaluated using the metabolic activity MTT assay, and luminol-based luminometry was applied to quantify ROS formation. The CASA analysis revealed that low Viscum concentrations were able to prevent a rapid decline of spermatozoa motility, especially at concentrations ranging between 1 and 5 µg/mL (P<0.05 with respect to time 8 h). At the same time, concentrations ranging between 1 and 100 µg/mL of the extract led to a significant preservation of cell viability (P<0.05 in the case of 5, 50, and 100 µg/mL; P<0.01 with respect to 1 and 10 µg/mL, time 8 h). 1 and 5 µg/mL of the extract exhibited antioxidant characteristics, translated into a significant reduction of ROS production, particularly notable at time 8 h (P<0.01). The results indicate that Viscum extract is capable of delaying the damage inflicted on spermatozoa by the in vitro environment.
Keywords: CASA, mistletoe, mitochondrial activity, motility, reactive oxygen species, rabbits, spermatozoa, Viscum album
Procedia PDF Downloads 394
1016 Offline Signature Verification in Punjabi Based On SURF Features and Critical Point Matching Using HMM
Authors: Rajpal Kaur, Pooja Choudhary
Abstract:
Biometrics, which refers to identifying an individual based on his or her physiological or behavioral characteristics, has the capability to reliably distinguish between an authorized person and an imposter. Signature recognition systems can be categorized as offline (static) and online (dynamic). This paper presents a SURF-feature-based offline signature recognition system trained with low-resolution scanned signature images. The signature of a person is an important biometric attribute of a human being, which can be used to authenticate human identity. A signature can be handled as an image and recognized using computer vision and HMM techniques. With modern computers, there is a need to develop fast algorithms for signature recognition. Multiple techniques have been defined for signature recognition, leaving considerable scope for research. In this paper, (static signature) offline signature recognition and verification using SURF features with an HMM is proposed, where the signature is captured and presented to the user in an image format. Signatures are verified based on parameters extracted from the signature using various image processing techniques. The offline signature verification and recognition are implemented on the MATLAB platform. This work has been tested and found suitable for its purpose. The proposed method performs better than other recently proposed methods.
Keywords: offline signature verification, offline signature recognition, signatures, SURF features, HMM
Procedia PDF Downloads 385
1015 Determining Fire Resistance of Wooden Construction Elements through Experimental Studies and Artificial Neural Network
Authors: Sakir Tasdemir, Mustafa Altin, Gamze Fahriye Pehlivan, Sadiye Didem Boztepe Erkis, Ismail Saritas, Selma Tasdemir
Abstract:
Artificial intelligence applications are commonly used in industry in many fields, in parallel with developments in computer technology. In this study, a fire room was prepared to test the resistance of wooden construction elements, and with this mechanism, experiments on polished materials were carried out. Utilizing the experimental data, an artificial neural network (ANN) was modeled in order to evaluate the final cross-sections of the wooden samples remaining after the fire. In the modelling, experimental data obtained from the fire room were used. In the system developed, the initial weight of the samples (ws, g), preliminary cross-section (pcs, mm²), fire time (ft, minutes), and fire temperature (t, °C) were taken as input parameters, and the final cross-section (fcs, mm²) as the output parameter. When the results obtained from the ANN and the experimental data are compared after statistical analyses, the data of the two groups are determined to be coherent and are seen to have no meaningful difference between them. As a result, it is seen that ANNs can be safely used in determining the cross-sections of wooden materials after fire, which avoids many disadvantages.
Keywords: artificial neural network, final cross-section, fire retardant polishes, fire safety, wood resistance
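The regression task described above (four normalized inputs mapped to one output) can be sketched with the simplest possible learner: a one-neuron linear model fitted by batch gradient descent on synthetic data. This is a toy stand-in only; the study's actual ANN architecture and fire-room measurements are not shown, and the synthetic target below merely assumes the cross-section shrinks with fire time and temperature.

```python
# Toy gradient-descent regression: predict final cross-section (fcs) from
# normalized (ws, pcs, ft, t). Synthetic data; illustrative only.
import random

random.seed(0)

# Synthetic samples in [0, 1]: target shrinks with fire time and temperature.
data = [([w, pcs, ft, t], pcs * (1.0 - 0.5 * ft) * (1.0 - 0.3 * t))
        for w, pcs, ft, t in
        ((random.random(), random.random(), random.random(), random.random())
         for _ in range(200))]

weights = [0.0, 0.0, 0.0, 0.0]
bias = 0.0
lr = 0.1

def predict(x):
    return sum(wi * xi for wi, xi in zip(weights, x)) + bias

def mse():
    return sum((predict(x) - y) ** 2 for x, y in data) / len(data)

loss_before = mse()
for _ in range(300):                     # plain batch gradient descent
    grad_w = [0.0] * 4
    grad_b = 0.0
    for x, y in data:
        err = predict(x) - y
        for i in range(4):
            grad_w[i] += 2 * err * x[i] / len(data)
        grad_b += 2 * err / len(data)
    weights = [w - lr * g for w, g in zip(weights, grad_w)]
    bias -= lr * grad_b
loss_after = mse()
```

A real ANN adds hidden layers and nonlinear activations on top of exactly this loop, which is what lets it capture the nonlinear charring behavior the linear toy model can only approximate.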
Procedia PDF Downloads 385
1014 Human-Computer Interaction Pluriversal Framework for Ancestral Medicine App in Bogota: Asset-Based Design Case Study
Authors: Laura Niño Cáceres, Daisy Yoo, Caroline Hummels
Abstract:
COVID-19 accelerated digital healthcare technology usage in many countries, such as Colombia, whose digital healthcare vision and projects are proof of this. However, in a country with a significant indigenous and Afro-Colombian cultural heritage, only some parts of the country are willing to follow the proposed digital Western approach to health. Our paper presents the national healthcare system's digital narrative, which we contrast with the micro-narrative of an Afro-Colombian ethnomedicine unit in Bogota called Kilombo Yumma. This ethnomedical unit is building its own mobile app to safeguard and represent its ancestral medicine practices in local and national healthcare information systems. Kilombo Yumma is keen on promoting its beliefs and practices, which have been passed on through oral traditions and currently exist in the hands of a few older women. We unraveled their ambitions, core beliefs, and practices through asset-based design. These assets outlined pluriversal and decolonizing forms of digital healthcare, increasing social justice and connecting Western and ancestral medicine through digital opportunities in HCI.
Keywords: asset-based design, mobile app, decolonizing HCI, Afro-Colombian ancestral medicine
Procedia PDF Downloads 80