Search results for: OghmaNano software
1576 Voices and Cries Across the Generations: British Bangladeshis’ Responses to Combat and Resist Stigmatisation
Authors: Mubassir Hussain
Abstract:
British Bangladeshis are one of the most marginalised and lowest socioeconomic groups in the UK. Their silent hardships have crystallised the stigma surrounding them. Understanding the intimate workings of this stigma can unravel its profound psychological impact, which has hindered their social and economic progress, and slowly move them out of the ‘victim’ mindset. Although community-based studies have been conducted to understand the nuances of British Bangladeshis’ stigma experiences, these examinations are broad and touch only the surface. They do not investigate the private family dynamics behind closed doors, how family members feel about and engage with stigma, or the use and justification of their responses. The main objectives of this qualitative research are to observe how attitudes towards stigma differ across generations, analyse the variety and frequency of reactions across age cohorts, gender, and social class, and examine how their actions and identities shape their responses. The data will be collected through embedded ethnography and analysed using qualitative software. Through this analysis, the research hypothesises that the older generation will engage more in the ‘isolation’, ‘not responding’, and ‘management of the self’ response categories, whereas subsequent generations will employ ‘confrontation’, ‘demonstrating competence’, and ‘management of the self’ reactions. From these findings, the study anticipates an assortment of compelling and diverse responses and effects. Ultimately, the family members and community are responsible for their own futures and have the power to make these life-altering changes. Their collective experiences and values shape how individuals perceive and respond to stigma, racism, Islamophobia and discrimination, through either silence, resilience or confrontation.
Keywords: British Bangladeshi, stigma, racism, discrimination, Islamophobia, responses
Procedia PDF Downloads 15
1575 Symmetric Key Encryption Algorithm Using Indian Traditional Musical Scale for Information Security
Authors: Aishwarya Talapuru, Sri Silpa Padmanabhuni, B. Jyoshna
Abstract:
Cryptography helps in preventing threats to information security by providing various algorithms. This study introduces a new symmetric key encryption algorithm for information security which is linked with "raagas", the traditional Indian scales and patterns of music notes. The algorithm takes the plain text as input and starts its encryption process. It then randomly selects a raaga from a list of raagas assumed to be present with both the sender and the receiver. The plain text is associated with the selected raaga, and an intermediate cipher text is formed as the algorithm converts the plain text characters into other characters according to its rules. This intermediate code or cipher text is arranged in various patterns across three different rounds of encryption. The total number of rounds in the algorithm is a multiple of 3: the output of the first sequence of three rounds is passed again as input to the same sequence of rounds, recursively, until the total number of rounds of encryption has been performed. The raaga selected by the algorithm and the number of rounds performed are specified at an arbitrary location in the key, in addition to other important information regarding the rounds of encryption embedded in the key, which is known to the sender and interpreted only by the receiver, thereby making the algorithm hack-proof. The key can be constructed of any number of bits without any restriction on its size. A software application is also developed to demonstrate this process of encryption, which dynamically takes the plain text as input and readily generates the cipher text as output. Therefore, this algorithm stands as one of the strongest tools for information security.
Keywords: cipher text, cryptography, plaintext, raaga
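As a rough illustration of the scheme described, the sketch below implements a raaga-keyed substitution followed by a per-round rearrangement, iterated over a multiple-of-three round count. The raaga note tables, the byte-wise shift, and the reversal pattern are all invented stand-ins for illustration, not the authors' actual rules:

```python
# Hypothetical sketch of a raaga-keyed substitution cipher.
# Note offsets and the per-round "pattern" (a reversal) are assumptions.
RAAGAS = {
    "bilawal": [0, 2, 4, 5, 7, 9, 11],   # illustrative note offsets
    "bhairav": [0, 1, 4, 5, 7, 8, 11],
}

def encrypt(plain: str, raaga: str, rounds: int = 3) -> str:
    assert rounds % 3 == 0, "total rounds must be a multiple of 3"
    notes = RAAGAS[raaga]
    text = plain
    for _ in range(rounds):
        # substitution: shift each character by the raaga note at its position
        shifted = [chr((ord(c) + notes[i % len(notes)]) % 256)
                   for i, c in enumerate(text)]
        # rearrangement: a simple reversal stands in for the paper's patterns
        text = "".join(reversed(shifted))
    return text

def decrypt(cipher: str, raaga: str, rounds: int = 3) -> str:
    notes = RAAGAS[raaga]
    text = cipher
    for _ in range(rounds):
        # undo each round: reverse first, then unshift
        unshuffled = "".join(reversed(text))
        text = "".join(chr((ord(c) - notes[i % len(notes)]) % 256)
                       for i, c in enumerate(unshuffled))
    return text
```

Because every round applies the same invertible pair of steps, decryption simply applies the inverse pair the same number of times.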
Procedia PDF Downloads 288
1574 Influence of Foundation Size on Seismic Response of Mid-rise Buildings Considering Soil-Structure-Interaction
Authors: Quoc Van Nguyen, Behzad Fatahi, Aslan S. Hokmabadi
Abstract:
Performance-based seismic design is a modern approach to earthquake-resistant design, shifting the emphasis from “strength” to “performance”. Soil-Structure Interaction (SSI) can influence the performance level of structures significantly. In this paper, a fifteen-storey moment-resisting frame sitting on a shallow foundation (footing) with different sizes is simulated numerically using ABAQUS software. The developed three-dimensional numerical simulation accounts for the nonlinear behaviour of the soil medium by considering the variation of soil stiffness and damping as a function of the shear strain developed in the soil elements during the earthquake. An elastic-perfectly plastic model is adopted to simulate the piles and structural elements. Quiet boundary conditions are assigned to the numerical model, and appropriate interface elements, capable of modelling sliding and separation between the foundation and soil elements, are considered. Numerical results in terms of base shear, lateral deformations, and inter-storey drifts of the structure are compared for soil-structure interaction systems with different foundation sizes as well as for the fixed-base condition (excluding SSI). It can be concluded that conventional design procedures excluding SSI may result in an aggressive design. Moreover, the size of the foundation can influence the dynamic characteristics and seismic response of the building due to SSI and should therefore be given careful consideration in order to ensure a safe and cost-effective seismic design.
Keywords: soil-structure-interaction, seismic response, shallow foundation, abaqus, rayleigh damping
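Since the model uses Rayleigh damping (see the keywords), its two coefficients are typically calibrated so that a target damping ratio is matched at two frequencies. A minimal sketch of that standard calibration, with illustrative numbers rather than values taken from the paper:

```python
import math

def rayleigh_coefficients(f1_hz, f2_hz, zeta):
    """Return (alpha, beta) for C = alpha*M + beta*K so that the damping
    ratio equals `zeta` at the two target frequencies (a common FE input)."""
    w1, w2 = 2 * math.pi * f1_hz, 2 * math.pi * f2_hz
    alpha = 2 * zeta * w1 * w2 / (w1 + w2)   # mass-proportional term
    beta = 2 * zeta / (w1 + w2)              # stiffness-proportional term
    return alpha, beta

def damping_ratio(f_hz, alpha, beta):
    """Resulting damping ratio at any frequency: zeta(w) = a/(2w) + b*w/2."""
    w = 2 * math.pi * f_hz
    return alpha / (2 * w) + beta * w / 2
```

Between the two anchor frequencies the ratio dips slightly below the target, and it grows above them; the anchors are therefore usually chosen to bracket the modes of interest.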
Procedia PDF Downloads 505
1573 Exploring the Factors Affecting the Presence of Farmers’ Markets in Rural British Columbia
Authors: Amirmohsen Behjat, Aleck Ostry, Christina Miewald, Bernie Pauly
Abstract:
Farmers’ markets have become one of the important healthy food suppliers in both rural communities and urban settings. Farmers’ markets are evolving, and their number has increased rapidly in the past decade. Despite this drastic increase, the distribution of farmers’ markets is not even across different areas. The main goal of this study is to explore the socioeconomic, geographic, and demographic variables which affect the establishment of farmers’ markets in rural communities in British Columbia (BC). Thus, data on available farmers’ markets in rural areas were collected from the BC Association of Farmers’ Markets and spatially joined to the BC map at the Dissemination Area (DA) level using ArcGIS software to link the farmers’ markets to the respective communities that they serve. Then, in order to investigate in which rural communities farmers’ markets tend to operate, a binary logistic regression analysis was performed with the availability of farmers’ markets at the DA level as the dependent variable and the Deprivation Index (DI), Metro Influence Zone (MIZ) and population as independent variables. The results indicated that the DI and MIZ variables are not statistically significant, whereas population is the only variable which had a significant contribution in predicting the availability of farmers’ markets in rural BC. Moreover, this study found that farmers’ markets usually do not operate in rural food deserts where other healthy food providers such as supermarkets and grocery stores are non-existent. In conclusion, the presence of farmers’ markets is not associated with the socioeconomic and geographic characteristics of rural communities in BC; rather, farmers’ markets tend to operate in more populated rural communities.
Keywords: farmers’ markets, socioeconomic and demographic variables, metro influence zone, logistic regression, ArcGIS
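The DA-level analysis amounts to a binary logistic regression of market presence on the predictors. A minimal pure-Python gradient-descent fit on synthetic data, with population as the lone predictor (the actual study used DA-level census data, not this toy set):

```python
import math
import random

def fit_logistic(xs, ys, lr=0.01, epochs=5000):
    """Plain gradient-descent fit of P(market = 1) = sigmoid(b0 + b1*x).
    A stand-in for the binary logistic regression described above."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += (p - y)          # gradient of the log-loss w.r.t. b0
            g1 += (p - y) * x      # gradient w.r.t. b1
        b0 -= lr * g0 / n
        b1 -= lr * g1 / n
    return b0, b1

# synthetic DAs: standardized population vs. farmers'-market presence
random.seed(0)
xs = [random.gauss(0, 1) for _ in range(200)]
ys = [1 if (2 * x + random.gauss(0, 1)) > 0 else 0 for x in xs]
b0, b1 = fit_logistic(xs, ys)
```

A positive fitted slope `b1` corresponds to the study's finding that more populated communities are more likely to host a market.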
Procedia PDF Downloads 187
1572 The Impact of Surface Roughness and PTFE/TiF3/FeF3 Additives in Plain ZDDP Oil on the Friction and Wear Behavior Using Thermal and Tribological Analysis under Extreme Pressure Condition
Authors: Gabi N. Nehme, Saeed Ghalambor
Abstract:
The use of titanium fluoride and iron fluoride (TiF3/FeF3) catalysts in combination with polytetrafluoroethylene (PTFE) in plain zinc dialkyldithiophosphate (ZDDP) oil is important for the study of engine tribocomponents and is increasingly a strategy to improve the formation of the tribofilm and to provide low friction and excellent wear protection in reduced-phosphorus plain ZDDP oil. The influence of surface roughness and the concentration of TiF3/FeF3/PTFE were investigated using bearing steel samples dipped in lubricant solution at 100°C for two different heating durations. This paper addresses the effects on the water drop contact angle of different surface finishes after treating them with different lubricant combinations. The calculated water drop contact angles were analyzed using Design of Experiments (DOE) software, and it was determined that a 0.05 μm Ra surface roughness would provide an excellent TiF3/FeF3/PTFE coating for antiwear resistance, as reflected in the scanning electron microscopy (SEM) images and the tribological testing under extreme pressure conditions. Both friction and wear performance depend greatly on the PTFE and catalysts in plain ZDDP oil with 0.05% phosphorus and on the surface finish of the bearing steel. The friction- and wear-reducing effects observed in the tribological tests indicated a better micro-lubrication effect for the 0.05 μm Ra surface roughness treated at 100°C for 24 hours when compared to the 0.1 μm Ra surface roughness with the same treatment.
Keywords: scanning electron microscopy, ZDDP, catalysts, PTFE, friction, wear
Procedia PDF Downloads 349
1571 Analyzing the Effect of Materials’ Selection on Energy Saving and Carbon Footprint: A Case Study Simulation of Concrete Structure Building
Authors: M. Kouhirostamkolaei, M. Kouhirostami, M. Sam, J. Woo, A. T. Asutosh, J. Li, C. Kibert
Abstract:
Construction is one of the most energy-consuming activities in the urban environment and results in a significant amount of greenhouse gas emissions around the world; thus, the impact of the construction industry on global warming is undeniable. Reducing building energy consumption and mitigating carbon production can therefore slow the rate of global warming. The purpose of this study is to determine the amount of energy consumption and carbon dioxide production during the operation phase and the impact of using new shells on energy saving and carbon footprint. A residential building with a reinforced concrete structure was selected in Babolsar, Iran. DesignBuilder software has been used to simulate one year of building operation and calculate the amount of carbon dioxide production and energy consumption in the operation phase of the building. The primary results show the building uses 61,750 kWh of energy each year. Computer simulation analyzes the effect of changing the building shell (using XPS polystyrene and new electrochromic windows) as well as changing the type of lighting on the reduction of energy consumption and the subsequent carbon dioxide production. The results show that the amount of energy and carbon production during building operation is reduced by approximately 70% by applying the proposed changes, cutting emissions to 11,345 kg CO2e/yr. The results of this study help designers and engineers to consider the material selection process as one of the most important stages of design for improving the energy performance of buildings.
Keywords: construction materials, green construction, energy simulation, carbon footprint, energy saving, concrete structure, designbuilder
Procedia PDF Downloads 196
1570 Demand for Care in Primary Health Care in the Governorate of Ariana: Results of a Survey in Ariana Primary Health Care and Comparison with the Last 30 Years
Authors: Chelly Souhir, Harizi Chahida, Hachaichi Aicha, Aissaoui Sihem, Chahed Mohamed Kouni
Abstract:
Introduction: In Tunisia, few studies have attempted to describe the demand for primary care in a standardized and systematic way. The purpose of this study is to describe the main reasons for the demand for care in primary health care, through a survey of the PHCs of the Ariana governorate, and to identify their evolutionary trend compared to the last 30 years, as reported by studies of the same type. Materials and methods: This is a cross-sectional descriptive study of consultants in the primary care facilities of the governorate of Ariana and their use of care, recorded during 2 days of the same week in May 2016 in each of these PHCs. The same data collection sheet was used in all CSBs. The coding of the information was done according to the International Classification of Primary Care (ICPC). The data were entered and analyzed with the EPI Info 7 software. Results: Our study found that the most common ICPC chapters are respiratory (42%) and digestive (13.2%). In 1996, they were respiratory (43.5%) and circulatory (7.8%); in 2000, again respiratory (39.6%) and circulatory (10.9%). In 2002, respiratory (43%) and digestive (10.1%) motives were the most frequent. According to the ICPC, the most common pathologies in our study were acute angina (19%) and acute bronchitis and bronchiolitis (8%). In 1996, they were tonsillitis (21.6%) and acute bronchitis (7.2%). For Ben Abdelaziz in 2000, tonsillitis (14.5%) was followed by acute bronchitis (8.3%). In 2002, acute angina (15.7%) and acute bronchitis and bronchiolitis (11.2%) were the most common. Conclusion: Acute angina and tonsillitis are the most common in all studies conducted in Tunisia.
Keywords: acute angina, classification of primary care, primary health care, tonsillitis, Tunisia
Procedia PDF Downloads 529
1569 Numerical Modelling of Immiscible Fluids Flow in Oil Reservoir Rocks during Enhanced Oil Recovery Processes
Authors: Zahreddine Hafsi, Manoranjan Mishra , Sami Elaoud
Abstract:
Ensuring the maximum recovery rate of oil from reservoir rocks is a challenging task that requires preliminary numerical analysis of the different techniques used to enhance the recovery process. After conventional oil recovery processes, and in order to retrieve the oil left behind after the primary recovery phase, water flooding is one of several techniques used for enhanced oil recovery (EOR). In this research work, EOR via water flooding is numerically modeled, and the hydrodynamic instabilities resulting from immiscible oil-water flow in reservoir rocks are investigated. An oil reservoir is a porous medium consisting of many fractures of tiny dimensions. For modeling purposes, the oil reservoir is considered as a collection of capillary tubes, which provides useful insights into how fluids behave in the reservoir pore spaces. Equations governing oil-water flow in oil reservoir rocks are developed and numerically solved following a finite element scheme. Numerical results are obtained using COMSOL Multiphysics software, whose two-phase Darcy module allows modelling the imbibition process by the injection of water (as the wetting phase) into an oil reservoir. The van Genuchten, Brooks-Corey and Leverett models were considered as retention models, the obtained flow configurations are compared, and the governing parameters are discussed. For the considered retention models, it was found that the onset of instabilities, viz. the fingering phenomenon, is highly dependent on the capillary pressure as well as the boundary conditions, i.e., the inlet pressure and the injection velocity.
Keywords: capillary pressure, EOR process, immiscible flow, numerical modelling
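Of the three retention models compared, the van Genuchten relation between capillary pressure and wetting saturation is compact enough to sketch directly. The parameter values below are illustrative, not those used in the study:

```python
def van_genuchten_pc(sw, swr=0.05, snr=0.0, alpha=1e-4, n=2.0):
    """Capillary pressure [Pa] from wetting-phase saturation via the
    van Genuchten retention model. swr/snr are residual saturations,
    alpha [1/Pa] and n are fitting parameters (values are assumptions)."""
    m = 1.0 - 1.0 / n
    # effective (reduced) saturation
    se = (sw - swr) / (1.0 - swr - snr)
    if not 0.0 < se < 1.0:
        raise ValueError("effective saturation must lie in (0, 1)")
    # invert Se = (1 + (alpha*Pc)^n)^(-m) for Pc
    return (se ** (-1.0 / m) - 1.0) ** (1.0 / n) / alpha
```

Capillary pressure rises steeply as the wetting saturation approaches its residual value, which is exactly the regime where the choice of retention model matters for fingering onset.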
Procedia PDF Downloads 129
1568 Parametric Models of Facade Designs of High-Rise Residential Buildings
Authors: Yuchen Sharon Sung, Yingjui Tseng
Abstract:
High-rise residential buildings have become the most mainstream housing pattern in the world’s metropolises under the current trend of urbanization. The facades of high-rise buildings are essential elements of the urban landscape, and their skins are important media between the interior and exterior of high-rise buildings. They not only connect users and environments but also play an important functional and aesthetic role. This research studies the skins of high-rise residential buildings using the methodology of shape grammar to find the rules which determine the combinations of facade patterns, and analyzes the patterns’ parameters using the software Grasshopper. We chose a number of facades of high-rise residential buildings as sources to discover the underlying rules and concepts of the generation of facade skins. This research also provides the rules that influence the composition of facade skins. The items of the facade skins, such as windows, balconies, walls, sun visors and metal grilles, are treated as elements in the system of facade skins. The compositions of these elements are categorized and described by logical rules, and the types of high-rise building facade skins are modelled in Grasshopper. A variety of analyzed patterns can then be applied to other facade skins through this parametric mechanism. Using the patterns established in the models, researchers can analyze each single item in more detailed tests, and architects can apply each of these items to construct facades for other buildings through various combinations and permutations. The goal of these models is to develop a mechanism that generates prototypes in order to facilitate the generation of various facade skins.
Keywords: facade skin, grasshopper, high-rise residential building, shape grammar
Procedia PDF Downloads 507
1567 Analysis of Computer Science Papers Conducted by Board of Intermediate and Secondary Education at Secondary Level
Authors: Ameema Mahroof, Muhammad Saeed
Abstract:
The purpose of this study was to analyze the computer science papers conducted by the Board of Intermediate and Secondary Education with reference to Bloom’s taxonomy. The present study has two parts. First, the papers conducted by the Board of Intermediate and Secondary Education are analyzed on the basis of the basic rules of item construction, especially Bloom’s taxonomy (1956). Second, item analysis is done to improve the psychometric properties of a test. The sample included the computer science question papers of the higher secondary classes (XI-XII) for the years 2011 and 2012. For the item analysis, data were collected from 60 students through convenient sampling. Findings of the study revealed that in the papers by the Board of Intermediate and Secondary Education, the maximum focus was on the knowledge and understanding levels, and very little focus was on application, analysis, and synthesis. Furthermore, the item analysis of the question papers reveals that the item difficulty did not show a balanced paper: a few items were very difficult, while most of the items were too easy (measuring knowledge and understanding abilities). Likewise, most of the items did not truly discriminate between high and low achievers; four items were even negatively discriminating. The researchers also analyzed the items of the papers through the software ConQuest. These results show that the papers conducted by the Board of Intermediate and Secondary Education were not well constructed. It is recommended that paper setters be trained in developing question papers that can measure the various cognitive abilities of students, so that a good paper in computer science assesses all cognitive abilities of students.
Keywords: Bloom’s taxonomy, question paper, item analysis, cognitive domain, computer science
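The classical indices behind such an item analysis are the difficulty index (proportion of students answering correctly) and the discrimination index (upper-group minus lower-group proportion correct, conventionally using 27% groups). A minimal sketch, assuming dichotomously scored items:

```python
def item_statistics(scores):
    """scores: one list of 0/1 item scores per student.
    Returns a (difficulty, discrimination) tuple per item, using the
    classical upper/lower 27% groups ranked by total score."""
    n_items = len(scores[0])
    ranked = sorted(scores, key=sum, reverse=True)
    k = max(1, round(0.27 * len(ranked)))
    upper, lower = ranked[:k], ranked[-k:]
    stats = []
    for i in range(n_items):
        # difficulty: proportion of all students who got item i right
        p = sum(s[i] for s in scores) / len(scores)
        # discrimination: upper-group minus lower-group proportion correct
        d = (sum(s[i] for s in upper) - sum(s[i] for s in lower)) / k
        stats.append((p, d))
    return stats
```

An item everyone answers correctly has difficulty 1.0 and discrimination 0.0 (too easy); a negative discrimination, as found for four items in the study, means low scorers outperformed high scorers on that item.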
Procedia PDF Downloads 148
1566 Photoelastic Analysis and Finite Elements Analysis of a Stress Field Developed in a Double Edge Notched Specimen
Authors: A. Bilek, M. Beldi, T. Cherfi, S. Djebali, S. Larbi
Abstract:
Finite element analysis and photoelasticity are used to determine the stress field developed in a double edge notched specimen loaded in tension. The specimen is cut from a birefringent plate. Experimental isochromatic fringes are obtained with circularly polarized light on the analyzer of a regular polariscope. The fringes represent the loci of points of equal maximum shear stress. In order to obtain the stress values corresponding to the fringe orders recorded in the notched specimen, particularly in the neighborhood of the notches, a calibration disc made of the same material is loaded in compression along its diameter to determine the photoelastic fringe value. This fringe value is also used in the finite element solution in order to obtain the simulated photoelastic fringes, the isochromatics as well as the isoclinics. A color scale is used by the software to represent the simulated fringes over the whole model. The stress concentration factor can be readily obtained at the notches. Good agreement is obtained between the experimental and simulated fringe patterns, and between the graphs of the shear stress, particularly in the neighborhood of the notches. The purpose of this paper is to show that the isochromatic and isoclinic fringe patterns in a stressed model can be obtained rapidly and accurately by finite element analysis, as the experimental procedure can be time consuming. Stress fields can therefore be analyzed in three-dimensional models as long as the meshing and the boundary conditions are properly set in the program.
Keywords: isochromatic fringe, isoclinic fringe, photoelasticity, stress concentration factor
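The calibration step and the stress-optic conversion can be sketched with the standard relations: at the centre of a disc in diametral compression the principal stress difference is 8P/(πDh), so the material fringe value follows as f_σ = 8P/(πDN); in the specimen, τ_max = N·f_σ/(2h). The numbers in the test are illustrative, not the paper's:

```python
import math

def fringe_value_from_disc(load_n, diameter_m, fringe_order):
    """Material fringe value f_sigma [N/m per fringe] from a calibration
    disc in diametral compression: sigma1 - sigma2 = 8P/(pi*D*h) at the
    centre, combined with the stress-optic law (sigma1 - sigma2) = N*f/h."""
    return 8.0 * load_n / (math.pi * diameter_m * fringe_order)

def max_shear_stress(fringe_order, f_sigma, thickness_m):
    """Maximum in-plane shear stress [Pa] at a point of isochromatic
    fringe order N in a model of thickness h: tau_max = N*f_sigma/(2h)."""
    return fringe_order * f_sigma / (2.0 * thickness_m)
```

Note that the disc thickness cancels in the calibration, which is why the fringe value transfers directly to the notched specimen cut from the same plate.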
Procedia PDF Downloads 229
1565 Automated Natural Hazard Zonation System with Internet-SMS Warning: Distributed GIS for Sustainable Societies Creating Schema and Interface for Mapping and Communication
Authors: Devanjan Bhattacharya, Jitka Komarkova
Abstract:
The research describes the implementation of a novel, stand-alone system for dynamic hazard warning. The system uses existing infrastructure already in place, such as mobile networks and a laptop/PC, plus a small piece of installation software. The geospatial datasets are the maps of a region, which are likewise frugal. Hence there is no need for heavy investment, and the system reaches everyone with a mobile phone. A novel architecture for hazard assessment and warning is introduced, in which major ICT technologies are interfaced to give a unique WebGIS-based dynamic real-time geohazard warning communication system. This is a new kind of architecture for integrating WebGIS with telecommunication technology: existing technologies are interfaced in a novel architectural design to address a neglected domain through dynamically updatable WebGIS-based warning communication. The work publishes the new architecture and its novelty in addressing hazard warning techniques in a sustainable and user-friendly manner. The coupling of hazard zonation and hazard warning procedures into a single system is shown, and a generalized architecture for deciphering a range of geo-hazards has been developed. Hence the developmental work presented here can be summarized as: the development of an internet-SMS-based automated geo-hazard warning communication system; the integration of a warning communication system with a hazard evaluation system; the interfacing of different open-source technologies for the design and development of a warning system; the modularization of different technologies in the development of a warning communication system; and automated data creation, transformation and dissemination over different interfaces. The architecture of the developed warning system is functionally automated as well as generalized enough to be used for any hazard, and the setup requirement has been kept to a minimum.
Keywords: geospatial, web-based GIS, geohazard, warning system
Procedia PDF Downloads 407
1564 Urban Change Detection and Pattern Analysis Using Satellite Data
Authors: Shivani Jha, Klaus Baier, Rafiq Azzam, Ramakar Jha
Abstract:
In India, people generally migrate from rural to urban areas for better infrastructural facilities, a higher standard of living, good job opportunities and advanced transport/communication availability. In fact, unplanned urban development due to the migration of people causes serious damage to land use, water pollution and strain on the available water resources. In the present work, an attempt has been made to use satellite data of different years for urban change detection of the Chennai metropolitan city, along with pattern analysis to generate a future scenario of urban development using buffer zoning in a GIS environment. In the analysis, SRTM (30 m) elevation data and IRS-1C satellite data for the years 1990, 2000, and 2014 are used. The flow accumulation, aspect, flow direction and slope maps developed using the SRTM 30 m data are very useful for finding suitable urban locations for industrial setups and urban settlements. The Normalized Difference Vegetation Index (NDVI) and Principal Component Analysis (PCA) have been used in the ERDAS Imagine software for change detection in the land use of the Chennai metropolitan city. It has been observed that the urban area has increased exponentially, with a significant decrease in agricultural and barren lands. However, the water bodies located in the study region are protected and being used as freshwater for drinking purposes. Using buffer zone analysis in a GIS environment, it has been observed that development has taken place significantly in the south-west direction and will continue to do so in the future.
Keywords: urban change, satellite data, the Chennai metropolis, change detection
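The NDVI computation behind this kind of change detection is simple enough to sketch per pixel; a toy version over flat lists (raster software such as ERDAS Imagine applies the same formula band-wise over whole images):

```python
def ndvi(nir, red):
    """NDVI = (NIR - Red)/(NIR + Red), computed pixel-wise. Values near 1
    indicate healthy vegetation; values near 0 suggest built-up or bare
    land. Zero-sum pixels are mapped to 0.0 to avoid division by zero."""
    return [(n - r) / (n + r) if (n + r) else 0.0
            for n, r in zip(nir, red)]

def vegetation_loss_mask(ndvi_t1, ndvi_t2, threshold=0.2):
    """Flag pixels whose NDVI dropped by more than `threshold` between
    two dates: a simple proxy for vegetation converted to urban cover."""
    return [(a - b) > threshold for a, b in zip(ndvi_t1, ndvi_t2)]
```

Comparing such masks between 1990, 2000 and 2014 scenes is one way the exponential growth of the urban footprint shows up in the imagery.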
Procedia PDF Downloads 407
1563 A BIM-Based Approach to Assess COVID-19 Risk Management Regarding Indoor Air Ventilation and Pedestrian Dynamics
Authors: T. Delval, C. Sauvage, Q. Jullien, R. Viano, T. Diallo, B. Collignan, G. Picinbono
Abstract:
In the context of the international spread of COVID-19, the Centre Scientifique et Technique du Bâtiment (CSTB) has led joint research with the French Hauts-de-Seine departmental authorities to analyse the risk in school spaces according to their configuration, ventilation system and spatial segmentation strategy. This paper describes the main results of this joint research. A multidisciplinary team involving experts in the indoor air quality/ventilation, pedestrian movement and IT domains was established to develop a COVID risk analysis tool based on a Building Information Model. The work started with specific analyses of two pilot schools in order to provide the local administration with specifications to minimize the spread of the virus. Different recommendations were published to optimize/validate the use of ventilation systems and the strategy of student occupancy and student flow segmentation within the building. This COVID expertise has been digitized in order to allow a quick risk analysis of an entire building that can be used by the public administration through an easy user interface implemented in free BIM management software. One of the most interesting results is the ability to dynamically compare different ventilation system scenarios and space occupation strategies inside the BIM model. This concurrent engineering approach provides users with the optimal solution according to both the ventilation and pedestrian flow expertise.
Keywords: BIM, knowledge management, system expert, risk management, indoor ventilation, pedestrian movement, integrated design
Procedia PDF Downloads 106
1562 Flood Hazard Impact Based on Simulation Model of Potential Flood Inundation in Lamong River, Gresik Regency
Authors: Yunita Ratih Wijayanti, Dwi Rahmawati, Turniningtyas Ayu Rahmawati
Abstract:
Gresik is one of the districts in East Java Province, Indonesia. Gresik Regency has three major rivers, namely the Bengawan Solo, Brantas, and Lamong rivers, the Lamong being a tributary of the Bengawan Solo. Flood disasters in Gresik Regency are often caused by the overflow of the Lamong River. The losses caused by these floods are very large and certainly detrimental to the affected people. Therefore, to minimize the impact of flooding, preventive action must be taken; but before doing so, information is needed on potential inundation areas and water levels at various points. For this reason, a flood simulation model is required. In this study, the simulation was carried out using the Geographic Information System (GIS) method with the help of the Global Mapper software. The approach used in this simulation is topographical, based on Digital Elevation Model (DEM) data, which has been widely used in hydrological research. The results obtained from this flood simulation are the distribution of flood inundation and the water level. The location of the inundation serves to determine the extent of flooding, referring to the 50-100 year design flood, while the water level serves to provide early warning information. Both will be very useful for estimating the losses future floods would cause in Gresik Regency, so that the Gresik Regency Regional Disaster Management Agency can take precautions before a flood disaster strikes.
Keywords: flood hazard, simulation model, potential inundation, global mapper, Gresik Regency
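The core of a DEM-based inundation estimate is the per-cell flood depth for a given design water level. A minimal "bathtub-fill" sketch (an actual simulation, such as one built in Global Mapper, would also account for hydraulic connectivity between cells):

```python
def inundation(dem, water_level):
    """Per-cell flood depth [m] for a design water level: depth is
    water_level minus ground elevation where positive, else 0.
    `dem` is a 2D grid of elevations; values below are illustrative."""
    return [[max(0.0, water_level - z) for z in row] for row in dem]

def inundated_area(depths, cell_area_m2):
    """Total flooded area: count of wet cells times the cell footprint."""
    return sum(d > 0.0 for row in depths for d in row) * cell_area_m2
```

Running this for a range of design water levels (e.g. the 50-year and 100-year floods) yields the inundation extents and depths that feed the early-warning information described above.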
Procedia PDF Downloads 82
1561 Object Recognition System Operating from Different Type Vehicles Using Raspberry and OpenCV
Authors: Maria Pavlova
Abstract:
Nowadays, it is possible to mount a camera on different vehicles such as a quadcopter, train, airplane, etc. The camera can also be the input sensor of many different systems, which means that object recognition, as an inseparable part of monitoring and control, can be a key part of the most intelligent systems. The aim of this paper is to focus on the object recognition process during vehicle movement. During the vehicle’s movement, the camera takes pictures of the environment without storing them in a database. In case the camera detects a special object (for example, a human or an animal), the system saves the picture and sends it to the work station in real time. This functionality is very useful in emergency or security situations where it is necessary to find a specific object. In another application, the camera can be mounted at a crossroad with few pedestrians: if one or more persons approach the road, the traffic lights turn green and they can cross. This paper presents a system that solves the aforementioned problems. The architecture of the object recognition system includes the camera, the Raspberry platform, a GPS system, a neural network, software and a database. The camera in the system takes the pictures, and the object recognition is done in real time using the OpenCV library and a Raspberry microcontroller. An additional feature of this library is the ability to display the GPS coordinates of the captured object’s position. The results of these processes are sent to a remote station, so, in this case, we can know the location of the specific object. By means of a neural network, we can train the module to solve problems using incoming data and to be part of a bigger intelligent system. The present paper focuses on the design and integration of image recognition as a part of smart systems.
Keywords: camera, object recognition, OpenCV, Raspberry
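The detect-save-send loop described above can be sketched with stubs in place of the OpenCV detector and the radio link to the work station; the frame contents, labels, and GPS fixes below are fabricated for illustration:

```python
import queue

def detect_special_object(frame):
    # Stub standing in for an OpenCV detector (e.g. a trained classifier
    # run with cv2 on the Raspberry); here a frame is just a dict.
    return "person" in frame.get("labels", [])

def monitor(frames, gps_fixes, outbox: queue.Queue):
    """Scan frames without storing them; only frames containing a
    special object are saved and queued for the remote station,
    tagged with the GPS fix at capture time."""
    for frame, fix in zip(frames, gps_fixes):
        if detect_special_object(frame):
            outbox.put({"frame": frame["id"], "gps": fix})

# fabricated stream: only frame 2 contains a person
outbox = queue.Queue()
frames = [{"id": 1, "labels": []},
          {"id": 2, "labels": ["person"]},
          {"id": 3, "labels": ["car"]}]
fixes = [(41.0, 25.0), (41.1, 25.1), (41.2, 25.2)]
monitor(frames, fixes, outbox)
```

The queue decouples detection from transmission, so a slow uplink from the moving vehicle does not stall the real-time recognition loop.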
Procedia PDF Downloads 217
1560 Investigating the Effect of Brand Equity on Competitive Advantage in the Banking Industry
Authors: Rohollah Asadian Kohestani, Nazanin Sedghi
Abstract:
As the number of banks and financial institutions operating in Iran has significantly increased, attracting and retaining customers and encouraging them to continually use modern banking services have become important and vital issues. Therefore, serious competition cannot be sustained without a deep understanding of consumers and the fit of banking services to their needs in the current economic conditions of Iran. It should be noted that concepts such as ‘brand equity’ are defined from the consumer’s point of view; however, brand equity also matters to shareholders, competitors and other beneficiaries of a firm, in addition to the bank and its consumers. This study examines the impact of brand equity on competitive advantage in the banking industry, as intense competition between the brands of different banks leads them to pay more attention to their brands. This research is based on Aaker’s model, examining the impact of the four dimensions of brand equity on the competitive advantage of private banks in Behshahr city. Moreover, this applied research uses a descriptive method for data analysis. Data were collected through a literature review and a questionnaire. Simple random sampling was used to select bank staff, while consumers were sampled by distributing the questionnaire among the staff and consumers of five private banks: Tejarat, Mellat, Refah K., Ghavamin and Tose’e Ta’avon. Results show that there is a significant relationship between brand equity and competitive advantage. In this research, SPSS 16 and LISREL 8.5 software, as well as various descriptive and inferential statistical methods, were employed to analyze the data and test the hypotheses.
Keywords: brand awareness, brand loyalty, brand equity, competitive advantage
Procedia PDF Downloads 137
1559 Code Embedding for Software Vulnerability Discovery Based on Semantic Information
Authors: Joseph Gear, Yue Xu, Ernest Foo, Praveen Gauravaran, Zahra Jadidi, Leonie Simpson
Abstract:
Deep learning methods have seen increasing application to the long-standing security research goal of automatic vulnerability detection in source code. Attention, however, must still be paid to the task of producing vector representations of source code (code embeddings) as input for these deep learning models. Graphical representations of code, most prominently Abstract Syntax Trees and Code Property Graphs, have lately received some use in this task; however, for very large graphs representing very large code snippets, learning becomes prohibitively computationally expensive. This expense may be reduced by intelligently pruning the input to only vulnerability-relevant information, but little research in this area has been performed. Additionally, most existing work comprehends code based solely on the structure of the graph, at the expense of the information contained in its nodes. This paper proposes Semantic-enhanced Code Embedding for Vulnerability Discovery (SCEVD), a deep learning model which uses semantic-based feature selection for its vulnerability classification model. It uses information from the nodes as well as the structure of the code graph in order to select the features most indicative of the presence or absence of vulnerabilities. The model is implemented and experimentally tested on the SARD Juliet vulnerability test suite to determine its efficacy. It improves on existing code graph feature selection methods, as demonstrated by its improved ability to discover vulnerabilities.
Keywords: code representation, deep learning, source code semantics, vulnerability discovery
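As an illustrative aside, the node-level information the abstract refers to can be sketched in Python: each AST node type becomes one dimension of a feature vector. This is only a toy analogue of SCEVD’s graph-based features, not the authors’ implementation; the function name and code snippet are hypothetical.

```python
import ast
from collections import Counter

def node_type_features(source: str) -> Counter:
    """Count AST node types in a code snippet.

    A toy analogue of graph-based code features: each node type
    (FunctionDef, For, Call, ...) becomes one feature dimension.
    """
    tree = ast.parse(source)
    return Counter(type(node).__name__ for node in ast.walk(tree))

snippet = """
def copy_buf(dst, src, n):
    for i in range(n):
        dst[i] = src[i]
"""
features = node_type_features(snippet)
print(features["FunctionDef"], features["For"])  # 1 1
```

A real system would combine such node information with the graph structure; this sketch shows only the node side.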
Procedia PDF Downloads 155
1558 Evaluation of Virtual Reality for the Rehabilitation of Athlete Lower Limb Musculoskeletal Injury: A Method for Obtaining Practitioner’s Viewpoints through Observation and Interview
Authors: Hannah K. M. Tang, Muhammad Ateeq, Mark J. Lake, Badr Abdullah, Frederic A. Bezombes
Abstract:
Based on a theoretical assessment of the current literature, virtual reality (VR) could help to treat sporting injuries in a number of ways. However, it is important to obtain rehabilitation specialists’ perspectives in order to design, develop and validate suitable content for a VR application focused on treatment. Subsequently, a one-day observation and interview study on the use of VR for the treatment of lower limb musculoskeletal conditions in athletes was conducted with rehabilitation specialists at St George’s Park, the England National Football Centre. The current paper establishes the methods suitable for obtaining practitioners’ viewpoints through observation and interview in this context. Particular detail is given to the qualitative processing of interview results with the qualitative data analysis software NVivo in order to produce a narrative of overarching themes. The observations and overarching themes identified could serve as a framework and success criteria for a VR application developed in future research. In conclusion, this work explains the methods deemed suitable for obtaining practitioners’ viewpoints through observation and interview. This was required in order to highlight the characteristics and features of a VR application designed to treat lower limb musculoskeletal injury in athletes, and it can be built upon to direct future work.
Keywords: athletes, lower-limb musculoskeletal injury, rehabilitation, return-to-sport, virtual reality
Procedia PDF Downloads 255
1557 Organotin (IV) Based Complexes as Promiscuous Antibacterials: Synthesis in vitro, in Silico Pharmacokinetic, and Docking Studies
Authors: Wajid Rehman, Sirajul Haq, Bakhtiar Muhammad, Syed Fahad Hassan, Amin Badshah, Muhammad Waseem, Fazal Rahim, Obaid-Ur-Rahman Abid, Farzana Latif Ansari, Umer Rashid
Abstract:
Five novel triorganotin (IV) compounds have been synthesized and characterized. The tin atom is penta-coordinated, assuming a trigonal-bipyramidal geometry. Using in silico derived parameters, the objective of our study is to design and synthesize promiscuous antibacterials potent enough to combat resistance. Among the various synthesized organotin (IV) complexes, compound 5 was found to be a potent antibacterial agent against various bacterial strains. Further lead optimization of drug-like properties was evaluated through in silico predictions. Data mining and computational analysis were utilized to characterize the compounds’ promiscuity in order to avoid drug attrition in antibacterial design. Xanthine oxidase and human glucose-6-phosphatase were found to be the only true positive off-target hits by the ChEMBL database and others utilizing the similarity ensemble approach. Propensity towards the α-3 receptor, human macrophage migration factor and thiazolidinedione were found to be false positive off-targets, with E-value > 10⁻⁴ for compounds 1, 3, and 4. Further, a positive drug-drug interaction of compound 1 as a uricosuric was validated by all databases and docked protein targets with sequence similarity and compositional matrix alignment via the BLAST software. The promiscuity of compound 5 was further confirmed by in silico binding to different antibacterial targets.
Keywords: antibacterial activity, drug promiscuity, ADMET prediction, metallo-pharmaceutical, antimicrobial resistance
Procedia PDF Downloads 501
1556 Progressive Collapse of Cooling Towers
Authors: Esmaeil Asadzadeh, Mehtab Alam
Abstract:
Well-documented records of past structural failures reveal that progressive collapse is one of the major causes of dramatic human loss and economic consequences. Progressive collapse is the failure mechanism in which a structure fails gradually due to the sudden removal of structural elements. The sudden removal of some structural elements results in excessive redistributed loads on the others; it may be caused by any sudden loading resulting from a local explosion, impact loading, or terrorist attack. Hyperbolic thin-walled concrete shell structures, an important part of nuclear and thermal power plants, are always prone to such attacks. In concrete structures, gradual failure takes place through the generation of initial cracks and their propagation in the supporting columns and the tower shell, leading to the collapse of the entire structure. In this study, the mechanism of progressive collapse for such towers is simulated employing the finite element method. The aim is to provide clear, step-by-step conceptual descriptions of the various procedures for progressive collapse analysis using commercially available finite element structural analysis software, clear enough to be readily understood and used by practicing engineers. The study is carried out in the following steps: 1. explanation of the modeling, simulation and analysis procedures, including input screen snapshots; 2. interpretation of the results and discussion; 3. conclusions and recommendations.
Keywords: progressive collapse, cooling towers, finite element analysis, crack generation, reinforced concrete
Procedia PDF Downloads 479
1555 Optimization of Multi Commodities Consumer Supply Chain: Part 1-Modelling
Authors: Zeinab Haji Abolhasani, Romeo Marian, Lee Luong
Abstract:
This paper and its companions (Part II, Part III) concentrate on optimizing a class of supply chain problems known as the Multi-Commodities Consumer Supply Chain (MCCSC) problem. The MCCSC problem belongs to the production-distribution (P-D) planning category. It aims to determine facility locations, consumer allocation, and facility configurations so as to minimize the total cost (CT) of the entire network. These facilities can be, but are not limited to, manufacturer units (MUs), distribution centres (DCs), and retailers/end-users (REs). To address this problem, three major tasks are undertaken. First, a mixed-integer non-linear programming (MINLP) mathematical model is developed. Then, the system’s behaviour under different conditions is observed using a simulation modelling tool. Finally, the optimal solution (minimum CT) of the system is obtained using a multi-objective optimization technique. Due to the large size of the problem and the uncertainty in finding the optimal solution, an integration of modelling and simulation methodologies is proposed, followed by the development of a new approach known as GASG, a genetic algorithm based on granular simulation, which is the subject of the methodology of this research. In Part II, the MCCSC is simulated using a discrete-event simulation (DES) device within an integrated environment of SimEvents and Simulink of the MATLAB® software package, followed by a comprehensive case study to examine the given strategy. The effect of genetic operators on the optimal/near-optimal solution obtained by the simulation model is discussed in Part III.
Keywords: supply chain, genetic algorithm, optimization, simulation, discrete event system
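A minimal sketch of the genetic-algorithm idea behind GASG (without the granular-simulation component, which the paper itself develops) can be given in Python. The toy cost function standing in for CT, and all parameter values, are assumptions for illustration only.

```python
import random

def genetic_minimise(cost, bounds, pop_size=30, generations=60, seed=42):
    """Minimal real-coded genetic algorithm: tournament selection,
    blend crossover, Gaussian mutation, and elitism on the best solution."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    best = min(pop, key=cost)
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            # tournament selection of two parents (best of 3 random picks)
            p1 = min(rng.sample(pop, 3), key=cost)
            p2 = min(rng.sample(pop, 3), key=cost)
            # blend crossover + Gaussian mutation, clipped to bounds
            child = []
            for (lo, hi), a, b in zip(bounds, p1, p2):
                g = a + rng.random() * (b - a)
                g += rng.gauss(0, 0.05 * (hi - lo))
                child.append(min(hi, max(lo, g)))
            new_pop.append(child)
        pop = new_pop
        best = min(pop + [best], key=cost)  # keep the best ever seen
    return best

# Toy "total cost": squared distance from an optimal facility location (2, -1)
cost = lambda x: (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2
sol = genetic_minimise(cost, [(-5, 5), (-5, 5)])
# sol lands close to the optimum (2, -1)
```

In the paper’s setting the cost evaluation would be a granular simulation of the whole network rather than a closed-form function; the GA loop itself is unchanged.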
Procedia PDF Downloads 315
1554 Contribution of Upper Body Kinematics on Tennis Serve Performance
Authors: Ikram Hussain, Fuzail Ahmad, Tawseef Ahmad Bhat
Abstract:
The tennis serve is characterized as one of the most prominent techniques determining the success of winning a point. The study aimed to explore the contributions of upper body kinematics to tennis performance during the Davis Cup (Oceania Group). Four Indian international tennis players who participated in the Davis Cup held at Indore, India were inducted as subjects, with mean age 27 ± 4.79 years, mean height 186 ± 6.03 cm, and mean weight 81.25 ± 7.41 kg. The tennis serve was divided into three phases: the preparatory phase, the force generation phase and the follow-through phase. The kinematic data were recorded with a high-speed Canon camcorder at a shutter speed of 1/2000 and a frame rate of 50 Hz, and analysed with motion analysis software. Descriptive statistics and the F-test were employed through SPSS version 17.0 for the kinematic parameters under study, computed at a 0.05 level of significance with 46 degrees of freedom. Mean, standard deviation and the correlation coefficient were also employed to find the relationship between the upper body kinematic parameters and performance. In the preparatory phase, the analysis revealed no significant difference among the kinematic parameters with respect to performance. However, in the force generation phase, wrist velocity (r = 0.47), torso velocity (r = -0.53) and racket velocity (r = 0.60), and in the follow-through phase, torso acceleration (r = 0.43) and elbow angle (r = -0.48), play a significant role in the performance of the tennis serve. Therefore, players should focus on the velocities of these segments when preparing for competition.
Keywords: Davis Cup, kinematics, motion analysis, tennis serve
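Correlation coefficients like those reported above are plain Pearson r values; a short Python sketch shows the computation. The paired measurements below are hypothetical, not the study’s data.

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired measurements: racket velocity (m/s) vs. serve speed (km/h)
racket_velocity = [18.2, 19.5, 20.1, 21.4, 22.0, 23.3]
serve_speed = [160, 168, 170, 178, 181, 190]
r = pearson_r(racket_velocity, serve_speed)
print(round(r, 2))  # 1.0 (the toy data are near-perfectly linear)
```

In practice such r values come with a significance test against the sample’s degrees of freedom, as the study does via SPSS.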
Procedia PDF Downloads 300
1553 Digital Repository as a Service: Enhancing Access and Preservation of Cultural Heritage Artefacts
Authors: Lefteris Tsipis, Demosthenes Vouyioukas, George Loumos, Antonis Kargas, Dimitris Varoutas
Abstract:
The employment of technology and digitization is crucial for cultural organizations to establish and sustain digital repositories for their cultural heritage artefacts. It is also essential for presenting cultural works and exhibits to a broader audience. Consequently, in this work, we propose a digital repository that functions as Software as a Service (SaaS), primarily promoting the safe storage, display, and sharing of cultural materials, enhancing accessibility, and fostering a deeper understanding and appreciation of cultural heritage. The proposed digital repository service is designed as a multitenant architecture, which enables organizations to expand their reach, enhance accessibility, foster collaboration, and ensure the preservation of their content. Specifically, this project aims to assist each cultural institution in organizing its digital cultural assets into collections and feeding other digital platforms, including educational, museum, pedagogical, and gaming platforms, through appropriate interfaces. The repository offers a cutting-edge and effective open-access laboratory solution: it allows organizations to influence their audiences by fostering cultural understanding and appreciation, and it facilitates connections between different digital repositories and national/European aggregators, promoting collaboration and information sharing. By embracing this solution, cultural institutions can benefit from shared resources and features provided by the platform, such as system updates, backup and recovery services, and data analytics tools.
Keywords: cultural technologies, gaming technologies, web sharing, digital repository
Procedia PDF Downloads 78
1552 Mudlogging, a Key Tool in Effective Well Delivery: A Case Study of Bisas Field Niger Delta, Nigeria
Authors: Segun Steven Bodunde
Abstract:
Mudlogging is the continuous analysis of rock cuttings and drilling fluids to ascertain the presence or absence of oil and gas in the formations penetrated by the drill bit. This research presents a case study of well BSS-99ST from ‘Bisas Field’, Niger Delta, with depth extending from 1950 m to 3640 m (measured depth). It focused on identifying the lithologies encountered at specified depth intervals, accurately delineating the targeted potential reservoir on the field, and preparing the lithology and master logs. Equipment such as a microscope, a fluoroscope, a spin drier and an oven, together with chemicals including hydrochloric acid, chloroethene and phenolphthalein, was used to check the cuttings for their calcareous nature, for oil shows and for the presence of cement, respectively. Gas analysis was performed using a gas chromatograph and a flame ionization detector connected to the total hydrocarbon analyzer (THA). Drilling parameter and gas concentration logs were used alongside the lithology log to predict and accurately delineate the targeted reservoir. The results showed continuous intercalation of sand and shale, with small quantities of siltstone at a depth of 2300 m. The lithology log was generated using the Log Plot software. The targeted reservoir was identified between 3478 m and 3510 m after inspection of the gas analysis, lithology log, electric logs and drilling parameters. Total gas of about 345 units and five alkane gas components were identified in this depth range. A comparative check with the gamma ray log from the well further confirmed the lithologic sequence and the accurate delineation of the targeted potential reservoir using mudlogging.
Keywords: mudlogging, chromatograph, drilling fluids, calcareous
Procedia PDF Downloads 148
1551 Effect of Fuel Injection Discharge Curve and Injection Pressure on Upgrading Power and Combustion Parameters in HD Diesel Engine with CFD Simulation
Authors: Saeed Chamehsara, Seyed Mostafa Mirsalim, Mehdi Tajdari
Abstract:
In this study, the simultaneous effect of the fuel injection discharge curve and injection pressure on upgrading the power of a heavy-duty diesel engine is discussed, based on simulation of the combustion process in the AVL FIRE software. The fuel injection discharge curve was changed from semi-triangular to rectangular, as is usual in common rail fuel injection systems. The injection pressure was varied with respect to the amount of injected fuel and the nozzle hole diameter, and was calculated by an empirical equation for heavy-duty diesel engines with common rail fuel injection systems. Power upgrades at injection pressures of 1000 and 2000 bar are discussed. At 1000 bar injection pressure, with 188 mg of injected fuel and a 3 mm nozzle hole diameter, compared with the baseline (a semi-triangular discharge curve with 139 mg of injected fuel and a 3 mm nozzle hole diameter), the power upgrade is about 19%, while no notable change was observed in cylinder pressure. Meanwhile, NOx and soot emissions decreased by about 30% and 6%, respectively. Compared with the baseline, at 2000 bar injection pressure, where the injected fuel and nozzle diameter are 196 mg and 2.6 mm respectively, the power upgrade is about 22%, cylinder pressure remains unchanged, and NOx and soot emissions are decreased by 36% and 20%, respectively.
Keywords: CFD simulation, HD diesel engine, upgrading power, injection pressure, fuel injection discharge curve, combustion process
Procedia PDF Downloads 521
1550 The Impact of Behavioral Factors on the Decision Making of Real Estate Investor of Pakistan
Authors: Khalid Bashir, Hammad Zahid
Abstract:
Most investors consider economic and financial information the most important input when making investment decisions. However, over the past two decades, behavioral aspects and behavioral biases have gained an important place in the decision-making process of an investor, and this study is built on that fact. Its purpose is to examine the impact of behavioral factors on the decision-making of individual real estate investors in Pakistan. Important behavioral factors such as overconfidence, anchoring, the gambler’s fallacy, home bias, loss aversion, regret aversion, mental accounting, herding and representativeness are used in this study to find their impact on the psychology of individual investors. The targeted population is the real estate investors of Pakistan, and a sample of 650 investors was selected on the basis of a convenience sampling technique. The data were collected through a questionnaire, with a response rate of 46.15%. Descriptive statistical techniques and SEM were used to analyze the data using statistical software. The results revealed that some behavioral factors have a significant impact on investors’ decision-making. Among all the behavioral biases, overconfidence, anchoring, the gambler’s fallacy, loss aversion and representativeness have a significant positive impact on the decision-making of the individual investor, while the remaining biases, namely home bias, regret aversion, mental accounting and herding, have less impact on the decision-making process.
Keywords: behavioral finance, anchoring, gambler’s fallacy, loss aversion
Procedia PDF Downloads 69
1549 Automatic Detection and Filtering of Negative Emotion-Bearing Contents from Social Media in Amharic Using Sentiment Analysis and Deep Learning Methods
Authors: Derejaw Lake Melie, Alemu Kumlachew Tegegne
Abstract:
The increasing prevalence of social media in Ethiopia has exacerbated societal challenges by fostering the proliferation of negative emotional posts and comments, and the illicit use of social media has deepened divisions among the population. Addressing these issues through the manual identification and aggregation of emotions from millions of users for swift decision-making poses significant challenges, particularly given the rapid growth of Amharic-language usage on social platforms. Consequently, there is a critical need for an intelligent system capable of automatically detecting and categorizing negative emotional content into social, religious, and political categories while also filtering out toxic online content. This paper leverages sentiment analysis techniques to automatically detect and filter negative emotional content from Amharic social media texts, employing a comparative study of deep learning algorithms. The study utilized a dataset of 29,962 comments collected from social media platforms using comment exporter software. Data pre-processing techniques were applied to enhance data quality, followed by the implementation of deep learning methods for training, testing, and evaluation. The results showed that the CNN, GRU, LSTM, and Bi-LSTM classification models achieved accuracies of 83%, 50%, 84%, and 86%, respectively. Among these models, Bi-LSTM demonstrated the highest accuracy, at 86%.
Keywords: negative emotion, emotion detection, social media filtering, sentiment analysis, deep learning
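The pre-processing stage that precedes models like Bi-LSTM typically turns each comment into a fixed-length sequence of integer token ids. The following Python sketch illustrates that step; the English example comments, function names, and parameter values are hypothetical stand-ins for the Amharic data.

```python
def build_vocab(texts, min_count=1):
    """Map each token to an integer id; 0 is reserved for padding,
    1 for out-of-vocabulary tokens."""
    counts = {}
    for text in texts:
        for tok in text.split():
            counts[tok] = counts.get(tok, 0) + 1
    vocab = {"<pad>": 0, "<unk>": 1}
    for tok, c in sorted(counts.items()):
        if c >= min_count:
            vocab[tok] = len(vocab)
    return vocab

def encode(text, vocab, max_len=8):
    """Encode a comment as token ids, then pad/truncate to max_len."""
    ids = [vocab.get(tok, 1) for tok in text.split()][:max_len]
    return ids + [0] * (max_len - len(ids))

comments = ["this post is hateful", "nice and helpful post"]
vocab = build_vocab(comments)
batch = [encode(c, vocab) for c in comments]
print(len(batch[0]), len(batch[1]))  # 8 8
```

The resulting fixed-length batches are what an embedding layer followed by a Bi-LSTM classifier would consume.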
Procedia PDF Downloads 21
1548 Additive Manufacturing’s Impact on Product Design and Development: An Industrial Case Study
Authors: Ahmed Abdelsalam, Daniel Roozbahani, Marjan Alizadeh, Heikki Handroos
Abstract:
The aim of this study was to redesign a pressing air nozzle with lower weight and improved efficiency, utilizing Selective Laser Melting (SLM) technology and Design for Additive Manufacturing (DfAM) methods. The original pressing air nozzle was modified in SolidWorks 3D CAD, and two design concepts were introduced following the DfAM approach, in which the air channels were amended. 3D models of the original pressing air nozzle and the introduced designs were created, and flow characteristic data were obtained using Ansys software. The CFD results for the original and the two proposed designs were extracted, compared, and analyzed to demonstrate the impact of design on the development of a more efficient pressing air nozzle produced by AM. Improved airflow was achieved by optimizing the nozzle’s internal channel in both design concepts, which yielded 30% and 50.6% lower pressure drops than the original design. Moreover, the proposed designs achieved weight reductions of 48.3% and 70.3% compared to the original design. A pressing air nozzle with enhanced productivity and lower weight was therefore generated using the DfAM-driven designs developed in this study. The main contribution of this study is to investigate the additional design possibilities that the SLM process opens up for modern parts. The approach presented here can be applied to almost any similar industrial application.
Keywords: additive manufacturing, design for additive manufacturing, design methods, product design, pressing air nozzle
Procedia PDF Downloads 162
1547 Copper Price Prediction Model for Various Economic Situations
Authors: Haidy S. Ghali, Engy Serag, A. Samer Ezeldin
Abstract:
Copper is an essential raw material in the construction industry. During 2021 and the first half of 2022, the global market suffered significant fluctuations in copper raw material prices in the aftermath of both the COVID-19 pandemic and the Russia-Ukraine war, which exposed consumers to unexpected financial risk. Therefore, this paper develops two ANN-LSTM price prediction models, using Python, that forecast the average monthly copper prices traded on the London Metal Exchange; the first is a multivariate model that forecasts the copper price one month ahead, and the second is a univariate model that predicts the copper prices of the upcoming three months. Historical data on average monthly London Metal Exchange copper prices were collected from January 2009 to July 2022, and potential external factors were identified and employed in the multivariate model. These factors lie under three main categories, including energy prices and economic indicators of the three major copper-exporting countries, depending on data availability. Before developing the LSTM models, the collected external parameters were analyzed against the copper prices using correlation and multicollinearity tests in the R software, and then further screened to select the parameters that influence copper prices. The two LSTM models were then developed, and the dataset was divided into training, validation, and testing sets. The results show that the performance of the 3-month prediction model is better than that of the 1-month prediction model, but both models can act as predicting tools for diverse economic situations.
Keywords: copper prices, prediction model, neural network, time series forecasting
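The supervised-learning setup described above, sliding a look-back window over the monthly series and splitting the samples chronologically into training, validation, and testing sets, can be sketched in plain Python. The price values, window length, and split ratios below are hypothetical, not the paper’s LME data or configuration.

```python
def make_windows(series, lookback=3, horizon=1):
    """Turn a price series into supervised (window, target) pairs:
    each sample uses `lookback` past months to predict `horizon` months ahead."""
    X, y = [], []
    for i in range(len(series) - lookback - horizon + 1):
        X.append(series[i:i + lookback])
        y.append(series[i + lookback + horizon - 1])
    return X, y

def chronological_split(X, y, train=0.7, val=0.15):
    """Split samples in time order (no shuffling) into train/val/test sets."""
    n = len(X)
    a, b = int(n * train), int(n * (train + val))
    return (X[:a], y[:a]), (X[a:b], y[a:b]), (X[b:], y[b:])

# Hypothetical monthly average copper prices (USD/tonne)
prices = [6340, 6501, 6305, 6010, 5822, 5940,
          6120, 6433, 6700, 6895, 7105, 7342]
X, y = make_windows(prices, lookback=3, horizon=1)
train, val, test = chronological_split(X, y)
print(len(X), len(train[0]), len(val[0]), len(test[0]))  # 9 6 1 2
```

Keeping the split chronological, rather than shuffled, is what lets the test set stand in for genuinely unseen future months.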
Procedia PDF Downloads 111