Search results for: deep vibro techniques
7129 AI-Powered Models for Real-Time Fraud Detection in Financial Transactions to Improve Financial Security
Authors: Shanshan Zhu, Mohammad Nasim
Abstract:
Financial fraud continues to be a major threat to financial institutions across the world, causing colossal monetary losses and undermining public trust. Fraud prevention techniques based on hard-coded rules have become ineffective against recently evolving fraud patterns. Against this background, the present study probes distinct methodologies that exploit emergent AI-driven techniques to strengthen fraud detection. We compare the performance of generative adversarial networks and graph neural networks with other popular techniques such as gradient boosting, random forests, and neural networks, and recommend integrating these state-of-the-art models into one robust, flexible, and intelligent system for real-time anomaly and fraud detection. To overcome the data challenge, we designed synthetic data and then conducted pattern recognition as well as unsupervised and supervised learning analyses on the transaction data to identify suspicious activities. Using actual financial statistics, we compare the accuracy, speed, and adaptability of our model against conventional models. The results illustrate a strong need to integrate state-of-the-art, AI-driven fraud detection solutions into frameworks relevant to the financial domain, and underline the urgency with which banks and related financial institutions must implement these advanced technologies to maintain a high level of security.
Keywords: AI-driven fraud detection, financial security, machine learning, anomaly detection, real-time fraud detection
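As a rough illustration of the abstract's pipeline (synthetic transactions, then unsupervised anomaly flagging), the sketch below uses a robust z-score detector as a lightweight stand-in for the GAN/GNN and gradient-boosted models named above; all data and thresholds are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic transactions: [amount, transactions-per-day] per row.
legit = np.column_stack([rng.normal(50, 15, 1000), rng.normal(20, 5, 1000)])
fraud = np.column_stack([rng.normal(400, 50, 20), rng.normal(2, 1, 20)])
X = np.vstack([legit, fraud])          # injected frauds are the last 20 rows

# Unsupervised anomaly score: robust z-score (median / MAD) per feature,
# taking the worst deviation across features for each transaction.
med = np.median(X, axis=0)
mad = np.median(np.abs(X - med), axis=0) + 1e-9
scores = np.max(np.abs(X - med) / mad, axis=1)

# Flag the most anomalous ~2% of transactions for review.
threshold = np.quantile(scores, 0.98)
flagged = scores > threshold
```

In a real system the score would come from a trained model; the flagging-by-quantile step would stay the same.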
Procedia PDF Downloads 42
7128 Hydrocarbons and Diamondiferous Structures Formation at Different Depths of the Earth's Crust
Authors: A. V. Harutyunyan
Abstract:
The investigation of rocks at high pressures and temperatures has revealed the intervals of changes in seismic wave velocities and density, as well as some processes taking place in the rocks. In serpentinized rocks, abrupt changes in seismic wave velocities and density have been recorded as a consequence of dehydration. Hydrogen-bearing components are released and combine with carbon-bearing components; as a result, hydrocarbons are formed, and the investigated samples are smelted. Geofluids and hydrocarbons then migrate into the upper horizons of the Earth's crust along the deep faults, where they differentiate and accumulate in the jointed rocks of the faults and in layers with collecting properties. Under the majority of hydrocarbon deposits, magmatic centers and deep faults are recorded at a certain depth. The investigation results on serpentinized rocks, together with numerous geological-geophysical factual data, suggest that hydrocarbons are mainly formed both in the offshore parts of the oceans and at different depths of the continental crust. Experiments have also shown that the dehydration of serpentinized rocks is accompanied by an explosion, with an instantaneous increase in pressure and temperature and smelting of the studied rocks. According to numerous publications, hydrocarbons and diamonds are formed in the upper part of the mantle, at depths of 200-400 km, and rise to the upper horizons of the Earth's crust through narrow channels as a consequence of geodynamic processes. However, the genesis of metamorphogenic diamonds, and of the diamonds found in lava streams formed within the Earth's crust, remains unclear. Since superhigh pressures and temperatures arise at dehydration, it is assumed that diamond crystals are formed from carbon-containing components present in the dehydration zone. It can also be assumed that, besides the explosion at dehydration, secondary explosions of the released hydrogen take place.
The process is naturally accompanied by seismic phenomena, causing earthquakes of different magnitudes on the surface. As for the diamondiferous kimberlites, it is well known that the majority of them are located within ancient shields and platforms and are not obligatorily connected with the deep faults. The kimberlites are formed at shallow locations of dehydrated masses in the Earth's crust. Kimberlites are younger than their ancient host rocks, which contain serpentinized basites and ultrabasites, relicts of the paleo-oceanic crust. Sometimes diamonds containing water and hydrocarbons are found, showing their simultaneous genesis. So, according to the new concept put forward here, geofluids, hydrocarbons and diamonds are formed simultaneously from serpentinized rocks as a consequence of their dehydration at different depths of the Earth's crust. Based on the proposed concept, we suggest discussing the following: the genesis of gigantic hydrocarbon deposits located in the offshore areas of oceans (North American, Gulf of Mexico, Cuanza-Cameroonian, East Brazilian, etc.) as well as in the continental parts of different mainlands (Canadian-Arctic, Caspian, East Siberian, etc.); and the genesis of metamorphogenic diamonds and of diamonds in lava streams (Guinea-Liberian, Kokchetav, Canadian, Kamchatka-Tolbachik, etc.).
Keywords: dehydration, diamonds, hydrocarbons, serpentinites
Procedia PDF Downloads 340
7127 Auteur 3D Filmmaking: From Hitchcock’s Protrusion Technique to Godard’s Immersion Aesthetic
Authors: Delia Enyedi
Abstract:
Throughout film history, the regular return of 3D cinema has been discussed in connection to crises caused by the advent of television or the competition of the Internet. In addition, the three waves of stereoscopic 3D (from 1952 up to 1983) and its current digital version have been blamed for adding a challenging technical distraction to the viewing experience. By discussing the films Dial M for Murder (1954) and Goodbye to Language (2014), the paper aims to analyze the response of recognized auteurs to the use of 3D techniques in filmmaking. For Alfred Hitchcock, the solution to attaining perceptual immersion paradoxically resided in restraining the signature effect of 3D, namely protrusion. In Jean-Luc Godard’s vision, 3D techniques allowed him to explore perceptual absorption by means of depth of field, for which he had long advocated as being central to cinema. Thus, both directors contribute to the foundation of an auteur aesthetic in 3D filmmaking.
Keywords: Alfred Hitchcock, authorship, 3D filmmaking, Jean-Luc Godard, perceptual absorption, perceptual immersion
Procedia PDF Downloads 290
7126 Forecasting Equity Premium Out-of-Sample with Sophisticated Regression Training Techniques
Authors: Jonathan Iworiso
Abstract:
Forecasting the equity premium out-of-sample is a major concern to researchers in finance and emerging markets. The quest for a superior model that can forecast the equity premium with significant economic gains has resulted in several controversies among scholars on the choice of variables and suitable techniques. This research focuses mainly on the application of Regression Training (RT) techniques to forecast the monthly equity premium out-of-sample recursively with an expanding window method. A broad category of sophisticated regression models involving model complexity was employed: the RT models Ridge, Forward-Backward (FOBA) Ridge, Least Absolute Shrinkage and Selection Operator (LASSO), Relaxed LASSO, Elastic Net, and Least Angle Regression were trained and used to forecast the equity premium out-of-sample. The empirical investigation of the RT models demonstrates significant evidence of equity premium predictability, both statistically and economically, relative to the benchmark historical average, delivering significant utility gains. The forecasts provide meaningful economic information on mean-variance portfolio investment for investors who are timing the market to earn future gains at minimal risk. Thus, the forecasting models appear to serve an investor who optimally reallocates a monthly portfolio between equities and risk-free treasury bills using equity premium forecasts at minimal risk.
Keywords: regression training, out-of-sample forecasts, expanding window, statistical predictability, economic significance, utility gains
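The recursive expanding-window protocol described above can be sketched as follows: refit on all data up to month t, forecast month t, and score against the historical-average benchmark via the out-of-sample R². This is a minimal sketch with a closed-form ridge fit and simulated data, not the paper's predictor set or model suite.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy monthly data: equity premium y_t linearly related to a predictor z_t
# that is observable when the forecast is made (purely illustrative numbers).
T = 240
z = rng.normal(0.0, 1.0, T)
y = 0.3 * z + rng.normal(0.0, 0.5, T)

def ridge_fit(X, y, lam=1.0):
    # Closed-form ridge estimate: (X'X + lam*I)^(-1) X'y
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

start = 60                              # initial estimation window (months)
preds, bench, actual = [], [], []
for t in range(start, T):               # expanding (recursive) window
    X_tr = np.column_stack([np.ones(t), z[:t]])
    w = ridge_fit(X_tr, y[:t])
    preds.append(w[0] + w[1] * z[t])    # RT-model forecast for month t
    bench.append(y[:t].mean())          # historical-average benchmark
    actual.append(y[t])

preds, bench, actual = map(np.asarray, (preds, bench, actual))
# Out-of-sample R^2 relative to the historical-average benchmark.
oos_r2 = 1.0 - np.sum((actual - preds) ** 2) / np.sum((actual - bench) ** 2)
```

A positive `oos_r2` is the statistical-predictability criterion the abstract refers to; economic value would additionally require the mean-variance portfolio exercise.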
Procedia PDF Downloads 107
7125 Determination of Weathering at Kilistra Ancient City by Using Non-Destructive Techniques, Central Anatolia, Turkey
Authors: İsmail İnce, Osman Günaydin, Fatma Özer
Abstract:
Stones used in the construction of historical structures are exposed to various direct or indirect atmospheric effects depending on climatic conditions, and building stones deteriorate partially or fully as a result of this exposure. Historic structures are important symbols of any cultural heritage; it is therefore important to protect and restore them. The aim of this study is to determine the weathering conditions at the ancient city of Kilistra, located southwest of Konya in Central Anatolia and built by carving into pyroclastic rocks during the Byzantine era. For this purpose, the petrographic and mechanical properties of the pyroclastic rocks were determined. In assessing the weathering of structures in the ancient city, in-situ non-destructive testing methods (i.e., Schmidt hammer rebound value and relative humidity measurement) were applied.
Keywords: cultural heritage, Kilistra ancient city, non-destructive techniques, weathering
Procedia PDF Downloads 360
7124 Neutral Heavy Scalar Searches via Standard Model Gauge Boson Decays at the Large Hadron Electron Collider with Multivariate Techniques
Authors: Luigi Delle Rose, Oliver Fischer, Ahmed Hammad
Abstract:
In this article, we study the prospects of the proposed Large Hadron electron Collider (LHeC) in the search for heavy neutral scalar particles. We consider a minimal model with one additional complex scalar singlet that interacts with the Standard Model (SM) via mixing with the Higgs doublet, giving rise to an SM-like Higgs boson and a heavy scalar particle. Both scalar particles are produced via vector boson fusion and can be tested via their decays into pairs of SM particles, analogously to the SM Higgs boson. Using multivariate techniques, we show that the LHeC is sensitive to heavy scalars with masses between 200 and 800 GeV down to scalar mixing of order 0.01.
Keywords: beyond the standard model, large hadron electron collider, multivariate analysis, scalar singlet
Procedia PDF Downloads 137
7123 Identifying the Faces of Colonialism: An Analysis of Gender Inequalities in Economic Participation in Pakistan through a Postcolonial Feminist Lens
Authors: Umbreen Salim, Anila Noor
Abstract:
This paper analyses the influences and faces of colonialism in women's participation in economic activity in postcolonial Pakistan through a postcolonial feminist economic lens. It is an attempt to probe the shifts in gender inequalities across three stages in the Indo-Pak subcontinent: pre-colonial, colonial, and postcolonial times. It begins with an inquiry into the pre-colonial period, since it is imperative to understand the situation and context before colonisation in order to assess the deviations associated with its onset. Hence, in order to trace gender inequalities, this paper first analyses the Mughal era (1526-1757) that existed before British colonisation; then the gender inequalities that existed during British colonisation (1857-1947), with the associated dynamics and changes in women's vulnerabilities to participate in the economy, are examined; this is followed by the postcolonial (1947 onwards) scenario of discrimination and oppression faced by women. As part of the research methodology, primary and secondary data analyses were done. Analysis of secondary data, including literary works and photographs, was carried out, followed by primary data collection using ethnographic approaches and participatory tools to understand the presence of coloniality and the gender inequalities embedded in the social structure through participants' real-life stories. The data are analysed using feminist postcolonial analysis. Intersectionality has been a key tool of analysis, as the paper delves into gender inequalities through the class and caste lens, briefly touching on religion. It is imperative to mention the significance of the study and, very importantly, its practical challenges, since a historical analysis of the 18th and 19th centuries is involved: most of the available historical work is produced by (a) men and (b) foreigners, mostly white authors.
Since the historical analysis is mostly by men, the gender analysis it presents misses many aspects of women's issues, and since the authors have been mostly white Europeans, it carries what Mohanty calls an 'under Western eyes' perspective. The edge of this paper, by contrast, is the authors' deep attachment, belongingness as lived reality, and work with women in Pakistan as postcolonial subjects: a better position from which to relate to the social reality and understand the phenomenon. The study produced some key results. Gender inequalities existed before colonisation, when women were the hidden wheel of a stable economy and remained completely invisible. During British colonisation, the vulnerabilities of women only increased, and their inferior status relative to men was further entrenched. Today, the postcolonial woman lives with the deep-rooted effects of coloniality, divided by class and by position within class, and she faces gender inequalities both within the household and in the market for economic participation. Gender inequalities have existed in pre-colonial, colonial, and postcolonial times in Pakistan with varying dynamics, degrees and intensities for women, whereby social class, caste and religion have been key factors defining the extent of discrimination and oppression. Colonialism may have physically ended, but coloniality remains and has deep, broad and wide effects in increasing gender inequalities in women's participation in the economy in Pakistan.
Keywords: colonialism, economic participation, gender inequalities, women
Procedia PDF Downloads 208
7122 Attack Redirection and Detection using Honeypots
Authors: Chowduru Ramachandra Sharma, Shatunjay Rawat
Abstract:
A false positive occurs when the IDS/IPS identifies an activity as an attack although the activity is acceptable behavior in the system. False positives in a Network Intrusion Detection System (NIDS) are an issue because they desensitize the administrator; when rules are not tuned properly, they waste computational power and valuable resources, which is the main issue with anomaly-based NIDS. Furthermore, most false-positive reduction techniques are not applied in real time to attempted intrusions; instead, they are applied afterward to collected traffic data to generate alerts. False-positive detection in 'offline mode' is certainly valuable, but there is room for improvement: automated techniques still need to reduce false positives in real time. This paper uses the Snort signature detection model to redirect the alerted attacks to honeypots and verify attacks.
Keywords: honeypot, TPOT, snort, NIDS, honeybird, iptables, netfilter, redirection, attack detection, docker, snare, tanner
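The redirect step (Snort alert in, netfilter rule out) can be sketched as below. The alert line, honeypot address and rule shape are illustrative assumptions, not the authors' configuration; a deployment would apply the generated rule via iptables rather than just build the string.

```python
import re

# Hypothetical Snort "fast" alert line (format abbreviated for illustration)
# and honeypot address; neither is taken from the paper.
alert = ('01/02-03:04:05.000000  [**] [1:1000001:1] ET SCAN Nmap probe [**] '
         '[Priority: 2] {TCP} 203.0.113.7:51515 -> 198.51.100.10:22')
HONEYPOT_IP = "10.0.0.99"

def redirect_rule(alert_line, honeypot_ip):
    """Parse protocol, source IP and destination port out of a Snort fast
    alert and emit an iptables DNAT rule that sends further traffic from
    that source to the honeypot instead of the production service."""
    m = re.search(r'\{(\w+)\}\s+([\d.]+):(\d+)\s+->\s+([\d.]+):(\d+)',
                  alert_line)
    if m is None:
        return None
    proto, src_ip, dst_port = m.group(1).lower(), m.group(2), m.group(5)
    return (f"iptables -t nat -A PREROUTING -p {proto} -s {src_ip} "
            f"--dport {dst_port} -j DNAT "
            f"--to-destination {honeypot_ip}:{dst_port}")

rule = redirect_rule(alert, HONEYPOT_IP)
```

Because the honeypot (e.g. SNARE/TANNER from the keywords) then interacts with the redirected source, the alert is verified against real attacker behavior instead of being dismissed as a possible false positive.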
Procedia PDF Downloads 156
7121 Study on Control Techniques for Adaptive Impact Mitigation
Authors: Rami Faraj, Cezary Graczykowski, Błażej Popławski, Grzegorz Mikułowski, Rafał Wiszowaty
Abstract:
Progress in the field of sensors, electronics and computing results in more and more frequent applications of adaptive techniques for dynamic response mitigation. When it comes to systems excited by mechanical impacts, the control system has to take into account the significant limitations of the actuators responsible for system adaptation. The paper provides a comprehensive discussion of the problem of appropriate design and implementation of adaptation techniques and mechanisms. Two case studies are presented in order to compare completely different adaptation schemes. The first example concerns a double-chamber pneumatic shock absorber with a fast piezoelectric valve and parameters corresponding to the suspension of a small unmanned aerial vehicle, whereas the second considered system is a safety air cushion applied for the evacuation of people from heights during a fire. For both systems it is possible to ensure adaptive performance, but the realization of the system's adaptation is completely different. The reason for this lies in the technical limitations of the specific types of shock-absorbing devices and their parameters. Impact mitigation using a pneumatic shock absorber involves much higher pressures and small mass flow rates, which can be achieved with minimal changes of valve opening. In turn, mass flow rates in safety air cushions relate to gas release areas counted in thousands of square centimeters. Because of these facts, the two shock-absorbing systems are controlled using completely different approaches. The pneumatic shock absorber takes advantage of real-time control, with the valve opening recalculated at least every millisecond. In contrast, the safety air cushion is controlled using a semi-passive technique, where adaptation is provided using a prediction of the entire impact mitigation process. Similarities of both approaches, including the applied models, algorithms and equipment, are discussed.
The entire study is supported by numerical simulations and experimental tests, which prove the effectiveness of both adaptive impact mitigation techniques.
Keywords: adaptive control, adaptive system, impact mitigation, pneumatic system, shock-absorber
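The real-time scheme (valve opening recalculated every millisecond) can be caricatured in a discrete-time simulation. Everything here is a hypothetical toy model with made-up parameters, not the authors' absorber dynamics: a proportional law opens the valve when deceleration overshoots a target and closes it otherwise.

```python
# Minimal discrete-time sketch of the real-time approach: a mass impacting
# a pneumatic absorber whose damping depends on valve opening, with the
# opening recalculated every millisecond to track a target deceleration.
# All parameters are hypothetical, not taken from the study.

m, v, x = 5.0, 8.0, 0.0        # mass [kg], impact velocity [m/s], stroke [m]
dt = 0.001                     # 1 ms control / integration step
a_target = 60.0                # desired deceleration magnitude [m/s^2]
c_max = 600.0                  # damping with the valve fully closed [N*s/m]
u = 0.5                        # valve opening in [0, 1] (0 = closed)
gain = 0.002                   # proportional controller gain

for _ in range(2000):
    c = c_max * (1.0 - u)      # wider opening -> lower damping
    a = -(c * v) / m           # deceleration produced by the damper
    u += gain * (abs(a) - a_target)   # open valve if decelerating too hard
    u = min(1.0, max(0.0, u))         # actuator limits
    v = max(0.0, v + a * dt)          # impacting mass slows down
    x += v * dt                       # stroke used so far
```

The semi-passive air-cushion case would instead compute one valve schedule up front from a prediction of the whole impact, with no per-step feedback.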
Procedia PDF Downloads 90
7120 Healthcare-SignNet: Advanced Video Classification for Medical Sign Language Recognition Using CNN and RNN Models
Authors: Chithra A. V., Somoshree Datta, Sandeep Nithyanandan
Abstract:
Sign Language Recognition (SLR) is the process of interpreting and translating sign language into spoken or written language using technological systems. It involves recognizing the hand gestures, facial expressions, and body movements that make up sign language communication. The primary goal of SLR is to facilitate communication between the hearing- and speech-impaired communities and those who do not understand sign language. Due to increased awareness and greater recognition of the rights and needs of the hearing- and speech-impaired community, sign language recognition has gained significant importance over the past 10 years. Technological advancements in the fields of Artificial Intelligence and Machine Learning have made it more practical and feasible to create accurate SLR systems. This paper presents a distinct approach to SLR by framing it as a video classification problem using Deep Learning (DL), whereby a combination of Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs) has been used. This research targets the integration of sign language recognition into healthcare settings, aiming to improve communication between medical professionals and patients with hearing impairments. The spatial features of each video frame are extracted using a CNN, which captures essential elements such as hand shapes, movements, and facial expressions. These features are then fed into an RNN that learns the temporal dependencies and patterns inherent in sign language sequences. The INCLUDE dataset has been enhanced with additional videos from the healthcare domain, and the model is evaluated on it. Our model achieves 91% accuracy, representing state-of-the-art performance in this domain. The results highlight the effectiveness of treating SLR as a video classification task with the CNN-RNN architecture.
This approach not only improves recognition accuracy but also offers a scalable solution for real-time SLR applications, significantly advancing the field of accessible communication technologies.
Keywords: sign language recognition, deep learning, convolutional neural network, recurrent neural network
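The CNN-then-RNN data flow described above (per-frame spatial features, then temporal aggregation, then classification) can be sketched without any deep-learning library. In this stand-in, fixed random filters with ReLU and global pooling play the role of the trained CNN, and a vanilla RNN plays the role of the paper's recurrent stage; shapes and class count are invented.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

rng = np.random.default_rng(0)

# One clip: 16 frames of 32x32 grayscale video (random stand-in data).
frames = rng.normal(size=(16, 32, 32))

F, H, C = 8, 16, 10                      # feature, hidden and class sizes
filters = rng.normal(size=(F, 5, 5))     # fixed random "CNN" filters
Wx = rng.normal(scale=0.1, size=(H, F))  # RNN input weights
Wh = rng.normal(scale=0.1, size=(H, H))  # RNN recurrent weights
Wo = rng.normal(scale=0.1, size=(C, H))  # classifier weights

def cnn_features(frame):
    """Spatial stage: convolve with fixed filters, apply ReLU, then global
    average pooling, yielding one feature vector per frame."""
    patches = sliding_window_view(frame, (5, 5))                  # (28,28,5,5)
    maps = np.tensordot(patches, filters, axes=([2, 3], [1, 2]))  # (28,28,F)
    return np.maximum(maps, 0.0).mean(axis=(0, 1))                # (F,)

# Temporal stage: a vanilla RNN consumes the per-frame features in order.
h = np.zeros(H)
for frame in frames:
    h = np.tanh(Wx @ cnn_features(frame) + Wh @ h)

logits = Wo @ h
probs = np.exp(logits - logits.max())
probs /= probs.sum()                      # softmax over the C sign classes
```

In the trained system both stages are learned end-to-end; only the shape of the computation is shown here.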
Procedia PDF Downloads 27
7119 Optimization Techniques for Microwave Structures
Authors: Malika Ourabia
Abstract:
A new and efficient method is presented for the analysis of arbitrarily shaped discontinuities. The discontinuities are characterized using a hybrid spectral/numerical technique. The structure presents an arbitrary number of ports, each with a different orientation and dimensions. This article presents a hybrid method based on multimode contour integral and mode matching techniques. The process is based on segmentation, dividing the structure into key building blocks. We use the multimode contour integral method to analyze the blocks, including those with irregularly shaped discontinuities. Finally, the multimode scattering matrix of the whole structure can be found by cascading the blocks. The new method is therefore suitable for the analysis of a wide range of waveguide problems and can be applied easily to any multiport junctions and cascaded blocks. The accuracy of the method is validated by comparison with results for several complex problems found in the literature. CPU times are also included to show the efficiency of the new method.
Keywords: segmentation, S-parameters, simulation, optimization
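The final cascading step can be illustrated for the simplest case of single-mode two-port blocks, where the standard cascade formulas (a special case of the Redheffer star product used for multimode matrices) combine two scattering matrices; the numerical block values below are invented.

```python
import numpy as np

def cascade(Sa, Sb):
    """Cascade two 2-port scattering matrices (port 2 of block A feeding
    port 1 of block B) using the standard cascade formulas."""
    d = 1.0 - Sa[1, 1] * Sb[0, 0]                     # multiple reflections
    return np.array([
        [Sa[0, 0] + Sa[0, 1] * Sb[0, 0] * Sa[1, 0] / d,
         Sa[0, 1] * Sb[0, 1] / d],
        [Sa[1, 0] * Sb[1, 0] / d,
         Sb[1, 1] + Sb[1, 0] * Sa[1, 1] * Sb[0, 1] / d],
    ], dtype=complex)

# A matched through-line acts as the identity element of cascading.
thru = np.array([[0.0, 1.0], [1.0, 0.0]], dtype=complex)
block = np.array([[0.2 + 0.1j, 0.9], [0.9, -0.3j]])   # illustrative block
chain = cascade(cascade(thru, block), thru)           # thru - block - thru
```

In the multimode setting each entry becomes a submatrix over the modes, and the scalar division becomes a matrix inverse, but the structure of the computation is the same.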
Procedia PDF Downloads 528
7118 Quantitative Elemental Analysis of Cyperus rotundus Medicinal Plant by Particle Induced X-Ray Emission and ICP-MS Techniques
Authors: J. Chandrasekhar Rao, B. G. Naidu, G. J. Naga Raju, P. Sarita
Abstract:
Particle Induced X-ray Emission (PIXE) and Inductively Coupled Plasma Mass Spectrometry (ICP-MS) techniques have been employed in this work to determine the elements present in the root of the Cyperus rotundus medicinal plant, used in the treatment of rheumatoid arthritis. The elements V, Cr, Mn, Fe, Co, Ni, Cu, Zn, Rb, and Sr were commonly identified and quantified by both PIXE and ICP-MS, whereas the elements Li, Be, Al, As, Se, Ag, Cd, Ba, Tl, Pb and U were determined by ICP-MS and Cl, K, Ca, Ti and Br were determined by PIXE. The regional variation of elemental content has also been studied by analyzing the same plant collected from different geographical locations. Information on the elemental content of the medicinal plant would be helpful in correlating its efficacy in the treatment of rheumatoid arthritis and also in deciding the dosage of this herbal medicine from the metal-toxicity point of view. Principal component analysis and cluster analysis were also applied to the data matrix to understand the correlations among the elements.
Keywords: PIXE, ICP-MS, elements, Cyperus rotundus, rheumatoid arthritis
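The multivariate step (PCA plus clustering on a samples-by-elements matrix) can be sketched as below. The concentration matrix is simulated, and a simple correlation grouping stands in for the cluster analysis; none of the numbers come from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical concentration matrix: 60 samples (rows) x 6 elements (cols),
# built as two correlated groups to mimic elements that co-occur.
b1 = rng.normal(size=(60, 1))
b2 = rng.normal(size=(60, 1))
X = np.hstack([b1 + 0.1 * rng.normal(size=(60, 3)),   # group 1 (e.g. Fe, Mn, Cr)
               b2 + 0.1 * rng.normal(size=(60, 3))])  # group 2 (e.g. K, Ca, Sr)

# Principal component analysis via SVD of the standardized matrix.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)          # variance ratio per component
scores = Z @ Vt.T                        # sample coordinates on the PCs

# Crude clustering proxy: elements whose profiles track element 0
# form one cluster; the rest form the other.
corr = np.corrcoef(Z, rowvar=False)
group = np.abs(corr[0]) > 0.6
```

Two components recover almost all the variance here because the elements were generated in two co-varying groups, which is the kind of structure PCA and clustering are meant to reveal in the real element data.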
Procedia PDF Downloads 333
7117 Reimagining Landscapes: Psychological Responses and Behavioral Shifts in the Aftermath of the Lytton Creek Fire
Authors: Tugba Altin
Abstract:
In an era where the impacts of climate change resonate more pronouncedly than ever, communities globally grapple with events bearing both tangible and intangible ramifications. Situating this within the evolving landscapes of Psychological and Behavioral Sciences, this research probes the profound psychological and behavioral responses evoked by such events. The Lytton Creek Fire of 2021 epitomizes these challenges. While tangible destruction is immediate and evident, the intangible repercussions—emotional distress, disintegration of cultural landscapes, and disruptions in place attachment (PA)—require meticulous exploration. PA, emblematic of the emotional and cognitive affiliations individuals nurture with their environments, emerges as a cornerstone for comprehending how environmental cataclysms influence cultural identity and bonds to land. This study, harmonizing the core tenets of an interpretive phenomenological approach with a hermeneutic framework, underscores the pivotal nature of this attachment. It delves deep into the realm of individuals' experiences post the Lytton Creek Fire, unraveling the intricate dynamics of PA amidst such calamity. The study's methodology deviates from conventional paradigms. Instead of traditional interview techniques, it employs walking audio sessions and photo elicitation methods, granting participants the agency to immerse, re-experience, and vocalize their sentiments in real-time. Such techniques shed light on spatial narratives post-trauma and capture the otherwise elusive emotional nuances, offering a visually rich representation of place-based experiences. Central to this research is the voice of the affected populace, whose lived experiences and testimonies form the nucleus of the inquiry. As they renegotiate their bonds with transformed environments, their narratives reveal the indispensable role of cultural landscapes in forging place-based identities. 
Such revelations accentuate the necessity of integrating both tangible and intangible trauma facets into community recovery strategies, ensuring they resonate more profoundly with affected individuals. Bridging the domains of environmental psychology and behavioral sciences, this research stresses the intertwined nature of tangible restoration with the imperative of emotional and cultural recuperation after environmental disasters. It advocates for adaptation initiatives that are rooted in the lived realities of the affected, emphasizing a holistic approach that recognizes the profundity of human connections to landscapes, and for the interdisciplinary exchange of ideas and strategies in addressing post-disaster community recovery. It not only enriches the climate change discourse by emphasizing the human facets of disasters but also reiterates the significance of an interdisciplinary approach, encompassing psychological and behavioral nuances, for fostering a comprehensive understanding of climate-induced traumas. Such a perspective is indispensable for shaping more informed, empathetic, and effective adaptation strategies.
Keywords: place attachment, community recovery, disaster response, restorative landscapes, sensory response, visual methodologies
Procedia PDF Downloads 59
7116 Automatic Adult Age Estimation Using Deep Learning of the ResNeXt Model Based on CT Reconstruction Images of the Costal Cartilage
Authors: Ting Lu, Ya-Ru Diao, Fei Fan, Ye Xue, Lei Shi, Xian-e Tang, Meng-jun Zhan, Zhen-hua Deng
Abstract:
Accurate adult age estimation (AAE) is a significant and challenging task in the forensic and archaeological fields. Attempts have been made to explore optimal adult age metrics, and the rib is considered a potential age marker. The traditional approach is to extract age-related features designed by experts from macroscopic or radiological images, followed by classification or regression analysis. Those results still have not met the high-level requirements of practice, and the limitation of feature-design and manual-extraction methods is loss of information, since the features are likely not designed explicitly for extracting information relevant to age. Deep learning (DL) has recently garnered much interest in image learning and computer vision; it enables learning features that are important without a prior bias or hypothesis and could be supportive of AAE. This study aimed to develop DL models for AAE based on CT images and compare their performance to the manual visual scoring method. Chest CT data were reconstructed using volume rendering (VR). Retrospective data of 2500 patients aged 20.00-69.99 years were obtained between December 2019 and September 2021. Five-fold cross-validation was performed, and datasets were randomly split into training and validation sets in a 4:1 ratio for each fold. Before feeding the inputs into the networks, all images were augmented with random rotation and vertical flip, normalized, and resized to 224×224 pixels. ResNeXt was chosen as the DL baseline due to its higher efficiency and accuracy in image classification. Mean absolute error (MAE) was the primary parameter. Independent data from 100 patients acquired between March and April 2022 were used as a test set. The manual method completely followed the prior study, which reported the lowest MAEs (5.31 in males and 6.72 in females) among similar studies; its CT data and VR images were used.
The radiation density of the first costal cartilage was recorded using CT data on the workstation. The osseous and calcified projections of the first to seventh costal cartilages were scored based on VR images using an eight-stage staging technique. Following the prior study, the optimal models were the decision-tree regression model in males and the stepwise multiple linear regression equation in females. Predicted ages of the test set were calculated separately using different models by sex. A total of 2600 patients (training and validation sets, mean age = 45.19 years ± 14.20 [SD]; test set, mean age = 46.57 ± 9.66) were evaluated in this study. In ResNeXt model training, MAEs of 3.95 in males and 3.65 in females were obtained. On the test set, DL achieved MAEs of 4.05 in males and 4.54 in females, far better than the MAEs of 8.90 and 6.42, respectively, for the manual method. These results show that DL with the ResNeXt model outperformed the manual method in AAE based on CT reconstructions of the costal cartilage, and the developed system may be a supportive tool for AAE.
Keywords: forensic anthropology, age determination by the skeleton, costal cartilage, CT, deep learning
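The evaluation protocol above (random five-fold split with a 4:1 train/validation ratio, scored by MAE) can be sketched independently of the network itself. Ages and predictions below are simulated placeholders, not the study's data or model outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

def kfold_indices(n, k=5, seed=0):
    """Yield (train, validation) index arrays for k random folds; each
    fold serves once as validation, giving the 4:1 split of the study."""
    idx = np.random.default_rng(seed).permutation(n)
    folds = np.array_split(idx, k)
    for i in range(k):
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train, folds[i]

# Hypothetical chronological ages and model predictions for 2500 patients.
ages = rng.uniform(20.0, 70.0, 2500)
pred = ages + rng.normal(0.0, 4.0, 2500)   # stand-in network output

# Per-fold mean absolute error, the study's primary parameter.
maes = [np.mean(np.abs(ages[val] - pred[val]))
        for _, val in kfold_indices(len(ages))]
```

In the actual study the model is retrained on each fold's training indices before predicting its validation indices; here only the split-and-score machinery is shown.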
Procedia PDF Downloads 73
7115 A Review on Modeling and Optimization of Integration of Renewable Energy Resources (RER) for Minimum Energy Cost, Minimum CO₂ Emissions and Sustainable Development, in Recent Years
Authors: M. M. Wagh, V. V. Kulkarni
Abstract:
The rising economic activity, growing population and improving living standards of the world have led to a steady growth in its appetite for quality and quantity of energy services. As the economy expands, electricity demand will grow further, increasing the challenges of additional generation and the stresses on utility grids. An appropriate energy model will help in the proper utilization of locally available renewable energy sources such as solar, wind, biomass and small hydro for integration into the available grid, reducing investments in energy infrastructure. Furthermore, new approaches such as smart grids, decentralized energy planning, energy management practices and energy efficiency are emerging. In this paper, an attempt has been made to study and review recent energy planning models, energy forecasting models, and renewable energy integration models. In addition, various modeling techniques and tools are reviewed and discussed.
Keywords: energy modeling, integration of renewable energy, energy modeling tools, energy modeling techniques
Procedia PDF Downloads 345
7114 Searching for the ‘Why’ of Gendered News: Journalism Practices and Societal Contexts
Authors: R. Simões, M. Silveirinha
Abstract:
Driven by the need to understand the results of previous research, which clearly shows deep imbalances in media discourses about women and men in spite of the growing numbers of female journalists, our paper aims to progress from the 'what' to the 'why' of these unbalanced representations. Furthermore, it does so at a time when journalism is undergoing dramatic change in professional practices and in how media organizations are organized and run, affecting women in particular. While some feminist research points to the fact that female and male journalists evaluate the role of the news and production methods in similar ways, feminist theorizing also suggests that thought and knowledge are highly influenced by social identity, which is itself inherently affected by the experiences of gender. This is particularly important at a time of deep societal and professional change. While there are persuasive discussions of gender identities at work in newsrooms in various countries, studies on the issue will benefit from cases that focus on the particularities of local contexts. In our paper, we present one such case: the case of Portugal, a country hit hard by austerity measures that have affected all cultural industries, including journalism organizations already feeling the broader impacts of the larger societal changes in the media landscape. Can we gender these changes? How are they felt and understood by female and male journalists? And how are these discourses framed by androcentric, feminist and post-feminist sensibilities? Foregrounding questions of gender, our paper seeks to explore some of the interactions of societal and professional forces, identifying their gendered character and outlining how they shape journalism work in general and the production of unbalanced gender representations in particular.
We do so grounded in feminist studies of journalism as well as feminist organizational and work studies, looking at a corpus of 20 in-depth interviews with female and male Portuguese journalists. The research findings illustrate how gender in journalism practices interacts with broader experiences of the cultural and economic contexts and show the ambivalences of these interactions in news organizations.
Keywords: gender, journalism, newsroom culture, Portuguese journalists
Procedia PDF Downloads 399
7113 Load Balancing Technique for Energy Efficiency in Cloud Computing
Authors: Rani Danavath, V. B. Narsimha
Abstract:
Cloud computing is emerging as a new paradigm of large-scale distributed computing. It is a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model is composed of five essential characteristics, three service models, and four deployment models. Load balancing is one of the main challenges in cloud computing: it is required to distribute the dynamic workload across multiple nodes to ensure that no single node is overloaded. It helps in the optimal utilization of resources, enhancing the performance of the system. The goal of load balancing is to minimize resource consumption and the carbon emission rate, which is a direct need of cloud computing. This determines the need for new metrics, energy consumption and carbon emission, in energy-efficient load balancing techniques for cloud computing. Existing load balancing techniques mainly focus on reducing overhead, service response time, and improving performance, but none of them have considered energy consumption and carbon emission. In this paper, we therefore introduce a technique oriented towards energy efficiency. This energy-efficient load balancing technique can be used to improve the performance of cloud computing by balancing the workload across all the nodes in the cloud with minimum resource utilization, in turn reducing energy consumption and carbon emission to an extent, which will help to achieve green computing.
Keywords: cloud computing, distributed computing, energy efficiency, green computing, load balancing, energy consumption, carbon emission
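The abstract does not give the authors' algorithm, but the idea of energy-aware load balancing can be sketched as a greedy placement that sends each task to the node with the lowest projected energy cost. The node names, capacities, and linear power model below are illustrative assumptions, not values from the paper.

```python
# Hedged sketch of energy-aware load balancing: assign each task to the
# node whose projected power draw after placement is lowest. Node
# parameters (capacity, idle/peak watts) are hypothetical.

def assign_tasks(tasks, nodes):
    """Greedily place tasks (CPU demand units) on nodes; returns
    {node_name: [tasks]} while respecting each node's capacity."""
    placement = {n["name"]: [] for n in nodes}
    load = {n["name"]: 0.0 for n in nodes}
    for t in sorted(tasks, reverse=True):       # largest tasks first
        def cost(n):
            u = (load[n["name"]] + t) / n["capacity"]   # projected utilization
            # simple linear power model: idle + (peak - idle) * utilization
            return n["idle_w"] + (n["peak_w"] - n["idle_w"]) * u
        best = min((n for n in nodes if load[n["name"]] + t <= n["capacity"]),
                   key=cost)
        placement[best["name"]].append(t)
        load[best["name"]] += t
    return placement

nodes = [{"name": "n1", "capacity": 100, "idle_w": 70, "peak_w": 200},
         {"name": "n2", "capacity": 100, "idle_w": 50, "peak_w": 180}]
placement = assign_tasks([30, 20, 50, 10], nodes)
print(placement)
```

A real scheduler would add the carbon-emission metric the paper proposes as a second term in the cost function; the sketch shows only the power-model half.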
Procedia PDF Downloads 449
7112 Health of Riveted Joints with Active and Passive Structural Health Monitoring Techniques
Authors: Javad Yarmahmoudi, Alireza Mirzaee
Abstract:
Many active and passive structural health monitoring (SHM) techniques have been developed for the detection of defects in plates. Generally, riveted joints hold the plates together, and their failure may cause accidents. In this study, well-known active and passive methods were modified for the evaluation of the health of the riveted joints between plates. The active method generated Lamb waves and monitored their propagation by using lead zirconate titanate (PZT) disks; the signal was analyzed using the wavelet transform. The passive method used Fiber Bragg Grating (FBG) sensors and evaluated the spectral characteristics of the signals by using the Fast Fourier Transform (FFT). The results indicated that the existing methods designed for the evaluation of the health of individual plates may be used for the inspection of riveted joints with software modifications.
Keywords: structural health monitoring, SHM, active SHM, passive SHM, fiber Bragg grating sensor, lead zirconate titanate, PZT
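The passive FFT step described above can be illustrated with a minimal spectral analysis of a simulated sensor signal. The 50 Hz mode, sampling rate, and noise level are assumptions for the sketch, not the authors' measurement data.

```python
# Hedged sketch (assumed signal): extract the dominant spectral component
# of a strain-like sensor signal with an FFT, as in the passive FBG-based
# approach. A shift or spread of this peak could flag joint degradation.
import numpy as np

rng = np.random.default_rng(0)
fs = 1000.0                            # sampling rate, Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)
# healthy joint: a single 50 Hz mode plus measurement noise
signal = np.sin(2 * np.pi * 50 * t) + 0.1 * rng.standard_normal(t.size)

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, 1 / fs)
dominant = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
print(f"dominant frequency: {dominant:.1f} Hz")
```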
Procedia PDF Downloads 3277111 Heritage Buildings an Inspiration for Energy Conservation under Solar Control – a Case Study of Hadoti Region of India.
Authors: Abhinav Chaturvedi, Joohi Chaturvedi, Renu Chaturvedi
Abstract:
With rapid urbanization and population growth, more buildings are required to meet the increasing demand for shelter. 80% of the world's population lives in developing countries, but adequate energy is supplied to only 30% of it. In India the situation is more difficult still: the majority of Indian villages are deprived of energy, and one-third of Indian households have no energy supply, so there is a big gap between energy demand and supply. Moreover, India produces around 65% of its energy from non-renewable sources, imports 25% in the form of oil and gas, and generates only 10% from other sources such as solar and wind power. Modern structures are big energy consumers: about 40% of total energy goes into providing comfort conditions to users in the form of heating and cooling, 5% into building construction, 20% into transportation, 20% into industrial processes, and 10% into other processes. If we minimize the heating, cooling, and lighting loads of buildings, we can conserve a huge amount of energy for the future. Historically, buildings had no artificial systems for cooling or heating. These buildings, especially in the Hadoti Region with its semi-arid climatic conditions, employed solar passive design techniques, which is the reason for the comfort inside them. So if we use appropriate elements of these heritage structures in present-day building design, we can find solutions to the energy crisis. The present paper describes various solar passive design techniques used in the past that could be used today to reduce energy consumption.
Keywords: energy conservation, Hadoti region, solar passive design techniques, semi-arid climatic condition
Procedia PDF Downloads 475
7110 Techniques of Construction Management in Civil Engineering
Authors: Mamoon M. Atout
Abstract:
The Middle East Gulf region has witnessed rapid growth and development in many areas over the last two decades. The development of the real-estate sector, the construction industry, and infrastructure projects constitutes a major share of the development that has contributed to the advancement of the Gulf countries. Construction industry projects have been planned and managed by experts who came from all over the world with different types of experience in construction management. Some of these projects were completed on time, while many were not, due to many accumulating factors. These accumulated factors are considered the principal reason for the problems experienced at the project construction stage, which reflect negatively on project success. Specific causes of delay have been identified by construction managers so that unexpected delays can be avoided through proper analysis and consideration of implications such as risk assessment and analysis of potential problems, ensuring that projects are delivered on time. Construction management implications were adopted and considered by project managers who have experience and knowledge in applying the techniques of engineering construction management. The aim of this research is to determine the benefits of the implications of construction management for the construction team, and the level of consideration given to these techniques and processes during the project development and construction phases to avoid delays. It also aims to determine the factors that contribute to project completion delays when project managers are not well committed to their roles and responsibilities. The results of the analysis will determine the applications required by the project team to avoid the causes of delays and help them deliver projects on time, e.g.
verifying tender documents and quantities, and preparing the construction method of the project.
Keywords: construction management, control process, cost control, planning and scheduling
Procedia PDF Downloads 247
7109 The Geometry of Natural Formation: An Application of Geometrical Analysis for the Complex Natural Order of the Pomegranate
Authors: Anahita Aris
Abstract:
Geometry always plays a key role in natural structures, which can be a source of inspiration for architects and urban designers when creating spaces. By understanding the formative principles in nature, a variety of options can be provided that lead to freedom of formation. The main purpose of this paper is to analyze the geometrical order found in the pomegranate to find formative principles explaining its complex structure. The point is how the spherical arils of the pomegranate press together inside the fruit and fill the space as they expand during growth, creating a self-organized system in which each aril is unique in size, topology, and shape. The main challenge of this paper is using advanced architectural modeling techniques to discover these principles.
Keywords: advanced modeling techniques, architectural modeling, computational design, the geometry of natural formation, geometrical analysis, the natural order of pomegranate, voronoi diagrams
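The keywords point to Voronoi diagrams as one such modeling technique: seed points partition space into cells, one per seed, much as aril centers partition the fruit's interior. The random seed points below are synthetic, not measured aril positions.

```python
# Hedged sketch: a 3D Voronoi partition of random seed points, the kind
# of geometric construction suggested by the keywords for modeling how
# pomegranate arils fill the interior space. Seeds are synthetic.
import numpy as np
from scipy.spatial import Voronoi

rng = np.random.default_rng(42)
seeds = rng.random((20, 3))       # 20 random "aril centers" in a unit cube
vor = Voronoi(seeds)

# each seed owns exactly one Voronoi region; region shapes and sizes all
# differ, mirroring the observation that every aril is unique
print("regions:", len(vor.point_region))
```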
Procedia PDF Downloads 220
7108 Algorithms Used in Spatial Data Mining GIS
Authors: Vahid Bairami Rad
Abstract:
Extracting knowledge from spatial data such as GIS data is important for reducing the data and extracting information. Therefore, the development of new techniques and tools that support humans in transforming data into useful knowledge has been the focus of the relatively new and interdisciplinary research area of 'knowledge discovery in databases'. Thus, we introduce a set of database primitives, or basic operations, for spatial data mining which are sufficient to express most of the spatial data mining algorithms from the literature. This approach has several advantages. Similar to the relational standard language SQL, the use of standard primitives will speed up the development of new data mining algorithms and will also make them more portable. We introduce a database-oriented framework for spatial data mining which is based on the concepts of neighborhood graphs and paths. A small set of basic operations on these graphs and paths is defined as database primitives for spatial data mining. Furthermore, techniques to efficiently support the database primitives in a commercial DBMS are presented.
Keywords: spatial database, knowledge discovery in databases, data mining, spatial relationship, predictive data mining
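The central primitive described above, the neighborhood graph, can be sketched as an adjacency structure built from a spatial predicate. The function name and distance predicate here are our own illustration; the paper's framework supports other neighborhood relations (topological, directional) as well.

```python
# Hedged sketch of the neighborhood-graph primitive: link spatial objects
# whose centroids lie within a distance threshold. Names are illustrative,
# not the paper's API; a real DBMS integration would push this into SQL.
from math import dist

def neighborhood_graph(objects, max_distance):
    """Adjacency list over {name: (x, y)} centroids using a
    distance-based neighbor predicate."""
    graph = {name: [] for name in objects}
    names = list(objects)
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if dist(objects[a], objects[b]) <= max_distance:
                graph[a].append(b)
                graph[b].append(a)
    return graph

cities = {"A": (0, 0), "B": (3, 4), "C": (10, 0)}
g = neighborhood_graph(cities, 6.0)
print(g)   # A and B are 5.0 apart, so they are neighbors; C is isolated
```

Paths in the framework are then walks over this graph, and mining algorithms are expressed as operations on graphs and paths.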
Procedia PDF Downloads 460
7107 Quantum Statistical Machine Learning and Quantum Time Series
Authors: Omar Alzeley, Sergey Utev
Abstract:
Minimizing a constrained multivariate function is fundamental to machine learning, and these algorithms are at the core of data mining and data visualization techniques. The decision function that maps input points to output points is based on the result of optimization, and this optimization is central to learning theory. One approach to complex systems, where the dynamics of the system are inferred from a statistical analysis of the fluctuations in time of some associated observable, is time series analysis. The purpose of this paper is a mathematical transition from the autoregressive model of classical time series to the matrix formalization of quantum theory. Firstly, we propose a quantum time series (QTS) model. Although the Hamiltonian technique has become an established tool for detecting deterministic chaos, other approaches are emerging; the quantum probabilistic technique is used to motivate the construction of our QTS model, which resembles the quantum dynamic model that has been applied to financial data. Secondly, various statistical methods, including machine learning algorithms such as the Kalman filter, are applied to estimate and analyze the unknown parameters of the model. Finally, simulation techniques such as Markov chain Monte Carlo are used to support our investigations. The proposed model has been examined using real and simulated data. We establish the relation between the quantum statistical machine and quantum time series via random matrix theory. It is interesting to note that the primary focus of the application of QTS in the field of quantum chaos was to find a model that explains chaotic behaviour; this model may reveal another insight into quantum chaos.
Keywords: machine learning, simulation techniques, quantum probability, tensor product, time series
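The Kalman filter estimation step mentioned above can be illustrated on the classical side of the transition: a scalar filter tracking the latent state of a noisy AR(1) process. The coefficients and noise variances below are synthetic choices for the sketch, not the paper's QTS parameters.

```python
# Hedged sketch (synthetic data): a scalar Kalman filter tracking the
# latent state of a noisy AR(1) process, the kind of estimation the
# study applies to its quantum time series model.
import numpy as np

rng = np.random.default_rng(0)
phi, q, r = 0.9, 0.1, 0.5           # AR coefficient, process/observation variance
n = 200
x = np.zeros(n); y = np.zeros(n)    # latent state and noisy observations
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal(0, np.sqrt(q))
    y[t] = x[t] + rng.normal(0, np.sqrt(r))

# Kalman recursion: predict from the AR dynamics, then correct with y[t]
m, p = 0.0, 1.0                     # state mean and variance
est = np.zeros(n)
for t in range(1, n):
    m_pred, p_pred = phi * m, phi**2 * p + q
    k = p_pred / (p_pred + r)       # Kalman gain
    m = m_pred + k * (y[t] - m_pred)
    p = (1 - k) * p_pred
    est[t] = m

# the filtered estimate should track the truth better than the raw data
print(np.mean((est - x)**2) < np.mean((y - x)**2))
```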
Procedia PDF Downloads 469
7106 A Comprehensive Study on Quality Assurance in Game Development
Authors: Maria Komal, Zaineb Khalil, Mehreen Sirshar
Abstract:
Due to recent technological advancements, games have become one of the most demanding applications. The gaming industry is growing rapidly, and the key to success in this industry is the development of good-quality games, which is a highly competitive issue. The ultimate goal of game developers is to provide player satisfaction by developing high-quality games. This research is a comprehensive survey of the techniques followed by game industries to ensure game quality. After analysis of various techniques, it has been found that quality simulation according to ISO standards and playtest methods are used to ensure game quality. Because game development requires a cross-disciplinary team, an increasing trend towards distributed game development has been observed. This paper evaluates the strengths and weaknesses of the current methodologies used in the game industry and draws a conclusion. We have also proposed quality parameters which can be used as a heuristic framework to identify those attributes which have high testing priorities.
Keywords: game development, computer games, video games, gaming industry, quality assurance, playability, user experience
Procedia PDF Downloads 534
7105 Deterioration Prediction of Pavement Load Bearing Capacity from FWD Data
Authors: Kotaro Sasai, Daijiro Mizutani, Kiyoyuki Kaito
Abstract:
Expressways in Japan have been built in an accelerating manner since the 1960s with the aid of rapid economic growth. About 40 percent of the length of expressways in Japan is now 30 years old or older and has become superannuated. Time-related deterioration has therefore reached a degree at which administrators, from the standpoint of operation and maintenance, are forced to take prompt, large-scale measures aimed at repairing inner damage deep in pavements. Such measures have already been implemented for bridge management in Japan and are also expected to be embodied for pavement management; thus, planning methods for these measures are increasingly demanded. Deterioration of the layers near the road surface, such as the surface course and binder course, occurs at the early stages of the whole pavement deterioration process, around 10 to 30 years after construction. These layers have been repaired primarily because inner damage usually becomes significant after outer damage, and because surveys for measuring inner damage, such as the Falling Weight Deflectometer (FWD) survey and open-cut survey, are costly and time-consuming, which has made it difficult for administrators to focus on inner damage as much as they should have. As expressways today carry serious time-related deterioration deriving from the long time span since they entered service, the idea of repairing layers deep in pavements, such as the base course and subgrade, must clearly be taken into consideration when planning large-scale maintenance. This sort of maintenance requires precisely predicting degrees of deterioration as well as grasping the present condition of pavements. Methods for predicting deterioration are either mechanical or statistical. While few mechanical models have been presented, as far as the authors know, previous studies have presented statistical methods for predicting deterioration in pavements.
One describes the deterioration process by estimating a Markov deterioration hazard model, while another illustrates it by estimating a proportional deterioration hazard model. Both studies analyze deflection data obtained from FWD surveys and present statistical methods for predicting the deterioration process of the layers near the road surface; however, the base course and subgrade layers remain unanalyzed. In this study, data collected from FWD surveys are analyzed to predict the deterioration process of the layers deep in pavements, in addition to the surface layers, by estimating a deterioration hazard model using continuous indexes. This model avoids the loss of information that occurs when rating categories are set in a Markov deterioration hazard model for evaluating degrees of deterioration in roadbeds and subgrades. By portraying continuous indexes, the model can predict the deterioration in each layer of the pavement and evaluate it quantitatively. Additionally, as the model can also depict the probability distribution of the indexes at an arbitrary point and establish a risk control level arbitrarily, it is expected that this study will provide knowledge such as life cycle cost and informative content for the decision-making process concerning where and when to perform maintenance.
Keywords: deterioration hazard model, falling weight deflectometer, inner damage, load bearing capacity, pavement
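The hazard-model idea can be illustrated in its simplest form: a constant-hazard (exponential) model fitted by maximum likelihood to observed times-to-deterioration. This is a deliberately simplified stand-in for the continuous-index hazard model estimated in the study, and the lifetimes below are hypothetical, not FWD survey data.

```python
# Hedged sketch (hypothetical data): maximum-likelihood fit of a
# constant-hazard (exponential) deterioration model. The study's actual
# model conditions the hazard rate on continuous deflection indexes.
years_to_failure = [12.0, 18.0, 9.0, 15.0, 21.0, 11.0]   # assumed section lives

n = len(years_to_failure)
lam = n / sum(years_to_failure)        # MLE of the hazard rate (events/year)
expected_life = 1.0 / lam              # mean time to deterioration
print(f"hazard rate: {lam:.4f} /year, expected life: {expected_life:.1f} years")
```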
Procedia PDF Downloads 390
7104 Self-Assembly of Monodisperse Oleic Acid-Capped Superparamagnetic Iron Oxide Nanoparticles
Authors: Huseyin Kavas
Abstract:
Oleic acid (OA) capped superparamagnetic iron oxide nanoparticles (SPIONs) were synthesized by a thermal decomposition method. The composition of the nanoparticles was confirmed by X-ray powder diffraction, and the morphology of the particles was investigated by Atomic Force Microscopy (AFM), Scanning Electron Microscopy (SEM), and Transmission Electron Microscopy (TEM). The crystallite and particle size distributions of the OA-capped SPIONs were investigated, with mean sizes of 6.99 nm and 8.9 nm, respectively. It was found that the SPIONs have superparamagnetic characteristics, with a saturation magnetization value of 64 emu/g. The thin-film form of self-assembled SPIONs was fabricated by spin coating and dip coating. SQUID-VSM magnetometry and FMR measurements were performed in order to evaluate the magnetic properties of the thin films, especially the existence of magnetic anisotropy. Thin films with magnetic anisotropy were obtained from self-assembled monolayers of SPIONs.
Keywords: magnetic materials, nanostructures, self-assembly, FMR
Procedia PDF Downloads 107
7103 Preservation of Sensitive Biological Products: An Insight into Conventional and Upcoming Drying Techniques
Authors: Jannika Dombrowski, Sabine Ambros, Ulrich Kulozik
Abstract:
Several drying techniques are used to preserve sensitive substances such as probiotic lactic acid bacteria. With the aim of better understanding the differences between these processes, this work gives new insights into the structural variations resulting from different preservation methods and their impact on product quality and storage stability. Industrially established methods (freeze drying, spray drying) were compared to the upcoming vacuum, microwave-freeze, and microwave-vacuum drying. For freeze-dried and microwave-freeze-dried samples, survival and activity remained at 100%, whereas vacuum- and microwave-vacuum-dried cultures achieved 30-40% survival. Spray drying yielded the lowest viability. The results are directly related to the temperature and oxygen content during drying. Interestingly, the most storage-stable products resulted from vacuum and microwave-vacuum drying due to denser product structures, as determined by helium pycnometry and SEM images. Further, lower water adsorption velocities were responsible for lower inactivation rates. In conclusion, the resulting product structures as well as survival rates and storage stability depend mainly on the type of water removal rather than the energy input. Microwave energy compared to conductive heating did not lead to significant differences regarding the examined factors. The correlations could be proven for the three investigated microbial strains. The presentation will be completed by an overview of the energy efficiency of the presented methods.
Keywords: drying techniques, energy efficiency, lactic acid bacteria, probiotics, survival rates, structure characterization
Procedia PDF Downloads 239
7102 Adversary Emulation: Implementation of Automated Countermeasure in CALDERA Framework
Authors: Yinan Cao, Francine Herrmann
Abstract:
Adversary emulation is a very effective, concrete way to evaluate the defense of an information system or network. It is about building an emulator which, depending on the vulnerabilities of a target system, will detect and execute a set of identified attacks. However, emulating an adversary is very costly in terms of time and resources, and verifying the information on each technique and building up the countermeasures in the middle of the test also has to be accomplished manually. In this article, a synthesis of previous MITRE research on the creation of the ATT&CK matrix serves as the knowledge base of known techniques, and CALDERA, a well-designed adversary emulation software based on the ATT&CK matrix, is used as our platform. Inspired and guided by the previous study, a plugin for CALDERA called Tinker is implemented, which aims to help the tester obtain more information, as well as the mitigation for each technique used in the previous operation. Furthermore, optional countermeasures for some techniques are implemented and preset in Tinker in order to facilitate and speed up the process of improving the defense of the tested system.
Keywords: automation, adversary emulation, CALDERA, countermeasures, MITRE ATT&CK
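The mitigation-lookup idea behind such a plugin can be sketched as a mapping from ATT&CK technique IDs to suggested countermeasures. This is not Tinker's actual implementation: the two entries and the countermeasure wording below are illustrative, and a real plugin would load the full matrix from MITRE's published ATT&CK data rather than hard-code it.

```python
# Hedged sketch (hypothetical data, not Tinker's code): map ATT&CK
# technique IDs observed in an operation to suggested countermeasures.
MITIGATIONS = {
    "T1110": ("Brute Force",
              ["Account lockout policies", "Multi-factor authentication"]),
    "T1059": ("Command and Scripting Interpreter",
              ["Execution prevention", "Script-block logging"]),
}

def suggest(technique_id):
    """Return (technique name, countermeasures) for a technique ID."""
    return MITIGATIONS.get(technique_id, ("unknown", []))

print(suggest("T1110"))
```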
Procedia PDF Downloads 208
7101 Extraction of Natural Colorant from the Flowers of Flame of Forest Using Ultrasound
Authors: Sunny Arora, Meghal A. Desai
Abstract:
With the impetus towards green consumerism and the implementation of sustainable techniques, the consumption of natural products and the use of environmentally friendly techniques have gained accelerated acceptance. Butein, a natural colorant, has many medicinal properties apart from its use in the dyeing industry. Extraction of butein from the flowers of the flame of the forest was carried out using an ultrasonication bath. Solid loading (2-6 g), extraction time (30-50 min), volume of solvent (30-50 mL), and type of solvent (methanol, ethanol, and water) were studied to maximize the yield of butein using the Taguchi method. The highest yield of butein, 4.67% (w/w), was obtained using 4 g of plant material, 40 min of extraction time, and 30 mL of methanol as the solvent. The present method provided a large reduction in extraction time compared to the conventional method. Hence, the outcome of the present investigation could be utilized to develop the method at a larger scale.
Keywords: butein, flowers of Flame of the Forest, Taguchi method, ultrasonic bath
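The Taguchi analysis used above can be sketched as a main-effects calculation over an orthogonal array: run a small set of level combinations, then pick, for each factor, the level with the highest mean yield. The L9 array structure is standard; the yield values below are made up for illustration, not the paper's data.

```python
# Hedged sketch (assumed yields): Taguchi main-effects analysis of an
# L9(3^3) design with three 3-level factors (e.g., solid loading, time,
# solvent volume), selecting the level of each factor that maximizes
# mean yield.
import numpy as np

# L9 orthogonal array: each row is one run, entries are factor levels 0-2
L9 = np.array([[0, 0, 0], [0, 1, 1], [0, 2, 2],
               [1, 0, 1], [1, 1, 2], [1, 2, 0],
               [2, 0, 2], [2, 1, 0], [2, 2, 1]])
yields = np.array([2.1, 3.0, 2.5, 3.8, 4.6, 3.2, 3.5, 4.1, 3.9])  # % (assumed)

best_levels = []
for f in range(L9.shape[1]):
    means = [yields[L9[:, f] == lvl].mean() for lvl in range(3)]
    best_levels.append(int(np.argmax(means)))
print("best level per factor:", best_levels)
```

The orthogonality of the array is what lets nine runs stand in for all 27 level combinations.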
Procedia PDF Downloads 475
7100 Simplified Stress Gradient Method for Stress-Intensity Factor Determination
Authors: Jeries J. Abou-Hanna
Abstract:
Several techniques exist for determining stress-intensity factors in linear elastic fracture mechanics analysis. These techniques are based on analytical, numerical, and empirical approaches that have been well documented in the literature and in engineering handbooks. However, not all techniques share the same merit. In addition to overly conservative results, numerical methods that require extensive computational effort, and those requiring copious user parameters, hinder practicing engineers from efficiently evaluating stress-intensity factors. This paper investigates the prospects of reducing the complexity and the number of required variables in determining stress-intensity factors through the utilization of the stress gradient and a weighting function. The heart of this work resides in the understanding that fracture emanating from stress concentration locations cannot be explained by a single maximum stress value, but requires the use of a critical volume in which the crack exists. In order to assess the effectiveness of this technique, this study investigated components of different notch geometries and varying levels of stress gradient. Two forms of weighting function were employed to determine stress-intensity factors, and the results were compared to exact analytical methods. The results indicated that the 'exponential' weighting function was superior to the 'absolute' weighting function: an error band of ±10% was met for cases ranging from the steep stress gradient of a sharp v-notch to the less severe stress transitions of a large circular notch. The incorporation of the proposed method has shown itself to be a worthwhile consideration.
Keywords: fracture mechanics, finite element method, stress intensity factor, stress gradient
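The flavor of the approach can be sketched as a weighted average of the stress gradient ahead of a notch: an effective stress computed with an exponential weighting function, rather than the single peak stress. The stress profile, decay rate, and characteristic length below are toy assumptions, not the paper's calibrated values.

```python
# Hedged sketch (toy numbers): an effective stress from an exponentially
# weighted average of a decaying stress gradient ahead of a notch root,
# illustrating the "exponential" weighting-function idea.
import numpy as np

x = np.linspace(0.0, 2.0, 400)         # distance from the notch root, mm
sigma = 300.0 * np.exp(-1.5 * x)       # assumed stress gradient, MPa
L = 0.5                                # assumed characteristic length, mm
w = np.exp(-x / L)                     # exponential weighting function

# weighted average over the critical region (uniform grid, discrete sum)
sigma_eff = np.sum(sigma * w) / np.sum(w)
print(f"effective stress: {sigma_eff:.1f} MPa")   # below the 300 MPa peak
```

A steeper gradient lowers the effective stress relative to the peak, which is precisely why a single maximum stress value over-predicts severity for sharp notches.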
Procedia PDF Downloads 135