Search results for: monte carlo ray tracing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 599


209 Re-Presenting the Egyptian Informal Urbanism in Films between 1994 and 2014

Authors: R. Mofeed, N. Elgendy

Abstract:

Cinema constructs mind-spaces that reflect inherent human thoughts and emotions. As a representational art, Cinema introduces comprehensive images of life phenomena in different ways. The term “represent” suggests a variety of meanings: bring into presence, replace, or typify. In that sense, Cinema may present a phenomenon through direct embodiment, introduce a substitute image that replaces the original phenomenon, or typify it by relating the produced image to a more general category through a process of abstraction. This research questions the type of images that Egyptian Cinema introduces of informal urbanism and how these images were conditioned and reshaped over the last twenty years. The informalities/slums phenomenon first appeared in Egypt, and particularly Cairo, in the early sixties; however, it was largely ignored by the state and society until the eighties, and its evident representation in Cinema only arrived by the mid-nineties. The informal city comprises illegal housing developments and is a fast-growing form of urbanization in Cairo. Yet this expanding phenomenon is still depicted as minor, exceptional and marginal through the cinematic lens. This paper aims at tracing the forms of representation of urban informalities in Egyptian Cinema between 1994 and 2014, and how they affected the popular mind and its perception of these areas. The paper runs two main lines of inquiry. The first traces the phenomenon through a chronological and geographical mapping of how informal urbanism has been portrayed in films; this analysis is based on academic research work at Cairo University in Fall 2014. The visual tracing through maps and timelines allowed a reading of the phases of ignorance, presence, typifying and repetition in the representation of this huge sector of the city across the more than 50 films investigated. The analysis clearly revealed the “portrayed image” of informality in Cinema over the examined period. The second part of the paper explores the “perceived image”. A designed questionnaire highlights the main features of that image as perceived by both inhabitants of informalities and other Cairenes, based on watching selected films. The questionnaire covers the different images of informalities proposed in Cinema, whether in a comic or a melodramatic register, and highlights the descriptive terms used, to see which of them resonate with mass perceptions and affect mental images. The two images, “portrayed” and “perceived”, are then confronted to reflect on issues of repetition, stereotyping and reality. The resulting stereotype of informal urbanism is finally outlined and justified in relation to both the production and consumption mechanisms of films and the State's official vision of informalities.

Keywords: cinema, informal urbanism, popular mind, representation

Procedia PDF Downloads 296
208 Diachronic Evolution and Multifaceted Interpretation of City-Mountain Landscape Culture: From Ritualistic Divinity to Poetic Aesthetics

Authors: Junjie Fu

Abstract:

This paper explores the cultural evolution of the "city-mountain" landscape in ancient Chinese cities, tracing its origins in the regional mountain and town division within the national system. It delves into the cultural archetype of "city-mountain" landscape divine imagery and its spatial characteristics, drawing from the spatial conception of mountain worship and divine order in the model of Kunlun and Penglai. Furthermore, it examines the shift from religious to daily life influences, leading to a poetic aesthetic turn in the "city-mountain" landscape. The paper also discusses the organizational structure of the "city-mountain" poetic landscape and its role as a space for enjoyment. By studying the cultural connotations, evolving relationships, and power mechanisms of the "city-mountain" landscape, this research provides theoretical insights for the construction and development of "city-mountain" landscapes and mountain cities.

Keywords: city-mountain landscape, cultural image, divinity, landscape image, poetry

Procedia PDF Downloads 86
207 Genealogy of a Building: Tarikhaneh

Authors: Mohadeseh Salari Sardari

Abstract:

When Muslims conquered Iran, their first concern was to show their power over others. They needed mosques for multiple purposes: prayer, tax collection, law-making, the hearing of law cases and, most important of all, as a seat of government. Sometimes they did not have time to build mosques and only began to build them after years of ruling. Many religious buildings with a pre-Islamic past survived in Iran; most of the fire temples in cities were destroyed or changed radically, but some deserted temples outside the cities survived, and based on these surviving buildings, we can trace changes in fire temples inside cities and discover how they were adapted and expanded to become mosques. In addition, there are other buildings whose dates of construction are in doubt. These buildings might be transitional buildings between two different historical eras, or older buildings with slight changes. One of these interesting buildings is Tarikhaneh, a small, simple yet striking building. By tracing Tarikhaneh's roots in buildings that existed before the Arab invasion, such as fire temples and secular buildings, the original form of Tarikhaneh can be better understood.

Keywords: iranian architecture, early mosques, fire temples, adaptation and reuse

Procedia PDF Downloads 137
206 The Impact of Artificial Intelligence on E-Learning

Authors: Sameil Hanna Samweil Botros

Abstract:

The adoption of social networking websites within higher education has garnered significant interest in recent years, with numerous researchers considering it a possible shift from the conventional classroom-based learning paradigm. Despite this growth in research and applied studies, however, the adoption of SNS-based modules has not proliferated within universities. This paper begins its contribution by analysing the various models and theories proposed in the literature and amalgamates their effective aspects for the inclusion of social technology within e-learning. A three-phased framework is further proposed, which sets out the essential considerations for the successful adoption of SNS in improving the student's learning experience. The proposal outlines the theoretical foundations to be analysed in practical implementations across university campuses worldwide.

Keywords: eLearning, institutionalization, teaching and learning, transformation, higher education, social network sites, student learning

Procedia PDF Downloads 25
205 Using Photogrammetry to Survey the Côa Valley Iron Age Rock Art Motifs: Vermelhosa Panel 3 Case Study

Authors: Natália Botica, Luís Luís, Paulo Bernardes

Abstract:

The Côa Valley, listed as World Heritage since 1998, presents more than 1300 open-air engraved rock panels. The Archaeological Park of the Côa Valley recorded the rock art motifs, testing various techniques based on direct tracing processes on the rock, using natural and artificial lighting. In this work, integrated in the "Open Access Rock Art Repository" (RARAA) project, we present the methodology adopted for the vector drawing of the rock art motifs based on orthophotos taken from the photogrammetric survey and on 3D models of the rocks. We also present the information system designed to integrate the vector drawings and the characterization data of the motifs, as well as their open access sharing, in order to promote their reuse in multiple areas. The 3D models themselves constitute a very detailed record, ensuring the digital preservation of the rock and its iconography. Thus, even if a rock or motif disappears, it can continue to be studied and even recreated.

Keywords: rock art, archaeology, iron age, 3D models

Procedia PDF Downloads 83
204 Solving the Set Covering Problem Using the Binary Cat Swarm Optimization Metaheuristic

Authors: Broderick Crawford, Ricardo Soto, Natalia Berrios, Eduardo Olguin

Abstract:

In this paper, we present a binary cat swarm optimization for solving the set covering problem. The set covering problem is a well-known NP-hard problem with many practical applications, including those involving scheduling, production planning and location problems. Binary cat swarm optimization is a recent swarm metaheuristic technique based on the behavior of cats. Domestic cats show the ability to hunt and are curious about moving objects. The cats have two modes of behavior: seeking mode and tracing mode. We illustrate this approach with 65 instances of the problem from the OR-Library. Moreover, we solve this problem with 40 new binarization techniques and select the one with the best results. Finally, we compare the results obtained in previous studies with those of the new binarization technique, that is, the roulette wheel as transfer function and V3 as discretization technique.
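For readers unfamiliar with the method, the following is a minimal sketch of a binary cat swarm optimizer for set covering under stated assumptions: the cost vector, the 0/1 coverage matrix, the population size, the mixture ratio and the sigmoid-style transfer step are all illustrative and do not reproduce the authors' exact configuration (which uses the roulette-wheel/V3 binarization mentioned above).

```python
import numpy as np

# Minimal sketch of binary cat swarm optimization for set covering.
# Assumptions (illustrative): `costs` is a float cost vector of length n,
# `cover` is a 0/1 integer matrix (rows = elements, columns = sets),
# and the instance is feasible. Parameter values are not the paper's.

rng = np.random.default_rng(0)

def repair(x, cover, costs):
    """Greedily add sets until every element is covered, then drop redundant sets."""
    covered = cover @ x > 0
    while not covered.all():
        gain = cover[~covered].sum(axis=0) / costs       # newly covered elements per unit cost
        x[np.argmax(gain)] = 1
        covered = cover @ x > 0
    for j in np.argsort(-costs):                         # remove redundant columns, costly first
        if x[j] and (cover @ x - cover[:, j] > 0).all():
            x[j] = 0
    return x

def cost(x, costs):
    return float(costs @ x)

def bcso(costs, cover, n_cats=30, mr=0.3, iters=200):
    n = len(costs)
    cats = np.array([repair(rng.integers(0, 2, n), cover, costs) for _ in range(n_cats)])
    best = min(cats, key=lambda c: cost(c, costs)).copy()
    for _ in range(iters):
        for i in range(n_cats):
            if rng.random() < mr:                        # tracing mode: move toward the best cat
                prob = 1.0 / (1.0 + np.exp(-2.0 * (best - cats[i])))   # sigmoid-style transfer (assumed)
                flip = rng.random(n) < prob
                cats[i] = np.where(flip, best, cats[i])
            else:                                        # seeking mode: local bit-flip candidate
                cand = cats[i].copy()
                cand[rng.integers(0, n, max(1, n // 20))] ^= 1
                cand = repair(cand, cover, costs)
                if cost(cand, costs) < cost(cats[i], costs):
                    cats[i] = cand
            cats[i] = repair(cats[i], cover, costs)
            if cost(cats[i], costs) < cost(best, costs):
                best = cats[i].copy()
    return best, cost(best, costs)
```

For the OR-Library instances mentioned above, `costs` and `cover` would be loaded from the instance files before calling `bcso`; the greedy repair keeps every cat feasible after each move, which is a common choice for set covering metaheuristics.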

Keywords: binary cat swarm optimization, binarization methods, metaheuristic, set covering problem

Procedia PDF Downloads 396
203 Measurements of Flow Mixing Behaviors Using a Wire-Mesh Sensor in a Wire-Wrapped 37-Pin Rod Assembly

Authors: Hyungmo Kim, Hwang Bae, Seok-Kyu Chang, Dong Won Lee, Yung Joo Ko, Sun Rock Choi, Hae Seob Choi, Hyeon Seok Woo, Dong-Jin Euh, Hyeong-Yeon Lee

Abstract:

Flow mixing characteristics in a wire-wrapped 37-pin rod bundle were measured using a wire-mesh sensing system for a sodium-cooled fast reactor (SFR). Subchannel flow mixing in SFR core subchannels is an essential characteristic for the verification of the core thermal design and safety analysis. A dedicated test facility, including the wire-mesh sensor system and a tracer liquid injection system, was developed, and the conductivity fields at the end of the 37-pin rod bundle were visualized under several different flow conditions. The experimental results show reasonable agreement with CFD results, and an uncertainty analysis of the mixing experiments was conducted to evaluate the experimental results.

Keywords: core thermal design, flow mixing, wire-mesh sensor, wire-wrap effect

Procedia PDF Downloads 629
202 Urban Stratification as a Basis for Analyzing Political Instability: Evidence from Syrian Cities

Authors: Munqeth Othman Agha

Abstract:

The historical formation of urban centres in the eastern Arab world was shaped by rapid urbanization and a sudden transformation from a pre-industrial to a post-industrial economy, coupled with uneven development, informal urban expansion, and constant surges in unemployment and poverty rates. The city was accordingly stratified into overlapping layers of division and inequality built on top of each other, creating complex horizontal and vertical divisions on economic, social, political, and ethno-sectarian bases. This was further exacerbated during the neoliberal era, which turned the city into a sort of dual city inhabited by heterogeneous and often antagonistic social groups. Economic deprivation combined with a growing sense of marginalization and inequality across the city planted the seeds of the political instability that broke out in 2011. Unlike other popular uprisings that occupied central squares, as in Egypt and Tunisia, the Syrian uprising of 2011 took place mainly within inner streets and neighborhood squares, mobilizing more or less along the lines of stratification. This has emphasized the role of micro-urban and social settings in shaping mobilization and resistance tactics, which requires us to understand the way the city was stratified and to place it at the center of the city-conflict nexus analysis. This research aims to understand to what extent pre-conflict urban stratification lines played a role in determining the different trajectories of neighborhoods in three cities (Homs, Dara'a and Deir-ez-Zor). The main argument of the paper is that the way the Syrian city has been stratified created various social groups within the city who have enjoyed different levels of access to life chances, material resources and social status. This determines their relationship with other social groups in the city and, more importantly, their relationship with the state. The advent of a political opportunity is perceived differently across the city's different social groups according to their perceived interests and threats, which consequently leads to either political mobilization or demobilization. Several factors, including the type of social structures, the built environment, and the state response, determine the ability of social actors to turn the repertoire of contention into collective action, or to turn from social actors into political actors. The research uses urban stratification lines as the basis for understanding the different patterns of political upheaval in urban areas, while explaining why neighborhoods with different social and urban settings had different abilities and capacities to mobilize, resist state repression and then descend into military conflict. It particularly traces the transformation from social groups to social actors and political actors by applying the explaining-outcome process-tracing method to depict the causal mechanisms that led to the inclusion or exclusion of different neighborhoods in each stage of the uprising, namely mobilization (M1), response (M2), and control (M3).

Keywords: urban stratification, syrian conflict, social movement, process tracing, divided city

Procedia PDF Downloads 73
201 Analytical and Numerical Modeling of Strongly Rotating Rarefied Gas Flows

Authors: S. Pradhan, V. Kumaran

Abstract:

Centrifugal gas separation processes effect separation by utilizing the difference in mole fraction in a high-speed rotating cylinder caused by the difference in molecular mass, and consequently in the centrifugal force density. These have been widely used in isotope separation because chemical separation methods cannot be used to separate isotopes of the same chemical species. More recently, centrifugal separation has also been explored for the separation of gases such as carbon dioxide and methane. The efficiency of separation is critically dependent on the secondary flow generated due to temperature gradients at the cylinder wall or due to inserts, and it is important to formulate accurate models for this secondary flow. The widely used Onsager model for secondary flow is restricted to very long cylinders where the length is large compared to the diameter, to the limit of high stratification parameter, where the gas is restricted to a thin layer near the wall of the cylinder, and it assumes that there is no mass difference between the two species while calculating the secondary flow. There are two objectives of the present analysis of the rarefied gas flow in a rotating cylinder. The first is to remove the restriction of high stratification parameter, to generalize the solutions to low rotation speeds where the stratification parameter may be O(1), and to apply them to dissimilar gases by considering the difference in molecular mass of the two species. The second is to compare the predictions with molecular simulations based on the direct simulation Monte Carlo (DSMC) method for rarefied gas flows, in order to quantify the errors resulting from the approximations at different aspect ratios, Reynolds numbers and stratification parameters. In this study, we have obtained analytical and numerical solutions for the secondary flows generated at the cylinder curved surface and at the end-caps due to a linear wall temperature gradient and external gas inflow/outflow at the axis of the cylinder. The effect of sources of mass, momentum and energy within the flow domain is also analyzed. The results of the analytical solutions are compared with the results of DSMC simulations for three types of forcing: a wall temperature gradient, inflow/outflow of gas along the axis, and mass/momentum input due to inserts within the flow. The comparison reveals that the boundary conditions in the simulations and the analysis have to be matched with care. The commonly used diffuse reflection boundary conditions at solid walls in DSMC simulations result in a non-zero slip velocity as well as a temperature slip (the gas temperature at the wall is different from the wall temperature). These have to be incorporated in the analysis in order to make quantitative predictions. In the case of mass/momentum/energy sources within the flow, it is necessary to ensure that the homogeneous boundary conditions are accurately satisfied in the simulations. When these precautions are taken, there is excellent agreement between analysis and simulations, to within 10%, even when the stratification parameter is as low as 0.707, the Reynolds number is as low as 100, the aspect ratio (length/diameter) of the cylinder is as low as 2, and the secondary flow velocity is as high as 0.2 times the maximum base flow velocity.

Keywords: rotating flows, generalized Onsager and Carrier-Maslen model, DSMC simulations, rarefied gas flow

Procedia PDF Downloads 398
200 Adapting an Accurate Reverse-time Migration Method to USCT Imaging

Authors: Brayden Mi

Abstract:

Reverse time migration has been widely used in the petroleum exploration industry to reveal subsurface images and to detect rock and fluid properties since the early 1980s. The seismic technology involves the construction of a velocity model through interpretive model building, seismic tomography, or full waveform inversion, and the reverse-time propagation of the acquired seismic data and of the original wavelet used in the acquisition. The methodology has matured from 2D, simple media to present-day full 3D imaging in extremely complex geological conditions. Conventional ultrasound computed tomography (USCT) utilizes travel-time inversion to reconstruct the velocity structure of an organ. With this velocity structure, USCT data can be migrated with the "bent-ray" method. Its seismic counterpart is called Kirchhoff depth migration, in which the source of reflective energy is traced by ray tracing and summed to produce a subsurface image. It is well known that ray-tracing-based migration has severe limitations in strongly heterogeneous media and with irregular acquisition geometries. Reverse time migration (RTM), on the other hand, fully accounts for the wave phenomena, including multiple arrivals and turning rays due to complex velocity structure, and has the capability to fully reconstruct the image detectable within its acquisition aperture. RTM algorithms typically require a rather accurate velocity model and demand high computing power, and may not be applicable to real-time imaging as normally required in day-to-day medical operations. However, with the improvement of computing technology, this computational bottleneck may not present a challenge in the near future. Present-day RTM algorithms in the seismic industry are typically implemented from a flat datum, but they can be modified to accommodate any acquisition geometry and aperture, as long as sufficient illumination is provided. Such flexibility of RTM can be conveniently exploited for USCT imaging if the spatial coordinates of the transmitters and receivers are known and enough data is collected to provide full illumination. This paper proposes an implementation of a full 3D RTM algorithm for USCT imaging to produce an accurate 3D acoustic image, based on the phase-shift-plus-interpolation (PSPI) method for wavefield extrapolation. In this method, each acquired data set (shot) is propagated back in time, and a known ultrasound wavelet is propagated forward in time, with PSPI wavefield extrapolation and a piece-wise constant velocity model of the organ (breast). The imaging condition is then applied to produce a partial image. Although each image is subject to the limitation of its own illumination aperture, the stack of multiple partial images produces a full image of the organ, with a much-reduced noise level compared with the individual partial images.
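As a rough illustration of the extrapolation-plus-imaging-condition step described above, the sketch below uses single-reference-velocity phase-shift extrapolation in depth (a simplification of PSPI, which interpolates between wavefields extrapolated with several reference velocities per step) and a zero-lag cross-correlation imaging condition; the grid, sampling, velocity model and wavefields are illustrative placeholders, not the paper's configuration.

```python
import numpy as np

# Minimal sketch of the imaging step: frequency-wavenumber phase-shift
# extrapolation with a single reference velocity per depth step (a
# simplification of PSPI) and a zero-lag cross-correlation imaging condition.
# Grid sizes, velocities and wavefields are illustrative assumptions.

nx, nz, nt = 128, 128, 512
dx, dz, dt = 1e-3, 1e-3, 2e-7           # 1 mm grid, 0.2 us sampling (assumed)
v = np.full(nz, 1500.0)                  # piece-wise constant velocity per depth (m/s)

def extrapolate(surface_gather, v, sign):
    """Phase-shift extrapolate a surface wavefield (nt x nx) through all depth steps.
    sign=+1 continues the source wavefield, sign=-1 the recorded (receiver) wavefield."""
    W = np.fft.fft2(surface_gather)                       # -> (omega, kx)
    omega = 2 * np.pi * np.fft.fftfreq(nt, dt)[:, None]
    kx = 2 * np.pi * np.fft.fftfreq(nx, dx)[None, :]
    field = np.empty((nz, nt, nx))
    for iz in range(nz):
        kz2 = (omega / v[iz]) ** 2 - kx ** 2
        kz = np.sqrt(np.maximum(kz2, 0.0))                # evanescent part suppressed
        W = W * np.exp(sign * 1j * kz * dz)
        field[iz] = np.fft.ifft2(W).real
    return field

# source wavefield: an impulsive source at one transmitter position
# (a real ultrasound wavelet would be band-limited)
src = np.zeros((nt, nx)); src[10, nx // 2] = 1.0
# receiver wavefield: the recorded USCT shot gather (random placeholder here)
rec = np.random.default_rng(0).normal(size=(nt, nx))

S = extrapolate(src, v, +1)          # forward-continued source wavefield
R = extrapolate(rec, v, -1)          # reverse-continued receiver wavefield
image = (S * R).sum(axis=1)          # zero-lag cross-correlation over time -> partial image (nz x nx)
```

Stacking such partial images over all transmitters would give the full-organ image described above; the full method also requires the PSPI interpolation step and true reverse-time (two-way) propagation rather than this one-way simplification.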

Keywords: illumination, reverse time migration (RTM), ultrasound computed tomography (USCT), wavefield extrapolation

Procedia PDF Downloads 74
199 New Methodology for Monitoring Alcoholic Fermentation Processes Using Refractometry

Authors: Boukhiar Aissa, Iguergaziz Nadia, Halladj Fatima, Lamrani Yasmina, Benamara Salem

Abstract:

Determining the alcohol content of an alcoholic fermentation bioprocess is of great importance; in fact, it is a key indicator for monitoring this fermentation bioprocess. Several methodologies (chemical, spectrophotometric, chromatographic, etc.) are used for the determination of this parameter. However, these techniques are time-consuming and require rigorous preparation, sometimes dangerous chemical reagents, and/or expensive equipment. In the present study, date juice is used as the substrate of alcoholic fermentation. The extracted juice undergoes an alcoholic fermentation by Saccharomyces cerevisiae. The study of the possible use of refractometry as a sole means for the in situ control of this process revealed a good correlation (R² = 0.98) between initial and final °Brix: °Brix_f = 0.377 × °Brix_i. In addition, we verified the relationship between the difference between final and initial °Brix (Δ°Brix) and the alcohol content produced (A_exp): Δ°Brix / A_exp = 1.1. This allows the construction of iso-response charts (abacuses) that permit the determination of the alcohol and residual sugar contents with a mean relative error (MRE) of 5.35%.
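As a simple illustration of how these two empirical relations can be used together, the sketch below estimates the alcohol produced from a pair of refractometer readings; the example readings are illustrative assumptions, and the "alcohol rate" is expressed in the same units as the paper's A_exp.

```python
# Minimal sketch using the two empirical relations reported above:
#   final Brix            ~= 0.377 * initial Brix
#   delta Brix / alcohol  ~= 1.1
# The example readings below are illustrative assumptions.

BRIX_RATIO = 0.377
DELTA_BRIX_PER_ALCOHOL = 1.1

def expected_final_brix(brix_initial: float) -> float:
    """Expected degrees Brix once fermentation is complete."""
    return BRIX_RATIO * brix_initial

def alcohol_from_brix(brix_initial: float, brix_current: float) -> float:
    """Alcohol produced so far, estimated from the drop in degrees Brix."""
    return (brix_initial - brix_current) / DELTA_BRIX_PER_ALCOHOL

brix_i = 24.0        # initial date-juice reading (assumed)
brix_now = 12.0      # current reading during fermentation (assumed)
print(f"alcohol produced so far: {alcohol_from_brix(brix_i, brix_now):.1f}")
print(f"expected final Brix:     {expected_final_brix(brix_i):.1f}")
```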

Keywords: refractometry, alcohol, residual sugar, fermentation, brix, date, juice

Procedia PDF Downloads 478
198 Towards the Design of Gripper Independent of Substrate Surface Structures

Authors: Annika Schmidt, Ausama Hadi Ahmed, Carlo Menon

Abstract:

End effectors for robotic systems are becoming more and more advanced, resulting in a growing variety of gripping tasks. However, most grippers are application specific. This paper presents a gripper that interacts with an object’s surface rather than being dependent on a defined shape or size. For this purpose, ingressive and astrictive features are combined to achieve the desired gripping capabilities. The developed prototype is tested on a variety of surfaces with different hardness and roughness properties. The results show that the gripping mechanism works on all of the tested surfaces. The influence of the material properties on the amount of the supported load is also studied and the efficiency is discussed.

Keywords: claw, dry adhesion, insects, material properties

Procedia PDF Downloads 359
197 Hospitality Genealogy: Tracing the Ethics and Ontologies of Hospitality-Making on the Silk-Routes

Authors: Neil Michael Walsh, Angelique Lombarts

Abstract:

The authors propose that hospitality is 'made' (constituted and performed) in the encounters on the Silk Routes. Inspired by an initial Derridean perspective on hospitality (the conditional/unconditional) and methodologically underpinned by a Deleuzian relational-rhizomatic approach, the authors contend that hospitality is (re)produced in the encounters of self/other and east/west (among others). Thus, in the spirit of performativity, and using the temporal-spatial conduit of the Silk Routes (the sites of ethical, cultural, economic, and material interaction of such exchange), the authors conclude that hospitality is produced at the moment in which it is performed. Key themes engaged as units of analysis are welcome, reception, hostility, and so on, which the authors examine, as they unfold, in the narratives, accounts and material legacies of those who travelled the Silk Routes between the 2nd and 18th centuries. The preliminary results suggest that these earlier performative moments of hospitality-making on the Silk Routes continue to resonate and 'form' the hospitalities of today. Indeed, these acts of hospitality continue to be reconstituted and are never a final state of affairs.

Keywords: hospitality-genealogy, interactions, hospitality-making, Silk-Routes, rhizome, relationality

Procedia PDF Downloads 134
196 The Journey of a Malicious HTTP Request

Authors: M. Mansouri, P. Jaklitsch, E. Teiniker

Abstract:

SQL injection on web applications is a very popular kind of attack, and mechanisms such as intrusion detection systems exist to detect it. These strategies often rely on techniques implemented at the higher layers of the application but do not consider the low level of system calls. The problem with only considering the high-level perspective is that an attacker can circumvent the detection tools using certain techniques, such as URL encoding. One technique currently used for detecting low-level attacks on privileged processes is the tracing of system calls. System calls act as a single gate to the Operating System (OS) kernel; they allow catching the critical data at an appropriate level of detail. Our basic assumption is that any type of application, be it a system service, utility program or web application, "speaks" the language of system calls when having a conversation with the OS kernel. At this level we can see the actual attack while it is happening. We conduct an experiment in order to demonstrate the suitability of system call analysis for detecting SQL injection, and we are able to detect the attack. Therefore we conclude that system calls are not only powerful in detecting low-level attacks but that they also enable us to detect high-level attacks such as SQL injection.
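To make the system-call-level idea concrete, here is a minimal monitoring sketch under stated assumptions: it attaches strace to a target Linux process and flags read/write/recv/send payloads containing typical injection fragments. The traced PID, the pattern list and the capture length are illustrative, and this is not the authors' detection pipeline.

```python
import re
import subprocess
import sys

# Minimal sketch of system-call-level monitoring: attach strace to the target
# process, watch the payloads of read/write/recvfrom/sendto calls, and flag
# buffers containing typical SQL-injection fragments. The PID, pattern list
# and capture length are illustrative assumptions.

SUSPICIOUS = re.compile(
    r"(UNION\s+SELECT|OR\s+1\s*=\s*1|;\s*DROP\s+TABLE|SLEEP\s*\()", re.IGNORECASE
)

def watch(pid: int) -> None:
    # -f: follow forks; -s 512: capture up to 512 bytes of each buffer;
    # -e trace=...: restrict to the calls that carry request/query payloads
    cmd = ["strace", "-f", "-s", "512",
           "-e", "trace=read,write,recvfrom,sendto", "-p", str(pid)]
    proc = subprocess.Popen(cmd, stderr=subprocess.PIPE, text=True)
    for line in proc.stderr:                  # strace writes its trace to stderr
        if SUSPICIOUS.search(line):
            print(f"[ALERT] possible SQL injection in syscall payload: {line.strip()}")

if __name__ == "__main__":
    watch(int(sys.argv[1]))                   # usage: python watch_syscalls.py <pid>
```

Because the query finally sent to the database appears decoded at this level, URL-encoded payloads that slip past higher-layer filters can still be matched here, which is the point the abstract makes.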

Keywords: Linux system calls, web attack detection, interception, SQL

Procedia PDF Downloads 359
195 Comparative Study of Outcome of Patients with Wilms Tumor Treated with Upfront Chemotherapy and Upfront Surgery in Alexandria University Hospitals

Authors: Golson Mohamed, Yasmine Gamasy, Khaled EL-Khatib, Anas Al-Natour, Shady Fadel, Haytham Rashwan, Haytham Badawy, Nadia Farghaly

Abstract:

Introduction: Wilms tumor is the most common malignant renal tumor in children, and much progress has been made in the management of patients with this malignancy over the last three decades. Today, treatments are based on several trials and studies conducted by the International Society of Paediatric Oncology (SIOP) in Europe and the National Wilms Tumor Study Group (NWTS) in the USA. It is necessary to understand why we follow either of the protocols: NWTS, which follows the upfront surgery principle, or SIOP, which follows the upfront chemotherapy principle, in all stages of the disease. Objective: The aim is to assess the outcome in patients treated with preoperative chemotherapy and in patients treated with upfront surgery, and to compare their effect on overall survival. Study design: To decide which protocol to follow, a retrospective study was carried out on the records of patients aged 1 day to 18 years suffering from Wilms tumor who were admitted to the Alexandria University Hospital pediatric oncology, pediatric urology and pediatric surgery departments from 2010 to 2015, with the data transfer sheet designed and edited according to a PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) flow. Data were fed to the computer and analyzed using the IBM SPSS software package version 20.0. Qualitative data were described using number and percent. Quantitative data were described using range (minimum and maximum), mean, standard deviation and median. Comparisons between groups regarding categorical variables were tested using the chi-square test. When more than 20% of the cells had an expected count of less than 5, correction for chi-square was conducted using Fisher's exact test or Monte Carlo correction. The distributions of quantitative variables were tested for normality using the Kolmogorov-Smirnov, Shapiro-Wilk and D'Agostino tests; if these revealed a normal distribution, parametric tests were applied, and if the data were abnormally distributed, non-parametric tests were used. For normally distributed data, comparison between two independent populations was done using the independent t-test; for abnormally distributed data, the Mann-Whitney test was used. The significance of the obtained results was judged at the 5% level. Results: A statistically significant difference in survival was observed between the two studied groups, favoring upfront chemotherapy (86.4%) over upfront surgery (59.3%), with P = 0.009. As regards complications, 20 cases (74.1%) out of 27 were complicated in the group treated with upfront surgery, while 30 cases (68.2%) out of 44 had complications in the group treated with upfront chemotherapy. Also, the incidence of intraoperative complications (rupture) was lower in the upfront chemotherapy group than in the upfront surgery group. Conclusion: Upfront chemotherapy shows superiority over upfront surgery: patients who started with upfront chemotherapy showed a higher survival rate, fewer complications, less need for radiotherapy, and a lower recurrence rate.
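For illustration of the categorical comparison reported above, the sketch below reconstructs the 2x2 survival table from the stated group sizes and percentages (44 upfront-chemotherapy patients at 86.4% survival, 27 upfront-surgery patients at 59.3%) and applies the chi-square and Fisher's exact tests; the counts are approximate reconstructions, not the study dataset.

```python
from scipy import stats

# 2x2 table reconstructed from the reported group sizes and survival
# percentages; the counts are therefore approximate, not the study data.
table = [[38, 6],    # upfront chemotherapy: survived, died
         [16, 11]]   # upfront surgery:      survived, died

chi2, p, dof, expected = stats.chi2_contingency(table, correction=False)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")      # p ~ 0.009, as reported

# When >20% of expected cell counts fall below 5, Fisher's exact test
# (or a Monte Carlo correction) is used instead, as in the paper's methods.
odds_ratio, p_exact = stats.fisher_exact(table)
print(f"Fisher's exact p = {p_exact:.3f}")
```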

Keywords: Wilms tumor, renal tumor, chemotherapy, surgery

Procedia PDF Downloads 317
194 Tracing Economic Policies to Ancient Indian Economic Thought

Authors: Satish Y. Deodhar

Abstract:

Science without history is like a man without memory. The colossal history of India stores many ideas on economic ethics and public policy, which have been forgotten in the course of time. This paper is an attempt to bring to the fore contributions from ancient Indian treatises. In this context, the paper briefly summarizes alternative economic ideas such as communism, capitalism, and the holistic approach of ancient Indian writings. Thereafter, the idea of a welfare brick for an individual consisting of three dimensions -Purusharthas, Ashramas, and Varnas is discussed. Given the contours of the welfare brick, the concept of the state, its economic policies, markets, prices, interest rates, and credit are covered next. This is followed by delving into the treatment of land, property rights, guilds, and labour relations. The penultimate section summarises the economic advice offered to the head of a household in the treatise Shukranitisara. Finally, in concluding comments, the relevance of ancient Indian writings for modern times is discussed -both for pedagogy and economic policies.

Keywords: ancient Indian treatises, history of economic thought, science of political economy, Sanskrit

Procedia PDF Downloads 97
193 Unconscious Bias in Judicial Decisions: Legal Genealogy and Disgust in Cases of Private, Adult, Consensual Sexual Acts Leading to Injury

Authors: Susanna Menis

Abstract:

‘Unconscious’ bias is widespread, affecting society at all levels of decision-making and beyond. Placed in the context of the law, this study explores the direct effect of the psycho-social and cultural evolution of unconscious bias on how a judicial decision was made. The aim of this study is to contribute to socio-legal scholarship by examining the formation of unconscious bias and its influence on the creation of legal rules that judges believe reflect social solidarity and protect against violence. The study seeks to understand how concepts like criminalization and unlawfulness are constructed by the common law. The study methodology follows two theoretical approaches: historical genealogy and emotions as sociocultural phenomena. Both involve the ‘tracing back’ of the original formation of a social way of seeing and doing things in common. The significance of this study lies in the importance of reflecting on the ways unconscious bias may be formed; placing judges’ decisions under this spotlight forces us to challenge the status quo, interrogate justice, and seek refinement of the law.

Keywords: legal genealogy, emotions, disgust, criminal law

Procedia PDF Downloads 61
192 Tracing Back the Bot Master

Authors: Sneha Leslie

Abstract:

The current situation in the cyber world is that crimes performed by botnets are increasing and the masterminds (botmasters) are not easily detectable. The botmaster compromises legitimate host machines in the network and turns them into bots or zombies to initiate cyber-attacks. This paper focuses on the live detection of the botmaster in the network, using the Metasploit framework, when a distributed denial of service (DDoS) attack is performed by the botnet. The affected victim machine continuously monitors its incoming packets. Once the victim machine detects an excessive count of packets from any IP, that particular IP is noted and details of the noted systems are gathered. Using the vulnerabilities present in the zombie machines (already compromised by the botmaster), the victim machine compromises them in turn. By gaining access to the compromised systems, applications are run remotely. By analyzing the incoming packets of the zombies, the victim comes to know the address of the botmaster. This is an effective and simple system in which no specific features of the communication protocol are considered.
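The first step described above, monitoring incoming packets and flagging any source IP with an excessive count, can be sketched as below; the sniffing window, the threshold and the use of scapy are illustrative assumptions, and the subsequent exploitation and trace-back steps via Metasploit are not shown.

```python
import time
from collections import Counter

from scapy.all import IP, sniff   # requires root privileges to sniff

# Minimal sketch of the victim-side monitoring step: count incoming packets
# per source IP over a fixed window and flag sources with an excessive count.
# The threshold, window length and capture setup are illustrative assumptions.

WINDOW_S = 10          # observation window in seconds
THRESHOLD = 5000       # packets per window considered "excessive"

counts = Counter()
window_start = time.time()

def on_packet(pkt):
    global window_start
    if IP in pkt:
        counts[pkt[IP].src] += 1
    if time.time() - window_start >= WINDOW_S:
        for src, n in counts.items():
            if n >= THRESHOLD:
                print(f"[NOTE] excessive traffic from {src}: {n} packets / {WINDOW_S}s")
        counts.clear()
        window_start = time.time()

sniff(prn=on_packet, store=False)   # listen on the default interface
```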

Keywords: botnet, DDoS attack, network security, detection system, Metasploit framework

Procedia PDF Downloads 254
191 Application of Harris Hawks Optimization Metaheuristic Algorithm and Random Forest Machine Learning Method for Long-Term Production Scheduling Problem under Uncertainty in Open-Pit Mines

Authors: Kamyar Tolouei, Ehsan Moosavi

Abstract:

In open-pit mines, the long-term production scheduling optimization problem (LTPSOP) is a complicated problem that involves constraints, large datasets, and uncertainties. Uncertainty in the output is caused by several geological, economic, or technical factors. Due to its dimensions and NP-hard nature, it is usually difficult to find an ideal solution to the LTPSOP. The optimal schedule generally constrains the ore, metal, and waste tonnages, average grades, and cash flows of each period. Past decades have witnessed important developments in long-term production scheduling and optimization algorithms as researchers have become highly cognizant of the issue; even so, the LTPSOP cannot yet be considered a well-solved problem. Traditional production scheduling methods in open-pit mines apply an estimated orebody model to produce optimal schedules. The smoothing effect of some geostatistical estimation procedures causes most mine schedules and production predictions to be unrealistic and imperfect. With the expansion of simulation procedures, the risks from grade uncertainty in ore reserves can be evaluated and organized through a set of equally probable orebody realizations. In this paper, to incorporate grade uncertainty into the strategic mine schedule, a stochastic integer programming framework is presented for the LTPSOP. The objective function of the model is to maximize the net present value and minimize the risk of deviation from the production targets, considering grade uncertainty, while satisfying all technical constraints and operational requirements. Instead of applying one estimated orebody model as input to optimize the production schedule, a set of equally probable orebody realizations is applied to incorporate grade uncertainty into the strategic mine schedule and to produce a more profitable and risk-based production schedule. A mixture of metaheuristic procedures and mathematical methods paves the way to an appropriate solution. This paper introduces a hybrid model between the augmented Lagrangian relaxation (ALR) method and a metaheuristic algorithm, the Harris Hawks optimization (HHO), to solve the LTPSOP under grade uncertainty. In this study, the HHO is employed to update the Lagrange coefficients. In addition, a machine learning method called Random Forest is applied to estimate the gold grade in a mineral deposit. The Monte Carlo method is used as the simulation method, with 20 realizations. The results show that the proposed versions are considerably improved in comparison with the traditional methods. The outcomes were also compared with the ALR-genetic algorithm and ALR-subgradient approaches. To indicate the applicability of the model, a case study on an open-pit gold mining operation is presented. The framework displays the capability to minimize risk and to improve the expected net present value and financial profitability of the LTPSOP, and it controls geological risk more effectively than the traditional procedure by considering grade uncertainty in the hybrid model framework.
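The grade-estimation step mentioned above can be sketched with a Random Forest regressor fitted on drillhole samples; the file name, column names and hyperparameters below are illustrative assumptions, not the configuration used in the paper.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Minimal sketch of Random Forest gold-grade estimation from drillhole samples.
# The file name, column names (x, y, z, au_gpt) and hyperparameters are
# illustrative assumptions.

samples = pd.read_csv("drillhole_samples.csv")          # columns: x, y, z, au_gpt (assumed)
X = samples[["x", "y", "z"]].to_numpy()
y = samples["au_gpt"].to_numpy()

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

rf = RandomForestRegressor(n_estimators=500, min_samples_leaf=5, random_state=42)
rf.fit(X_train, y_train)
print(f"R^2 on held-out samples: {rf.score(X_test, y_test):.3f}")

# Estimate grades at block-model centroids (a regular grid here); these block
# grades then feed the stochastic scheduling model alongside the simulated realizations.
xs, ys, zs = np.meshgrid(np.arange(0, 500, 25), np.arange(0, 500, 25), np.arange(-300, 0, 15))
blocks = np.column_stack([xs.ravel(), ys.ravel(), zs.ravel()])
block_grades = rf.predict(blocks)
```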

Keywords: grade uncertainty, metaheuristic algorithms, open-pit mine, production scheduling optimization

Procedia PDF Downloads 105
190 Structuring the Role of Indonesia's Dilemma Position in ASEAN to Combat Human Trafficking

Authors: Febi Eka Putri, Prabowo Anggorono

Abstract:

Human trafficking has become a global threat, including in Indonesia, a country that has adopted democracy to uphold human rights values. Indonesia is classified as a source country for trafficking in persons, dominated by women and children trafficked for sexual exploitation and forced labor. In this regard, Indonesia has committed to combating trafficking in persons by enacting domestic law to criminalize all types of human trafficking at the domestic and international levels. These efforts cannot simply be taken at face value, however: in 2016, Indonesia was placed as a Tier 2 country because the government did not fully meet the minimum standards of the U.S. Trafficking Victims Protection Act, although it is making efforts to do so. As an ASEAN member, Indonesia has signed the ASEAN Human Rights Declaration, but when it comes to the human trafficking issue, only a few ASEAN members, such as Singapore, Cambodia, and Thailand, have ratified the ASEAN Convention on Trafficking in Persons, in Particular Women and Children. This provides the grounds for structuring Indonesia's role in combating human trafficking.

Keywords: Indonesia, Association of Southeast Asian Nations (ASEAN), human trafficking, Tier 2 country

Procedia PDF Downloads 354
189 Monocular 3D Person Tracking AIA Demographic Classification and Projective Image Processing

Authors: McClain Thiel

Abstract:

Object detection and localization have historically required two or more sensors due to the loss of information in going from 3D to 2D space; however, most surveillance systems currently in use in the real world have only one sensor per location. Generally, this consists of a single low-resolution camera positioned above the area under observation (mall, jewelry store, traffic camera). This is not sufficient for robust 3D tracking for applications such as security or, of more recent relevance, contact tracing. This paper proposes a lightweight system for 3D person tracking that requires no additional hardware, based on compressed object-detection convolutional networks, facial landmark detection, and projective geometry. The approach involves classifying the target into a demographic category, making assumptions about the relative locations of facial landmarks from the demographic information, and from there using simple projective geometry and known constants to find the target's location in 3D space. Preliminary testing, although limited, suggests reasonable success in 3D tracking under ideal conditions.
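The projective step described above can be sketched as follows, assuming a calibrated focal length in pixels and a demographic-average interpupillary distance; every constant in the sketch is an illustrative assumption rather than the paper's calibration.

```python
import numpy as np

# Minimal sketch of the projective step: from the pixel positions of two facial
# landmarks (the eyes), an assumed real-world interpupillary distance for the
# predicted demographic group, and the camera focal length in pixels, recover
# depth with the pinhole model and back-project the face centre into 3D camera
# coordinates. All constants are illustrative assumptions.

FOCAL_PX = 1200.0                            # focal length in pixels (assumed calibration)
CX, CY = 960.0, 540.0                        # principal point for a 1920x1080 sensor (assumed)
IPD_M = {"adult": 0.063, "child": 0.052}     # average interpupillary distance in metres (assumed)

def locate_3d(left_eye_px, right_eye_px, demographic="adult"):
    """Return (X, Y, Z) of the point midway between the eyes, in camera coordinates (metres)."""
    left, right = np.asarray(left_eye_px, float), np.asarray(right_eye_px, float)
    ipd_px = np.linalg.norm(right - left)            # apparent eye separation in pixels
    z = FOCAL_PX * IPD_M[demographic] / ipd_px       # pinhole model: Z = f * W / w
    u, v = (left + right) / 2.0                      # face centre in image coordinates
    x = (u - CX) * z / FOCAL_PX
    y = (v - CY) * z / FOCAL_PX
    return x, y, z

print(locate_3d((890, 500), (970, 505)))             # roughly 0.94 m in front of the camera
```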

Keywords: monocular distancing, computer vision, facial analysis, 3D localization

Procedia PDF Downloads 139
188 The Democratization of 3D Capturing: An Application Investigating Google Tango Potentials

Authors: Carlo Bianchini, Lorenzo Catena

Abstract:

The appearance of 3D scanners and, more recently, of image-based systems that generate point clouds directly from common digital images has deeply affected the survey process in terms of both capturing and 2D/3D modelling. In this context, low-cost and mobile systems are increasingly playing a key role and are actually paving the way to the democratization of what in the past was the realm of a few specialized technicians and expensive equipment. The application of Google Tango to the ancient church of Santa Maria delle Vigne in Pratica di Mare, Rome, presented in this paper is one of these examples.

Keywords: the architectural survey, augmented/mixed/virtual reality, Google Tango project, image-based 3D capturing

Procedia PDF Downloads 148
187 The Proactive Approach of Digital Forensics Methodology against Targeted Attack Malware

Authors: Mohamed Fadzlee Sulaiman, Mohd Zabri Adil Talib, Aswami Fadillah Mohd Ariffin

Abstract:

Each organization has its own mechanism to build up cyber defense capability in protecting its information infrastructure from data breaches and cyber espionage. However, we cannot deny the possibility of failing to detect and stop cyber attacks, especially those targeting credential information and intellectual property (IP). In this paper, we would like to share a modern approach to effective digital forensic methodology in order to identify the artifacts, tracing the trails of evidence while mitigating the infection on the target machine(s). This proposed approach allows the digital forensic investigation to be conducted while resuming business-critical operations after mitigating the infection, minimizing the risk that the identified attack recurs. Therefore, traditional digital forensics methodology has to be improved to be proactive, focusing not only on discovering the root cause and the threat actor but also on developing the relevant mitigation plan in order to prevent the same attack.

Keywords: digital forensic, detection, eradication, targeted attack, malware

Procedia PDF Downloads 275
186 Readout Development of a LGAD-based Hybrid Detector for Microdosimetry (HDM)

Authors: Pierobon Enrico, Missiaggia Marta, Castelluzzo Michele, Tommasino Francesco, Ricci Leonardo, Scifoni Emanuele, Vincezo Monaco, Boscardin Maurizio, La Tessa Chiara

Abstract:

Clinical outcomes collected over the past three decades have suggested that ion therapy has the potential to be a treatment modality superior to conventional radiation for several types of cancer, including recurrences, as well as for other diseases. Although the results have been encouraging, numerous treatment uncertainties remain a major obstacle to the full exploitation of particle radiotherapy. To overcome these uncertainties and optimize treatment outcome, the best possible description of radiation quality, linking the physical dose to the biological effects it induces, is of paramount importance. Microdosimetry was developed as a tool to improve the description of radiation quality. By recording the energy deposition at the micrometric scale (the typical size of a cell nucleus), this approach takes into account the non-deterministic nature of atomic and nuclear processes and creates a direct link between the dose deposited by radiation and the biological effect induced. Microdosimeters measure the spectrum of lineal energy y, defined as the energy deposition in the detector divided by the most probable track length travelled by the radiation. The latter is provided by the so-called "mean chord length" (MCL) approximation and is related to the detector geometry. To improve the characterization of the radiation field quality, we define a new quantity that replaces the MCL with the actual particle track length inside the microdosimeter. In order to measure this new quantity, we propose a two-stage detector consisting of a commercial tissue equivalent proportional counter (TEPC) and 4 layers of low gain avalanche detector (LGAD) strips. The TEPC records the energy deposition in a region equivalent to 2 µm of tissue, while the LGADs are very suitable for particle tracking because they can be thinned down to tens of micrometers and respond quickly to ionizing radiation. The concept of HDM has been investigated and validated with Monte Carlo simulations. Currently, a dedicated readout is under development. This two-stage detector requires two different systems whose complementary information is joined for each event: the energy deposition in the TEPC and the corresponding track length recorded by the LGAD tracker. This challenge is being addressed by implementing System on Chip (SoC) technology, relying on Field Programmable Gate Arrays (FPGAs) based on the Zynq architecture. The TEPC readout consists of three different signal amplification legs and is carried out with 3 ADCs mounted on an FPGA board. The LGAD strip signals are processed by dedicated chips, and the activated strips are finally stored, again relying on FPGA-based solutions. In this work, we provide a detailed description of the HDM geometry and the SoC solutions that we are implementing for the readout.
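To make the proposed change of quantity concrete, the sketch below contrasts the conventional MCL-based lineal energy for a 2 µm spherical site (mean chord length 4V/S = 2d/3 for a convex body) with the same energy deposition divided by a reconstructed track length; the event energies and track lengths are illustrative assumptions.

```python
# Minimal sketch contrasting the conventional lineal energy (energy imparted
# divided by the mean chord length of the sensitive site) with the proposed
# quantity, where the MCL is replaced by the track length reconstructed from
# the LGAD tracker. The event values below are illustrative assumptions.

SITE_DIAMETER_UM = 2.0                        # tissue-equivalent site size, as in the TEPC above
MCL_UM = 2.0 / 3.0 * SITE_DIAMETER_UM         # mean chord length of a sphere: 4V/S = 2d/3

def lineal_energy_mcl(energy_kev: float) -> float:
    """Conventional microdosimetric lineal energy y = eps / MCL (keV/um)."""
    return energy_kev / MCL_UM

def lineal_energy_tracked(energy_kev: float, track_length_um: float) -> float:
    """Track-length-based quantity: energy imparted over the reconstructed path length."""
    return energy_kev / track_length_um

eps = 5.0                                     # energy imparted in one event (keV, assumed)
for L in (0.8, 1.33, 2.0):                    # reconstructed track lengths (um, assumed)
    print(f"L = {L:.2f} um: y_MCL = {lineal_energy_mcl(eps):.2f}, "
          f"y_track = {lineal_energy_tracked(eps, L):.2f} keV/um")
```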

Keywords: particle tracking, ion therapy, low gain avalanche diode, tissue equivalent proportional counter, microdosimetry

Procedia PDF Downloads 175
185 A Comparative Study of Cognitive Factors Affecting Social Distancing among Vaccinated and Unvaccinated Filipinos

Authors: Emmanuel Carlo Belara, Albert John Dela Merced, Mark Anthony Dominguez, Diomari Erasga, Jerome Ferrer, Bernard Ombrog

Abstract:

Social distancing errors are commonly prevalent among both vaccinated and unvaccinated individuals in the Filipino community. This study aims to identify these factors and relate how they affect daily life. The observed factors include memory, attention, anxiety, decision-making, and stress. Upon applying ergonomic tools and statistical treatments such as the t-test and multiple linear regression, stress and attention turned out to have the greatest impact on social distancing errors.

Keywords: vaccinated, unvaccinated, social distancing, Filipinos

Procedia PDF Downloads 201
184 Propagation of Ultra-High Energy Cosmic Rays through Extragalactic Magnetic Fields: An Exploratory Study of the Distance Amplification from Rectilinear Propagation

Authors: Rubens P. Costa, Marcelo A. Leigui de Oliveira

Abstract:

The comprehension of the features of the energy spectra, the chemical compositions, and the origins of Ultra-High Energy Cosmic Rays (UHECRs) - mainly atomic nuclei with energies above ~1.0 EeV (exa-electron volts) - is intrinsically linked to the problem of determining the magnitude of their deflections in cosmic magnetic fields on cosmological scales. In addition, as they propagate from the source to the observer, modifications are expected in their original energy spectra, anisotropy, and chemical compositions due to interactions with low-energy photons and matter. This means that any consistent interpretation of the nature and origin of UHECRs has to include detailed knowledge of their propagation in a three-dimensional environment, taking into account the magnetic deflections and energy losses. The parameter space for the magnetic fields in the universe is very large because the field strengths, and especially their orientations, carry large uncertainties. In particular, the strength and morphology of the Extragalactic Magnetic Fields (EGMFs) remain largely unknown because of the intrinsic difficulty of observing them. Monte Carlo simulation of charged particles traveling through a simulated magnetized universe is the straightforward way to study the influence of extragalactic magnetic fields on UHECR propagation. However, this brings two major difficulties: accurate numerical modeling of charged-particle diffusion in magnetic fields, and accurate numerical modeling of the magnetized universe. Since magnetic fields do not cause energy losses, it is important to impose that the particle tracking method conserves the particle's total energy and that energy changes result only from interactions with background photons. Hence, special attention should be paid to computational effects. Additionally, because of the number of particles necessary to obtain a relevant statistical sample, the particle tracking method must be computationally efficient. In this work, we present an analysis of the propagation of ultra-high energy charged particles in the intergalactic medium. The EGMFs are considered to be coherent within cells of 1 Mpc (megaparsec) diameter, wherein they have uniform intensities of 1 nG (nanogauss). Moreover, each cell has its field orientation randomly chosen, and a border region is defined such that at distances beyond 95% of the cell radius from the cell center, smooth transitions are applied in order to avoid discontinuities. The smooth transitions are simulated by weighting the magnetic field orientation by the particle's distance to the two nearby cells. The energy losses are treated in the continuous approximation, parameterizing the mean energy loss per unit path length by the energy loss length. We show, for a particle with the typical energy of interest, the performance of the integration method in terms of the relative error of the Larmor radius (without energy losses) and the relative error of the energy. Additionally, we plot the distance amplification from rectilinear propagation as a function of the traveled distance, of the particle's magnetic rigidity (without energy losses), and of the particle's energy (with energy losses), in order to study the influence of the particle species on these calculations. The results clearly show when it is necessary to use a full three-dimensional simulation.
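A minimal sketch of the energy-conserving tracking idea described above: a Boris-type rotation for an ultra-relativistic nucleus stepping through 1 Mpc cells with randomly oriented 1 nG fields, reporting the ratio of path length to straight-line distance (the distance amplification). The charge, energy and step size are illustrative, and the smooth border transition and the energy losses are omitted here.

```python
import numpy as np

# Sketch: energy-conserving (Boris-type) tracking of an ultra-relativistic
# nucleus through 1 Mpc cells, each with a uniform 1 nG field of random
# orientation. The smooth cell-border transition and the energy losses
# described above are omitted; charge, energy and step size are assumptions.

C = 2.998e8                        # m/s
E_CHARGE = 1.602e-19               # C
MPC = 3.086e22                     # m
CELL = 1.0 * MPC
B0 = 1e-13                         # 1 nG in tesla

rng = np.random.default_rng(1)
field_cache = {}

def cell_field(pos):
    """Uniform, randomly oriented 1 nG field of the cell containing `pos`."""
    key = tuple(np.floor(pos / CELL).astype(int))
    if key not in field_cache:
        v = rng.normal(size=3)
        field_cache[key] = B0 * v / np.linalg.norm(v)
    return field_cache[key]

def propagate(energy_eV, charge_Z, n_steps=20000, step=0.01 * MPC):
    """Track one particle; return (straight-line distance, path length)."""
    p = energy_eV * E_CHARGE / C                       # ultra-relativistic: |p| ~ E/c
    pos = np.zeros(3)
    direction = np.array([1.0, 0.0, 0.0])
    for _ in range(n_steps):
        # Boris rotation: the field only rotates the direction, so |p| (and
        # hence the energy) is conserved by construction.
        t = charge_Z * E_CHARGE * step * cell_field(pos) / (2.0 * p)
        s = 2.0 * t / (1.0 + t @ t)
        d_prime = direction + np.cross(direction, t)
        direction = direction + np.cross(d_prime, s)
        direction /= np.linalg.norm(direction)         # guard against round-off drift
        pos = pos + direction * step
    return np.linalg.norm(pos), n_steps * step

d, L = propagate(energy_eV=1e18, charge_Z=1)           # a 1 EeV proton
print(f"distance amplification from rectilinear propagation: {L / d:.3f}")
```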

Keywords: cosmic rays propagation, extragalactic magnetic fields, magnetic deflections, ultra-high energy

Procedia PDF Downloads 127
183 Specific Frequency of Globular Clusters in Different Galaxy Types

Authors: Ahmed H. Abdullah, Pavel Kroupa

Abstract:

Globular clusters (GCs) are important objects for tracing the early evolution of a galaxy. We study the correlation between the cluster population and the global properties of the host galaxy. We find that the correlation between the cluster population (N_GC) and the baryonic mass (M_b) of the host galaxy is best described as N_GC ≈ 10^(-5.6038) M_b. In order to understand the origin of the U-shaped relation between the GC specific frequency (S_N) and M_b (caused by the high values of S_N for dwarf galaxies and giant ellipticals and a minimum of S_N for intermediate-mass galaxies of ≈ 10^10 M_⊙), we derive a theoretical model for the specific frequency (S_N,th). The theoretical model for S_N,th is based on the slope of the power-law embedded cluster mass function (β) and on different formation time scales (Δt) of the forming galaxy. Our results show good agreement between the observations and the model for certain values of β and Δt. The model seems able to reproduce the higher values of S_N,th for β = 1.5 at intermediate formation time scales.
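For reference, the specific frequency used above is conventionally defined from the number of clusters and the host's absolute V magnitude, S_N = N_GC · 10^(0.4(M_V + 15)); the sketch below applies this standard definition to illustrative galaxy values (not data from the paper) to show the U-shaped behaviour.

```python
# Minimal sketch of the globular cluster specific frequency, using the standard
# observational definition S_N = N_GC * 10^(0.4 * (M_V + 15)). The example
# (N_GC, M_V) pairs are illustrative assumptions chosen only to show the
# U-shaped trend discussed above, not values from the paper.

def specific_frequency(n_gc: float, m_v: float) -> float:
    """Specific frequency of a galaxy with n_gc clusters and absolute V magnitude m_v."""
    return n_gc * 10 ** (0.4 * (m_v + 15.0))

examples = [
    ("dwarf galaxy",            5, -14.0),
    ("intermediate galaxy",   200, -19.5),
    ("giant elliptical",    10000, -22.5),
]
for name, n_gc, m_v in examples:
    print(f"{name:22s} N_GC = {n_gc:6d}  S_N = {specific_frequency(n_gc, m_v):6.2f}")
```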

Keywords: galaxies: dwarf, globular cluster: specific frequency, number of globular clusters, formation time scale

Procedia PDF Downloads 326
182 An Infrared Inorganic Scintillating Detector Applied in Radiation Therapy

Authors: Sree Bash Chandra Debnath, Didier Tonneau, Carole Fauquet, Agnes Tallet, Julien Darreon

Abstract:

Purpose: Inorganic scintillating dosimetry is one of the most recent and promising techniques to solve several dosimetric issues and provide quality assurance in radiation therapy. Despite several advantages, the major issue with scintillating detectors is the Cerenkov effect, typically induced in the visible emission range. In this context, the purpose of this research work is to evaluate the performance of a novel infrared inorganic scintillator detector (IR-ISD) in radiation therapy treatment, to ensure a Cerenkov-free signal and the best match between the delivered and prescribed doses during treatment. Methods: A simple, small-scale infrared inorganic scintillating detector of 100 µm diameter with a sensitive scintillating volume of 2x10-6 mm3 was developed. A prototype of the dose verification system was introduced, based on the PTIR1470/F material (provided by Phosphor Technology®) used in the proposed novel IR-ISD. The detector was tested on an Elekta LINAC system tuned at 6 MV/15 MV and on a brachytherapy source (Ir-192) used in the patient treatment protocol. The associated dose rate was measured as a count rate (photons/s) using a highly sensitive photon counter (sensitivity ~20 ph/s). Overall measurements were performed in IBA water tank phantoms, following the international Technical Reports Series recommendations (TRS 381) for radiotherapy and the TG-43U1 recommendations for brachytherapy. The performance of the detector was tested through several dosimetric parameters, such as PDD, beam profiling, Cerenkov measurement, dose linearity, dose rate linearity, repeatability, and scintillator stability. Finally, a comparative study is also shown using a reference microdiamond dosimeter, Monte Carlo (MC) simulation, and data from recent literature. Results: This study highlights the complete removal of the Cerenkov effect, especially for small-field radiation beam characterization. The detector provides a fully linear response with dose in the 4 cGy to 800 cGy range, independently of the field size selected, from 5 x 5 cm² down to 0.5 x 0.5 cm². Excellent repeatability (0.2% variation from the average) with day-to-day reproducibility (0.3% variation) was observed. Measurements demonstrated that the ISD has superlinear behavior with dose rate (R² = 1) varying from 50 cGy/s to 1000 cGy/s. PDD profiles obtained in water present identical behavior, with a build-up maximum depth dose at 15 mm for the different small-field irradiations. Field profiles as small as 0.5 x 0.5 cm² have been characterized, and the field cross profile presents a Gaussian-like shape. The standard deviation (1σ) of the scintillating signal remains within 0.02%, with a very low convolution effect thanks to the small sensitive volume. Finally, for brachytherapy, a comparison with MC simulations shows that, considering energy dependency, measurements agree within 0.8% down to a 0.2 cm source-to-detector distance. Conclusion: The scintillating detector proposed in this study shows no Cerenkov radiation and efficient performance for several radiation therapy measurement parameters. Therefore, it is anticipated that the IR-ISD system can be promoted for validation in direct clinical investigations, such as appropriate dose verification and quality control in the treatment planning system (TPS).

Keywords: IR-Scintillating detector, dose measurement, micro-scintillators, Cerenkov effect

Procedia PDF Downloads 182
181 Measurement of Convective Heat Transfer from a Vertical Flat Plate Using Mach-Zehnder Interferometer with Wedge Fringe Setting

Authors: Divya Haridas, C. B. Sobhan

Abstract:

Laser interferometric methods have been utilized for the measurement of natural convection heat transfer from a heated vertical flat plate in the investigation presented here. The study mainly aims at comparing two different fringe orientations in the wedge fringe setting of a Mach-Zehnder interferometer (MZI) used for the measurements. The interference fringes are set in horizontal and vertical orientations with respect to the heated surface, and two different fringe analysis methods, namely the stepping method and the method proposed by Naylor and Duarte, are used to obtain the heat transfer coefficients. The experimental system is benchmarked against theoretical results, thus validating its reliability in heat transfer measurements. The interference fringe patterns are analyzed digitally using the MATLAB 7 and MOTIC Plus software packages, which ensure improved efficiency in fringe analysis, hence reducing the errors associated with conventional fringe tracing. The work also discusses the relative merits and limitations of the two methods used.

Keywords: Mach-Zehnder interferometer (MZI), natural convection, Naylor method, Vertical Flat Plate

Procedia PDF Downloads 364
180 Economic Analysis of a Carbon Abatement Technology

Authors: Hameed Rukayat Opeyemi, Pericles Pilidis Pagone Emmanuele, Agbadede Roupa, Allison Isaiah

Abstract:

Climate change represents one of the single most challenging problems facing the world today. According to the National Oceanic and Atmospheric Administration, atmospheric temperature has risen almost 25% since 1958, Arctic sea ice has shrunk 40% since 1959, and global sea levels have risen more than 5.5 cm since 1990. Power plants are the major culprits of GHG emission to the atmosphere. Several technologies have been proposed to reduce the amount of GHG emitted to the atmosphere from power plants, one of which is the less researched advanced zero-emission power plant. Advanced zero-emission power plants make use of a mixed conductive membrane (MCM) reactor, also known as an oxygen transfer membrane (OTM), for oxygen transfer. The MCM employs a membrane separation process, first introduced in 1899 when Walter Hermann Nernst investigated electric currents between metals and solutions; he found that when a dense ceramic is heated, a current of oxygen molecules moves through it. In the bid to curb the amount of GHG emitted to the atmosphere, the membrane separation process was applied to the field of power engineering in the low-carbon cycle known as the advanced zero-emission power plant (AZEP) cycle. The AZEP cycle was originally invented by Norsk Hydro, Norway, and ABB Alstom Power (now known as Demag Delaval Industrial Turbomachinery AB), Sweden. The AZEP drew a lot of attention because of its ability to capture ~100% CO2; it also boasts a 30-50% cost reduction compared to other carbon abatement technologies, a smaller efficiency penalty than its counterparts, and almost zero NOx emissions due to very low nitrogen concentrations in the working fluid. The advanced zero-emission power plant differs from a conventional gas turbine in that its combustor is substituted with the mixed conductive membrane reactor (MCM reactor). The MCM reactor is made up of the combustor, the low-temperature heat exchanger LTHX (referred to by some authors as the air preheater), the mixed conductive membrane responsible for oxygen transfer, the high-temperature heat exchanger and, in some layouts, the bleed gas heat exchanger. Air is taken in by the compressor and compressed to a temperature of about 723 K and a pressure of 2 MPa. The membrane area needed for oxygen transfer is reduced by increasing the temperature of 90% of the air using the LTHX; the temperature is also increased to facilitate oxygen transfer through the membrane. The air stream enters the LTHX through the transition duct leading to the inlet of the LTHX. The temperature of the air stream is then increased to about 1150 K, depending on the design point specification of the plant and the efficiency of the heat exchanging system. The amount of oxygen transported through the membrane is directly proportional to the temperature of the air going through the membrane. The AZEP cycle model was developed in Fortran, and the economic analysis was conducted using Excel and MATLAB, followed by an optimization case study. Four possible layouts are considered: the simple bleed gas heat exchange layout (100% CO2 capture), the bleed gas heat exchanger layout with flue gas turbine (100% CO2 capture), the pre-expansion reheating (sequential burning) layout, AZEP 85% (85% CO2 capture), and the pre-expansion reheating (sequential burning) layout with flue gas turbine, AZEP 85% (85% CO2 capture). This paper discusses a Monte Carlo risk analysis of these four possible layouts of the AZEP cycle.
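A minimal sketch of the kind of Monte Carlo risk analysis mentioned above, applied to one plant layout: uncertain economic inputs are sampled and the resulting net present value (NPV) distribution is summarised. All distributions, prices and plant figures below are illustrative assumptions, not values from the study.

```python
import numpy as np

# Monte Carlo NPV risk analysis sketch for one plant layout. Every distribution
# and figure below is an illustrative assumption, not data from the paper.

rng = np.random.default_rng(42)
N = 100_000                 # Monte Carlo trials
YEARS = 25                  # plant economic life (assumed)
DISCOUNT = 0.08             # discount rate (assumed)

# Uncertain inputs (illustrative ranges/distributions)
capex = rng.normal(1.0e9, 0.15e9, N)            # capital cost, $
fuel_price = rng.normal(5.0, 1.0, N)            # $/GJ of natural gas
elec_price = rng.normal(85.0, 10.0, N)          # $/MWh sold
capacity_factor = rng.uniform(0.80, 0.92, N)
efficiency = rng.normal(0.46, 0.02, N)          # net cycle efficiency after CO2 capture

power_mw = 400.0                                # net output (assumed)
energy_mwh = power_mw * 8760.0 * capacity_factor
fuel_cost = energy_mwh * 3.6 / efficiency * fuel_price     # 3.6 GJ of fuel heat per MWh(e) / efficiency
annual_cash = energy_mwh * elec_price - fuel_cost - 30e6   # fixed O&M of $30M/yr (assumed)

annuity = (1 - (1 + DISCOUNT) ** -YEARS) / DISCOUNT
npv = -capex + annual_cash * annuity

print(f"mean NPV: {npv.mean() / 1e6:,.0f} M$")
print(f"5th-95th percentile: {np.percentile(npv, 5) / 1e6:,.0f} to {np.percentile(npv, 95) / 1e6:,.0f} M$")
print(f"probability of negative NPV: {(npv < 0).mean():.1%}")
```

The same sampling would be repeated for each of the four layouts, with the CO2 capture rate and efficiency penalty of each layout feeding into the cash-flow model, so that their NPV distributions can be compared on a risk basis.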

Keywords: gas turbine, global warming, greenhouse gas, fossil fuel power plants

Procedia PDF Downloads 397