Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - N/A

Pages: 1 2 3 ... 263
1
The Flood / Re: pp
« on: May 19, 2018, 04:34:26 PM »
pp

2
The Flood / Re: Hot takes
« on: February 23, 2018, 09:56:22 PM »
Professors and teachers who try to inject their own personal beliefs into their curriculum rather than attempting to be balanced should be fired. If they have tenure, it should be revoked.
Uh, that’s already the case for primary school teachers. Injecting bias would be a violation of the captive audience doctrine.

3
Serious / Re: The argument against free will under an Abrahamic God
« on: February 19, 2018, 08:29:59 AM »
Hey sexy, how are you
Alright, tired mostly. Finishing up my internship, which requires working all five days a week.

4
Serious / Re: The argument against free will under an Abrahamic God
« on: February 19, 2018, 08:28:35 AM »
Even OP's definition of 'free will' is flawed:
"We can define free will as rational agents choosing their own course of action."
Tbh, I went with a textbook definition there that appeared to be agreed upon for the sake of semantics.

Quote
And ultimately, the scenario of the Abrahamic God is exactly identical to the secular scenario in which the universe exists with initial conditions resulting in symmetrical time and deterministic physics -- except in the former, we have something to shake our fists at.
I personally think that the main difference between the religious and the secular view is that an Abrahamic God, if he existed under the conditions that are believed, has free will. He has the volition to change any constraint and the foresight to know the outcome. A secular universe could be deterministic as well, but the universe, to our knowledge, isn't able to rationalize what its constraints impose.

5
Serious / The argument against free will under an Abrahamic God
« on: February 19, 2018, 01:44:40 AM »
The axioms of an Abrahamic God are that he is omnipotent and omniscient.

Setting up conditional statements from this, it could be said...

If God is omnipotent, then he created and controls all.
If God is omniscient, then he knows all variables and outcomes: past, present, and future.

From this, we could conclude that God created, knows, and controls all.

God created and knows all, so...

If God knows all, then he knows who will sin and receive damnation.
If God created all, then he created everything with knowledge of what the outcome will be.

Quote
We can define free will as rational agents choosing their own course of action.

If free will exists, then one is able to act of their own volition without external controls.
If no external factors control oneself, then one's actions cannot be predetermined by outside sources.

Quote
This is where my question lies: if an Abrahamic God exists, with the axioms that are agreed upon, how can free will exist alongside this God?
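
Laid out compactly, the chain above reads something like this (just a propositional sketch with letters standing in for the statements above; nothing more rigorous than that):

\[
\begin{aligned}
P &\rightarrow C && \text{(omnipotence: God created and controls all)}\\
S &\rightarrow K && \text{(omniscience: God knows every variable and outcome)}\\
W &\rightarrow \lnot E && \text{(free will: one's actions are under no external control)}\\
C &\rightarrow E && \text{(if all is created and controlled by God, external control exists)}\\
\therefore\; (P \land W) &\rightarrow (E \land \lnot E) && \text{(omnipotence and free will together yield a contradiction)}
\end{aligned}
\]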


6
The Flood / Re: whats been up fellas?
« on: January 20, 2018, 02:49:12 PM »
self esteem is at a record high and on the incline. put on 10 lbs in the last month, diet is super fucking clean (no supplements/boosters/drugs + no meat or dairy) and work fucking loves me off. art skills have seen improvement as well. my brother is living back home, fighterz is out next week, DBS is on hiatus soon... things are good. worst part is this online course stuff i gotta do but thats really it.
dude, if you’re going full vegan, you should be supplementing b12. Shit is the hardest to get in a vegan diet.

7
Serious / Re: Algorithms, errors and police
« on: January 16, 2018, 06:14:39 PM »
With the latter quote, I do feel that much of what we are finding is how much AI software is in its infancy relative to its future potential; a lot of what we will see right now are the hiccups of refining these systems. Especially right now, given that AIs can only work in a series of yes and no answers.
This just adds to my point though. AI is still in its infancy (even though it can definitely do more than just provide yes or no answers), which is exactly why these are important things to consider and regulate now rather than letting it grow up without these issues being addressed.
My apologies, I think we were thinking of "yes and no" in different ways. I was referencing the fact that computer AI software is limited by the constraints of binary. With quantum computing allowing superposition and entanglement to be used, AIs may be able to develop even further into more "human"-based learning.

8
Serious / Re: Algorithms, errors and police
« on: January 15, 2018, 09:41:01 PM »
And this can and does go pretty far. You're applying for jobs or colleges. Your profile is checked and scored based on how well you would do. Aside from your own qualifications (degree, experience, traits), you're also scored against a general profile made of you based on similarities and the information they have on you. Your name sounds foreign or Spanish? Shame, but -10 points on language skills because statistically those people are less fluent in English than "Richard Smith" is. You're from area X? Ouch, well that place has some of the highest substance abuse rates in the country, so you'll get a -10 on reliability because, statistically speaking, you're more likely to be a drunk or a drug addict. You went to this high school? Oof, people from that school tend to have lower graduation rates than the national average, so that's a -10 on adequacy. Your parents don't have college degrees? Sucks to be you, but it's a fact that children of college-educated parents are more likely to score well on university-level tests, so -10 on efficiency. That's -40 points on your application based entirely on hard, solid and valid statistics or facts. Perfectly reasonable, no?

Only, they aren't facts about you. They're facts about people like you, taken on average. And of course, this will hold true for many like you. They won't do as well, they will fail more and they might in the end drop out. But for many, this doesn't ring true. They aren't drunks, they are motivated, they would get good grades and they do speak English well. But in the end, they don't even get the chance to try because the system rejects them. This likely condemns them to worse jobs, a lower education and ultimately an almost guaranteed lower social status, all while people from a "good" area with rich, white parents get more opportunities, so that the inequality and the social divide grow while social mobility drops.

Obviously, this is an exaggeration. It doesn't happen now, but it very well could in the not so distant future. As machine learning and AI become more commonplace and powerful, and the amounts of different data they are fed with continues to grow, it becomes increasingly difficult to ascertain exactly what goes on in their "brain". And as these systems are almost always proprietary and owned by companies, there's almost no real way to look into them and find out how they work - especially not if you're just an ordinary person.
Quote
I brought this up to further illustrate my example from earlier. There are some serious disparate and negative effects that can come from, as you put it, "noticing trends" and applying them for decision-making without proper safeguards, oversight and mitigation techniques in place, even when they are based on solid, valid and statistically sound facts. And these things can and really do happen, partially because of how difficult it is to assess these systems and pick out flaws. Remember when Google's image recognition software identified black people as gorillas? After several years, we finally found out two days ago what their "solution" was. Instead of fixing the actual algorithm, which is a difficult thing to do even for a company like Google, they just removed gorillas from their labelling software altogether and made it into what's effectively an "advanced image recognition tool - for everything other than gorillas" package.
Regarding the first section, I feel pretty 50/50 about the implications here. While we can both agree that this isn't fair, the position of a college doing this is understandable. When it comes to compiling data, outliers shouldn't be taken as the norm of a distribution. If a college found two people of equal qualifications, but one came from a background with a family of drug abusers, I wouldn't chastise the college for choosing the safer of the two bets. However, like you said, this does create the problem of making social mobility harder for people. Generally, this is why safeguards such as affirmative action have been so commonplace.
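
Just so we're picturing the same mechanism, here's roughly how I read the scoring scenario from your first quote. A toy sketch with made-up categories and weights, not anything a real admissions system actually publishes:

Code:
# Toy sketch of group-statistics scoring (hypothetical categories and weights).
# Every penalty is keyed to a group attribute, not to anything the applicant did.
GROUP_PENALTIES = {
    "name_sounds_foreign": 10,        # proxy for "language skills"
    "high_substance_abuse_area": 10,  # proxy for "reliability"
    "low_grad_rate_school": 10,       # proxy for "adequacy"
    "parents_without_degrees": 10,    # proxy for "efficiency"
}

def score_applicant(base_score, group_attributes):
    # Start from the applicant's own qualifications, then subtract penalties
    # derived from averages about people "like" them.
    penalty = sum(GROUP_PENALTIES.get(a, 0) for a in group_attributes)
    return base_score - penalty

# Two applicants with identical personal qualifications:
print(score_applicant(80, []))                    # 80
print(score_applicant(80, list(GROUP_PENALTIES))) # 40

Same personal qualifications, a forty-point gap, and none of it is about the individual. That's the part I agree is unfair, even if I understand why a college hedges its bets.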

With the latter quote, I do feel that much of what we are finding is how much AI software is in its infancy relative to its future potential; a lot of what we will see right now are the hiccups of refining these systems. Especially right now, given that AIs can only work in a series of yes and no answers. I also agree that the proprietary nature of the AIs being developed is another big problem. Because the software is closed off to the general populace, refining AIs to account for certain variables or forking the software to find more efficient versions isn't possible.

9
Serious / Re: Algorithms, errors and police
« on: January 14, 2018, 09:53:11 PM »
Fuck the quadpost / monologue, but I'm going to take some of the stuff I wrote here and reuse it for an article I'm working on. Thanks for the help Zen.
No problem. Sorry for the late replies at times as well. I’ve been busy getting school stuff together. I would agree with you that we don’t have opposing views; I just don’t think the implications are as severe as you make them out to be. Not to mention, with quantum computing making so many breakthroughs, I would imagine that AI’s ability to differentiate between correlations and causations will increase, making the fear of generalizations causing racial biases far less of a concern. I’ll be posting a more in-depth response either tonight or tomorrow morning; playing games in a call right now, sorry.

10
Serious / Re: Algorithms, errors and police
« on: January 13, 2018, 04:56:17 PM »
I'll probably finish the rest tomorrow in a shorter version.
Alright. I’ve read both your posts btw. I’m just stuck on mobile in town right now. So when I’m able to, I’ll edit this post with my response.

Edit: So, alright to address this part.
Quote
Poor people are more involved in crime > more police focus on poor areas, less on richer areas > more poor people arrested and convicted > even more evidence that the poor have criminal propensity > even more focus on the poor (reinforcing the “prison pipeline” and putting more of them in prison, which we know teaches criminal habits and makes them less likely to be employed afterwards so they remain poor) and even more sentences and harsh judgments > system grows increasingly more “biased” against poor people because of the feedback loop (it’s being proven “right” because more poor people are going to jail because of it) > the current problems of inequality persist, the underlying problem goes unaddressed, minority communities are further ostracized, the rich/privileged are given more “passes” while the poor/disadvantaged are given less leeway and more punishments > social mobility is stifled and the divide between the rich and poor grows because institutionalized computer systems serve as an added obstacle…

And all of this happens on the basis of cold, hard and factual data paired with a very smart computer. This is just one of the dozens of possible scenarios, but I hope that this clarifies what I meant. Data is not necessarily wrong or inherently bad, even when it’s “biased”. The point is that technology can pick up on these inequalities / problems / different treatments and actually reinforce them further because it considers them the norm. The risk is that algorithms learn from data, create generalized (and potentially “prejudiced” or “biased”) profiles, and then apply them to individuals (“your father was abusive which means that you’re more likely to be abusive too so you’ll be denied a job since you’re considered a potential abuser regardless of the person you are”) who suffer as a consequence but have almost no way to fight back because their disparate treatment is (often wrongfully) legitimized as “well the computer says so and it’s an intelligent piece of technology so it’s neutral and always objective”.
I understand what you are saying about the feedback loops. However, to address that concern, this information should not be treated as part of the same data set, but rather as a subset of it. Like I discussed in my previous post, when an area is given a more intensive treatment for the purpose of remedying the difference between that area and the norm, the data should be used to analyze how the area is improving over time. Think about it like a science experiment: the area is receiving a new variable in its equation (the increased police presence). Treating the area like the other areas is what supports that feedback loop.
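
A rough sketch of the bookkeeping I mean, with made-up monthly numbers: keep the treated area's post-intervention counts as their own series and judge the program against that area's own baseline, instead of appending the new arrests to the city-wide training data.

Code:
# Made-up monthly crime counts for one area that received extra patrols.
baseline = [42, 45, 40, 44]               # before the increased presence
post_intervention = [61, 58, 49, 41, 36]  # after the increased presence

baseline_avg = sum(baseline) / len(baseline)  # 42.75

# Evaluate the program against the area's own history, as its own subset.
for month, count in enumerate(post_intervention, start=1):
    print(f"month {month}: {count} recorded ({count - baseline_avg:+.2f} vs own baseline)")

# What this framing avoids: feeding the month 1-2 spike back into the
# city-wide model as fresh evidence that the area "got worse".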

However, I do acknowledge the potential that feedback loops will also be a problem regardless. Like I discussed before with the AIs conversing with one another, AIs are proficient at simplifying data. Despite this, though, I doubt that anytime in the near future we will rely solely on AIs.

11
The Flood / Re: Why aren't you a Communist?
« on: January 11, 2018, 11:02:27 PM »
Dude, I give communists free helicopter rides.

12
Serious / Re: Algorithms, errors and police
« on: January 11, 2018, 06:28:43 PM »
1. Statistics are easily manipulated and interpreted in different ways. They're a guideline but insufficient to dictate policy on their own and prone to misuse. The real problems arise when big data is used not just to provide information but to make decisions affecting individual people.
Which is why you can't just blame the data for flagging an area that it marks as an outlier in regards to crime rate, as I said in my initial post. There are a variety of reasons why areas with a higher minority presence tend to have a higher crime rate. Education, opportunities, and wealth distribution are all factors that affect one another and affect the crime rate as well. For example, a five-year-old growing up in the highest socioeconomic bracket knows roughly three times as many words as one growing up in the lowest socioeconomic bracket. Another factor that affects education and correlates with economic status is the homes the children grow up in; research has found that more homes in those areas were painted with lead-based paint, which in turn negatively affects cognitive development. Generally, by fourth grade, the educational gap between socioeconomic groups is so great that providing support to the lowest-end students is not effective. This in turn leads to higher high school dropout rates, fewer options for work, and more people turning to drug use and/or crime-related activities. And guess what: those in the lowest socioeconomic bracket tend to be minorities.
Quote
Imagine you're a man looking to become a teacher. The school district employs an algorithm to assess job applications and potential candidates. This system takes into account dozens of characteristics and data points to evaluate your profile. One of the things it learns from the data it's trained with is that men make up the vast majority of sex offenders and are responsible for almost all cases of teachers sexually or physically abusing students. As a result, it ties men to these crimes and incorporates this into its decisionmaking. Every man, by default, gets a point deduction because they're a higher risk profile and will systematically be hired less. This goes for a dozen different things. Say you're applying for a college. Its algorithm determines that people from your state / area / region / background tend to drop out more often than the average. Since every student is an investment, colleges want successful ones. As such, your name is by default put at the bottom of the list despite no person at the school having met you or being able to assess you on your merits alone. The same thing applies here. All of this, as you say, is based on accurate, real and reliable facts that notice trends in our society, yet I think you're going to have to agree that it's far from fair.
I mean, you're talking to a male who is in the last semester of his teaching degree. I fully understand what you are talking about there. iirc, the split is roughly 80% female teachers to 20% male. However, making a generalization like the one you mentioned is not bad. I would not say that it is fair, but it is an understandable factor to take into account for the person who will be acting in loco parentis for 20 to 30 children. Regardless, even with the systematic bias against male teachers, I can say from my own personal anecdotes that the metric does not have as great an effect as you think. I'm basically guaranteed a teaching job as soon as I receive my bachelor's at the end of this semester. I have had school districts all across the state send me emails looking for teachers with my qualifications.

Another factor that largely affects the teaching climate is the difference in attitudes towards teaching in the U.S. between the sexes. Females in general tend to gravitate towards teaching, and especially towards teaching elementary. I shit you not when I say that in my entire time attending my college, I have met only one male aiming for an elementary teaching degree. Most males you will meet tend to get their 5-9 or 6-12 certification and, in my experience, generally gravitate towards teaching mathematics or history.
Quote
2. These algorithms exacerbate existing problems and biases by creating a feedback loop. Say the system identifies an area or a specific group that has a lot of issues with crime. As a result, the police focuses its attention there and deploys more cops with a specific mandate. You will see that even more crime is now recorded in this area, merely due to the fact that there's simply more cops that are actively looking for it. There's not any more crime than there was before, but it's just noticed more often. You then put this new data in the system and voila - feedback loop. "The computer was correct, we listened to it and caught more criminals who are black". This can lead to adverse effects and the over / under policing of certain areas. The attributes you've found serve as a proxy for race and rather than fairly policing anything, you're now effectively policing people based on the color of their skin.
One of the primary problems with what you stated here is that you isolated an area that had a higher crime rate and focused on remedying that area. However, instead of creating a new data set for that isolated area to measure the trends and patterns that arise from an increased police presence (in order to note whether the increased presence is lowering the crime rate over time), you are putting that information back into the original set. If a set mandate and increased police presence are implemented in an area marked as a high-crime area, that area's data should be looked at individually to note how the program is working. One would expect that at the start of such a program, the recorded crime rate would go up, because the police are catching more activity that flew under the radar beforehand. But as the project stabilizes, it should be expected that the crime rate will go down over time, because the increased presence has been more meticulous in its analysis of the area.
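
To spell out the bump-then-decline I'd expect, here's a quick sketch with invented numbers, where recorded crime is just actual incidents times how much of it the police are around to catch:

Code:
# Invented numbers, for illustration only.
actual_incidents = [100, 100, 95, 90, 82, 75]            # true activity, slowly declining
detection_rate   = [0.40, 0.70, 0.70, 0.70, 0.70, 0.70]  # jumps when extra patrols arrive in month 2

for month, (actual, rate) in enumerate(zip(actual_incidents, detection_rate), start=1):
    recorded = round(actual * rate)
    print(f"month {month}: actual={actual}, recorded={recorded}")

# Recorded counts jump from 40 to 70 in month 2 purely because detection improved,
# not because the area got worse; from month 3 on they track the real decline.
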
Quote
Take the policies like stop and frisk or random traffic stops. There's been a lot of research into this finding substantial racial bias in how they were executed. If you now use that data to train a computer to determine who should be "randomly" stopped, you'll find that it also focuses more on blacks. Aside from the problem of how this affects innocent individuals (see 1.), simply by focusing more on blacks, you'll now find more criminals among them. That's basic logic. Feed this back into the system and you'll end up with a situation where whites are given a pass or stopped less and less based on the assumption that they're less likely to be criminals, but this assumption is already based on previous data (analysis) and can therefore exacerbate the issues and bias. This can lead to the underlying problem being ignored and existing problems being continued rather than fixed.
This part I agree could be a problem. I understand what you are getting at here. Personally, the only thing I can say about this is that in a case such as this, computers should use methods of random sampling to determine who should be stopped. This way, a sample representative of the population is obtained. The only caveat of such an implementation is that the computers are not able to effectively use crime analysis data in this situation. This removes the fear of racial bias, but may in turn lead to other problems, such as random sampling methods not catching as many crimes.
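
Roughly what I have in mind by random sampling, as a minimal sketch. The IDs and numbers are hypothetical; the point is just that the draw is uniform over everyone passing through, rather than ranked by a model trained on past stops:

Code:
import random

def select_random_stops(passing_ids, k, seed=None):
    # Uniform draw: every person/vehicle passing the checkpoint has the same
    # chance of being stopped, so the stopped sample stays representative of
    # the population, at the cost of ignoring crime-analysis data entirely.
    rng = random.Random(seed)
    return rng.sample(passing_ids, k)

passing_ids = [f"vehicle_{n}" for n in range(500)]  # hypothetical IDs
print(select_random_stops(passing_ids, k=5, seed=42))
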
Quote
3. You now also institutionalize the problem. It's easy to check a person and have them justify certain actions in order to determine if they're prejudiced or wrong, but it's a lot harder with a very intelligent computer. People take what technology says for granted and trust that it's neutral, fair and accurate, while it very often isn't. The more we move towards machine learning, the more we run the risk of incorporating these issues that are potentially extremely difficult to detect. A famous example is that of image recognition software distinguishing between different animals. An AI was trained to do this and it became extremely good at it with very little effort. So good that the people who created it became skeptical. Want to take a guess what they found when they really put it to the test? I'll comment later.
My guess is that the computer created generalizations that simplified the process of identifying animals in the quickest way possible. If I am correct, such a thing reminds me of the discussion of how AIs talking to one another "created" a new language by simplifying the syntax of English.


Sorry for the late reply by the way, I just got home recently, started writing, and got logged off by accident because I forgot to click the keep me logged on button.

13
Serious / Re: Algorithms, errors and police
« on: January 10, 2018, 09:07:21 PM »
>minority communities tend to be less affluent, not as well educated, and do not provide as much of a chance of upward social mobility
>minority communities as a result have a higher crime rate
>crime data being taken in acknowledges those areas are committing a disproportionate amount of crime in comparison to other areas
...
>so this means the technology is biased for noticing trends that occur among different racial communities

Really?

14
are you still unironically catholic or did you sober up?

>what is orthodoxy
Heretics that don’t believe in the rule of Peter and his successors as the head of the church.

15
The Flood / Re: so can anyone explain the cuckmouth pose?
« on: January 02, 2018, 11:51:29 AM »
God, you can see the jewness of the first pic so much

16
The Flood / Re: got any plans this #Doormas
« on: December 20, 2017, 03:59:07 PM »
Sucks that Door was shot down while on a pilgrimage to Jerusalem.

17
The Flood / Re: Star Wars: The Last Jedi
« on: December 14, 2017, 08:33:51 PM »

*destroys your ancient scrolls*
"I know the ancient Jedi scrolls aren't exactly exciting reads Luke, but you know what is? The Turner Diaries. Before I read that book, I thought that upjumped apes like Master Windu could be considered equal to people. How foolish I was, Luke. Regardless, he harnessed the negroid's penchant for violence well. Why, he almost fried the face off of the man who currently rules the universe. And he was a good pet."

18
The Flood / Re: An important announcement
« on: December 13, 2017, 02:37:43 PM »
It's fun and games but if Deci was white he would've shot up his school.

Amen.
Armen*

19
The Flood / Re: Deci is fat
« on: December 11, 2017, 05:14:55 PM »
Daily reminder, fat people deserve death.

t. Former fatty

20
The Flood / Re: Ohhhhh shit son got followed out of the store today
« on: December 08, 2017, 07:09:02 PM »
They borderline had me convinced to cut my losses and admit my wrongdoing, thank god i remembered the "only cops can follow you off the property" rule

So I just kept walking and kept denying and here I am

I think my life of crime is over that was scary
>implying everyone even cares about that rule

When I worked at Dunkin, we had one of the heroin junkies try to steal from our tip jar. My managers followed his ass to the next two stores over, grabbed the shit from his hands, and threatened the everloving fuck out of him.
worst case scenario I had to get to the bridge, no way were those guys following me onto the train tracks in normal weather -- let alone this crazy af snow
Whatever, keep doing the shit. Just don't be surprised when the people who work there smack your shit up and then call the cops on you.

21
The Flood / Re: Ohhhhh shit son got followed out of the store today
« on: December 08, 2017, 07:03:40 PM »
They borderline had me convinced to cut my losses and admit my wrongdoing, thank god i remembered the "only cops can follow you off the property" rule

So I just kept walking and kept denying and here I am

I think my life of crime is over that was scary
>implying everyone even cares about that rule

When I worked at Dunkin, we had one of the heroin junkies try to steal from our tip jar. My managers followed his ass to the next two stores over, grabbed the shit from his hands, and threatened the everloving fuck out of him.

22
Gaming / Re: Destiny 2 General
« on: December 07, 2017, 01:02:50 AM »
For anyone here who bought the base game but not the dlc.

https://www.reddit.com/r/DestinyTheGame/comments/7i1l2n/you_can_get_sonyms_to_refund_your_purchase_thanks/

Basically, since the dlc made the prestige raid inaccessible, getting the achievement/trophy is behind a paywall even though the achievement was for the base game. So, you could get a full refund.

23
meanwhile im bitching about IRAF and unix data reduction for a binary star that couldnt be observed all night. good on you
thanks

24
in four weeks i attended one wedding and did nothing productive

whats it like to be productive
sucks ass tbh

25
Weren't you the pedophile
It's been like what, four years, and people still don't understand that was a shitposting persona?

26
Going for that MD?
yeah that’s the crazy part, these classes are basically my last classes of my bachelors. The assignments they hand out are insane.

27
It's nice to hear that somebody managed to put together such a big piece of work. Good job.
Thanks

28
After all of that, would you like to sleep for a week?
Trying to right now. Been lying down for the last hour, can't pass out though.

29
That’s pretty impressive

Good job
Thanks. So much of those papers were straight bull but I still aced the ones graded so far. 100% on the 25 page one, 97% on the 30 page one, and just submitted the 47 page one a couple of hours ago, so gotta wait on that.

30
oh, that doesn't seem as bad
Try being a mathematics teacher trying to administer reading strategies for assessment. I literally had to grasp at straws the majority of the time. The class had no relevance whatsoever, but it was required.

Pages: 1 2 3 ... 263