
Disallow setting a scrape address as the staking address in the QT #365

Open · wants to merge 1 commit into master
Conversation

IngCr3at1on
Contributor

This addresses #364

@jwrb

jwrb commented Dec 7, 2015

Tested and working! It just reverts the address back to what it was.

Do you think we need some kind of error for this to inform the user that they cannot use the main address as the scrape address?

@IngCr3at1on
Contributor Author

@jwrb that's a good question. Personally I can't imagine why anyone would ever try to scrape to their own address. I wonder what others think.

@IngCr3at1on
Contributor Author

@mdpfeiffer

@dbroyhill

I set this once just to see what happened. There was no difference compared to not setting a scrape.

If there was a way to keep stake outputs from splitting, it might be desirable to separate the stake reward but leave it at the same address, for accounting reasons.

It's really not something I would spend any time on... as long as it doesn't break anything, who cares right now?

@IngCr3at1on
Contributor Author

@dbroyhill see my most recent comment on #210

@dbroyhill

My conclusion is: either it should scrape as desired (same address or not), or there should be an error message.

@IngCr3at1on
Contributor Author

I spent an hour messing with this and nothing I'm doing will actually generate a message of any kind when editing the scrape address in the GUI. It's worth mentioning that it doesn't generate a message when an incorrect address is entered either.

Personally I'm for leaving this as the current behavior, but if everyone would prefer we simply not block the staking address from being set as the scrape address, then I'm willing to go with the consensus.
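
For illustration, the kind of warning being discussed could be surfaced from the Qt edit path roughly like the sketch below. This is a minimal, hypothetical example: validateScrapeAddress and the example address strings are made-up names, not the wallet's actual code.

```cpp
// Minimal sketch (not the wallet's actual code): surfacing a warning when the
// user tries to set the staking address as its own scrape address.
#include <QApplication>
#include <QMessageBox>
#include <QString>

// Returns true if the proposed scrape address is acceptable; otherwise shows
// a warning so the caller can revert the edit in the address table.
bool validateScrapeAddress(QWidget *parent,
                           const QString &stakingAddress,
                           const QString &proposedScrapeAddress)
{
    if (proposedScrapeAddress == stakingAddress) {
        QMessageBox::warning(parent,
            QObject::tr("Invalid scrape address"),
            QObject::tr("The scrape address cannot be the same as the staking address."));
        return false;
    }
    return true;
}

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);
    // Simulate the user entering the staking address itself.
    validateScrapeAddress(nullptr, "ExampleStakingAddress", "ExampleStakingAddress");
    return 0;
}
```

The idea is simply that the edit handler calls a check like this and reverts the field when it returns false, which matches the revert behavior jwrb observed.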

@dbroyhill

So at this time, a message of any kind for scrape errors could potentially require a lot of time with no guarantee. Current behavior is what exactly, ignore scraping to itself?

@IngCr3at1on
Contributor Author

@dbroyhill current behavior allows setting it to self in the qt but not the daemon. It causes the reward to go to a separate output from the rest of the stake, which, since it's all on the same address, is a bit pointless: it only increases byte size and transaction complexity (albeit negligibly).

Now that you mention it, code could probably be put in place to ignore it during the actual transaction creation, and then the argument of whether to allow it wouldn't really apply lol.
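
A minimal sketch of that "ignore it during transaction creation" idea, under the assumption that coinstake building can consult a small helper; PlanStakeOutputs and StakeOutputs are hypothetical names, not the wallet's actual code:

```cpp
// Illustrative sketch only: decide whether the stake reward gets its own
// output, skipping the split when the scrape address is the staking address.
#include <iostream>
#include <string>

struct StakeOutputs {
    bool separateRewardOutput; // add the reward as its own output?
    std::string rewardAddress; // where the reward ends up
};

StakeOutputs PlanStakeOutputs(const std::string &stakingAddress,
                              const std::string &scrapeAddress)
{
    StakeOutputs plan;
    if (scrapeAddress.empty() || scrapeAddress == stakingAddress) {
        // No scrape configured, or the scrape points back at the staking
        // address: keep the reward in the staking output and skip the extra
        // output (and its extra bytes).
        plan.separateRewardOutput = false;
        plan.rewardAddress = stakingAddress;
    } else {
        // A genuine scrape address: pay the reward to it as a separate output.
        plan.separateRewardOutput = true;
        plan.rewardAddress = scrapeAddress;
    }
    return plan;
}

int main()
{
    StakeOutputs self = PlanStakeOutputs("StakeAddr", "StakeAddr");
    StakeOutputs real = PlanStakeOutputs("StakeAddr", "ScrapeAddr");
    std::cout << self.separateRewardOutput << " " << real.separateRewardOutput << "\n"; // 0 1
    return 0;
}
```

With a check of this shape the self-scrape case collapses to a normal stake, so the byte-size and complexity concern above goes away.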

@dbroyhill

My preference would be for the reward to always be a separate output, even if the scrape address is set to the staking address. If that's the current behavior, then let's leave it. If the consensus is for preventing such function, then I would be okay with the check during transaction creation to ignore it.

@dbroyhill

I successfully tested this on .3.3.0, OS X. I received the mint as a separate output to the staking address.

I like it. It might be helpful in special scenarios for accounting reasons.

In general, I prefer to err on the side of keeping unintended "features," though some would call them bugs, as long as they don't create problems. This feature adds negligible bloat to the chain - no more than a regular scrape - and doesn't cause any harm.

But again if consensus is opposed to such function, I'm okay with the transaction creation check.

@IngCr3at1on
Contributor Author

I have no intention of directly vetoing anyone in this respect so I'd really like more feedback before we decide the best way to go with this. @CeForce you see the support tickets, what are your thoughts?

@CeForce

CeForce commented Feb 18, 2016

I haven't seen anything that makes me feel strongly one way or the other. If nothing is broken I'd say leave it alone, as is. We haven't had too many tickets related to scrapes, if any.

If it's disallowed without an error or some sort of feedback to the user, I think it will create confusion.

@IngCr3at1on
Contributor Author

In that case the only real question is whether we leave the daemon as it is or go ahead and remove the block on the daemon (it does return an error message, but disallowing it on the daemon and not the qt is inconsistent).
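
For context, the daemon-side block being discussed amounts to a check of roughly this shape; the function name and error text here are hypothetical, not the daemon's actual code:

```cpp
// Rough shape of the daemon-side check: a self-scrape is rejected with an
// error, while the qt currently handles it silently.
#include <iostream>
#include <stdexcept>
#include <string>

void SetScrapeAddressChecked(const std::string &stakingAddress,
                             const std::string &newScrapeAddress)
{
    if (newScrapeAddress == stakingAddress)
        throw std::runtime_error(
            "scrape address cannot be the same as the staking address");
    // ... otherwise record newScrapeAddress for stakingAddress ...
}

int main()
{
    try {
        SetScrapeAddressChecked("StakeAddr", "StakeAddr");
    } catch (const std::exception &e) {
        // Prints the kind of error the daemon returns.
        std::cerr << e.what() << "\n";
    }
    return 0;
}
```

Removing the block would mean dropping that check, while keeping it would mean adding an equivalent check (plus a user-visible message) to the qt.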

@AtomSmasher00

As no one is strongly opposed to self scrapes, let's remove the check on the daemon.

@MitchellCash
Contributor

In some ways I think we should block it with an error message like the daemon does. I'm not too bothered though.

But like @IngCr3at1on says, whatever we do, we need to make it consistent across both the qt and the daemon.

@IngCr3at1on
Contributor Author

@MitchellCash if you want to take a shot at making an error message present go for it.
